You will work closely with the database and data engineering teams to build systems that facilitate the extraction and transformation of Creditsafe data. The role will define and build data pipelines that enable faster, better, data-informed decision-making both within the business and for Creditsafe customers. This is an opportunity to gain exposure to big data architectures.
• Provide leadership and mentorship to team members
• Play a hands-on role as part of an Agile team to develop, test and maintain high-quality systems that fulfil business needs
• Extract data from files, systems, cloud sources, databases and APIs by writing and executing code (SQL and similar)
• Clean and combine offline, online and mixed sources into datasets, building in manual or automated validation and accuracy checks, using Python, SQL or specialist big data frameworks
• Support the team in maintaining existing Oracle data processes and data infrastructure
• Maintain a strong focus on quality, applying practices such as continuous integration and test-driven development to enable the rapid delivery of working code
• Document the new processes and products you develop so that knowledge is shared
• Design and build a pattern-based data pipeline using Python and SQL, in accordance with guidelines set by the Data Architect
• Help to design, build and launch data extraction models
• Create functioning data pipelines using industry-standard processing patterns
• 4+ years' solid development experience in a commercial environment
• Knowledge of Agile development methodologies
• Some experience of working with multiple data sources and scripting languages
• Knowledge of SQL programming and code optimisation
• Proficient in PL/SQL and Oracle native processes
• Awareness of cloud technology, particularly AWS
• Knowledge of automated delivery processes
• Some experience designing and building autonomous data pipelines
• Hands-on experience of engineering best practices (error handling and logging, system monitoring, and building human-fault-tolerant applications)
• Experience of working with large datasets
• Ability to write efficient code, and comfort undertaking system optimisation and performance-tuning tasks
• Experience working in a Windows or Unix environment, with the ability to perform basic database administration tasks
• Comfortable working with relational databases such as Oracle, PostgreSQL, MySQL, and MariaDB
• Teamwork – Encourages cooperation, collaboration and partnerships
• Quality Improvement – Strives for high-quality performance
• Problem Solving – Identifies problems and seeks best solutions by being creative and innovative