Data Engineer

Overview

We are seeking a Data Engineer to contribute to our data platform and Data Lake initiatives across the organisation.

Responsibilities

  • Hands-on development on the Nexus Platform
  • Monitoring daily BAU data pipelines and ensuring our data solution is refreshed every day
  • Enhancing the daily BAU process to make it easier to monitor and less likely to fail, including hands-on development on Data Lake builds, changes and defect fixes
  • Building new data pipelines using existing frameworks and patterns
  • Working with the team within an Agile framework, using tooling that controls development and CI/CD release processes
  • Contributing to the new Data Lake technology to address a broad set of data science and data warehousing use cases across the organisation

Skills and Experience

Essential

  • Experience with data solution BAU processes (ETL, table refreshes etc.)
  • Experience integrating data from multiple data sources
  • Experience with big data integration technologies such as Spark, Scala and Kafka
  • Experience with programming languages such as Python or Scala
  • Analytical and problem-solving skills applied to data solutions
  • Experience of CI/CD
  • A good grasp of multi-threading and concurrency concepts
  • Familiarity with Linux scripting fundamentals

Desirable

  • AWS exposure (Athena, Glue, EMR, Step Functions)
  • Experience with Snowflake and DBT
  • Proficiency with ETL technologies (e.g. Talend, Informatica, Ab Initio)
  • Previous exposure to Python
  • Previous exposure to data solution BAU monitoring and enhancement
  • Exposure to building applications for a cloud environment

We work with Textio

Employment details

  • Permanent
  • Seniority level: Entry level
  • Employment type: Full-time
  • Job function: Information Technology
  • Location: City of London
  • Category: IT & Technology
