Data Engineer


Overview
We are seeking a Data Engineer at NewDay to contribute to the data platform and Data Lake initiatives across the organisation.

Responsibilities
- Hands-on development on the Nexus Platform
- Monitoring daily BAU data pipelines to ensure the data solution is refreshed every day
- Enhancing the daily BAU process, making it easier to monitor and less likely to fail
- Hands-on development on Data Lake build, change, and defect fixes
- Building new data pipelines using existing frameworks and patterns
- Working with the team within an Agile framework, using agile tooling that controls development and CI/CD release processes
- Contributing to the new Data Lake technology across the organisation to address a broad set of use cases across data science and data warehousing

Essential Skills and Experience
- Experience with data solution BAU processes (ETL, table refresh, etc.)
- Experience integrating data from multiple data sources
- Experience with Big Data integration technologies such as Spark, Scala, and Kafka
- Experience in programming languages such as Python or Scala; experience using AWS, DBT, and Snowflake
- Analytical and problem-solving skills applied to data solutions
- Experience of CI/CD
- Good aptitude with multi-threading and concurrency concepts
- Familiarity with Linux scripting fundamentals

Desirable
- Experience with ETL technologies (e.g. Talend, Informatica, Ab Initio)
- AWS exposure (Athena, Glue, EMR, Step Functions)
- Experience with Snowflake and DBT
- Previous exposure to Python
- Previous exposure to data solution BAU monitoring and enhancement
- Exposure to building applications for a cloud environment

Employment details
Permanent
Seniority level: Entry level
Employment type: Full-time
Job function: Information Technology
Location:
City of London, England, United Kingdom
Job Type:
Full-time
