Data Engineer, Application Programming Interface
Overview
Join us as a Data Engineer, Application Programming Interface
What you'll do
As a Data Engineer, you'll simplify our organisation by developing innovative, data-driven solutions through data pipelines, modelling and ETL design, aspiring to be commercially successful while keeping our customers, and the bank's data, safe and secure.
You'll drive customer value by understanding complex business problems and requirements, and by applying the most appropriate, reusable tools to gather and build data solutions. You'll support our strategic direction by engaging with the data engineering community to identify and deliver opportunities, and by carrying out complex data engineering tasks to build a scalable data architecture.
Your responsibilities will also include:
Building advanced automation into data engineering pipelines by removing manual stages
Developing scalable, secure microservices that integrate with existing banking infrastructure, ensuring compliance with financial industry standards
Embedding new data techniques into our business through role modelling, training, and experiment design oversight
Delivering a clear understanding of data platform costs to meet your department's cost-saving and income targets
Sourcing new data using the most appropriate tooling for the situation
Developing solutions for streaming data ingestion and transformations in line with our streaming strategy
The skills you'll need
To thrive in this role, you'll need experience of designing and implementing RESTful Application Programming Interfaces (APIs) within a microservices architecture, a strong understanding of data usage and dependencies, and experience of extracting value and features from large-scale data. You'll also bring practical experience of languages and frameworks such as Java, Python and Spark, alongside knowledge of data and software engineering fundamentals.
Additionally, you'll need:
Experience of ETL technical design, data quality testing, cleansing and monitoring, data sourcing, and exploration and analysis
A good understanding of cloud platforms and services such as AWS (including EMR, Lambda and Step Functions), Airflow, Snowflake, PostgreSQL and Redshift
Data warehousing, data modelling and Lakehouse capabilities
A good understanding of modern code development practices
Experience of working in a governed and regulated environment
Strong communication skills with the ability to proactively engage and manage a wide range of stakeholders
Hours: 35
Job Posting Closing Date: 23/10/2025
Ways of Working: Hybrid
Location: City of Edinburgh, Scotland, United Kingdom
Job Type: Full Time