Data Engineer - Systematic Fund


Job Description

This research-centric fund leverages quantitative analysis and cutting-edge technology to identify and capitalize on opportunities across global financial markets. Fostering a collaborative and intellectually stimulating environment, they bring together individuals from Mathematics, Physics, and Computer Science backgrounds who are passionate about applying rigorous scientific methods to financial challenges. As a fundamentally data-driven business, their success depends heavily on the acquisition, processing, and analysis of vast datasets. High-quality, well-managed data forms the critical foundation for quantitative research, strategy development, and automated trading systems.

As a Data Engineer within the Quantitative Platform team, you'll play a pivotal role in building and maintaining the data infrastructure that fuels research and trading strategies. You will be responsible for the end-to-end lifecycle of diverse datasets - including market, fundamental, and alternative sources - ensuring their timely acquisition, rigorous cleaning and validation, efficient storage, and reliable delivery through robust data pipelines.

Working closely with quantitative researchers and technologists, you will tackle complex challenges in data quality, normalization, and accessibility, ultimately providing the high-fidelity, readily available data essential for developing and executing sophisticated investment models in a fast-paced environment.

Responsibilities:

  • Evaluating, onboarding, and integrating complex data products from diverse vendors, serving as a key technical liaison to ensure data feeds meet the stringent requirements for research and live trading.
  • Designing, implementing, and optimizing robust, production-grade data pipelines to transform raw vendor data into analysis-ready datasets, adhering to software engineering best practices and ensuring seamless consumption by automated trading systems.
  • Engineering and maintaining sophisticated automated validation frameworks to guarantee the accuracy, timeliness, and integrity of all datasets, directly upholding the quality standards essential for the efficacy of quantitative strategies.
  • Providing expert operational support for the data pipelines, rapidly diagnosing and resolving critical issues to ensure the uninterrupted flow of high-availability data powering daily trading activities.
  • Participating actively in team rotations, including on-call schedules, to provide essential coverage and maintain the resilience of data systems outside of standard business hours.

Requirements:
  • 5+ years' experience building ETL/ELT pipelines using Python and pandas within a financial environment.
  • Strong knowledge of relational databases and SQL.
  • Familiarity with data-infrastructure technologies such as S3, Kafka, Airflow, and Iceberg.
  • Proficiency working with large financial datasets from various vendors.
  • A commitment to engineering excellence and pragmatic technology solutions.
  • A desire to work in an operational role at the heart of a dynamic data-centric enterprise.
  • Excellent communication skills and the ability to collaborate effectively within a team.

Nice to have:
  • Strong understanding of financial markets.
  • Experience working with hierarchical reference data models.
  • Proven expertise in handling high-throughput, real-time market data streams.
  • Familiarity with distributed computing frameworks such as Apache Spark.
  • Operational experience supporting real-time systems.


Whilst we carefully review every application to every job, the high volume of applications we receive means it is not possible to respond to candidates who have not been successful.

Contact
If this sounds like you, or you'd like more information, please get in touch:

George Hutchinson-Binks
george.hutchinson-binks@oxfordknight.co.uk
+44 (0)7885 545220
linkedin.com/in/george-hutchinson-binks-a62a69252

Location: London
Category: Technology