Senior Software Engineer - Data Platform

Location: Remote, Global  |  Team: Technology  |  Reporting to: Head of Data and Machine Learning

About AlgoQuant

At AlgoQuant, we’re building the future of digital asset management — grounded in rigorous research, world-class technology, and a relentless focus on performance. We began as a proprietary trading firm, developing sophisticated algorithmic strategies and operating in some of the most complex and fast-moving markets. That DNA remains at our core, but today we are evolving into a globally distributed investment management business.

Our quantitative environment empowers innovation by combining vast data capabilities, disciplined model development, and highly automated execution. Risk management is embedded in every layer of our systems and decision-making. Technology isn’t just an enabler for us; it’s a core competency and a strategic edge.

Role Overview

We are seeking an experienced Software Engineer to design, build, and maintain the next-generation data infrastructure that powers AlgoQuant’s quantitative research. This role is foundational to the creation of our internal Data Lake, which will enable a world-class research environment for our quantitative researchers.

You will work closely with our Head of Data and Machine Learning to architect a robust, scalable data ecosystem that integrates seamlessly with our research and trading workflows. This is a hands-on engineering role that combines deep technical expertise with collaboration across research, execution, and technology teams.

Key Responsibilities

Market Data Infrastructure

  • Design, build, and maintain real-time and batch data processing pipelines for market, alternative, and on-chain data sources.
  • Ensure high availability and low latency across critical ingestion and transformation processes.

Data Lake and Processing Platform

  • Develop and evolve our internal Data Lake and the surrounding data processing ecosystem.
  • Implement modern lakehouse technologies to enable scalable, queryable, and versioned data storage.

Data Quality & Monitoring

  • Build validation, monitoring, and alerting systems to guarantee the accuracy, consistency, and completeness of data.
  • Establish robust data quality frameworks and observability tooling across the pipeline stack.

Core Libraries & Frameworks

  • Develop and maintain internal Python and C++ libraries for feature calculation, data processing, backtesting, and ML inference.
  • Promote code reuse, performance optimization, and reproducibility across teams.

Collaboration & Stakeholder Support

  • Work closely with quant researchers, traders, and the execution team to understand data requirements and support their workflows.
  • Translate research and trading needs into reliable, production-grade data infrastructure.

Requirements

  • Experience: 10+ years in software engineering or data infrastructure development.
  • Languages: Expert-level Python and C++.
  • Distributed Data Systems: Proven experience with Spark, Flink, Slurm, Dask, or similar frameworks.
  • Data Lakehouse Technologies: Hands-on with Apache Iceberg, Delta Lake, or equivalent systems.
  • Messaging & Streaming: Strong experience with Kafka or similar streaming platforms.
  • Infrastructure: Proficient with Linux, Kubernetes, Docker, and workflow orchestrators like Airflow.
  • Machine Learning Exposure: Familiarity with PyTorch, TensorFlow, or model inference frameworks.
  • Cloud Platforms: Experience deploying and maintaining systems on AWS, GCP, or Azure.
  • AI Engineering Tools: Experience using Claude Code, GitHub Copilot, Codex, or similar AI-assisted coding tools.

Nice to Have

  • Experience working with financial market data, cryptocurrency exchange APIs, on-chain data, or alternative data sources.
  • Familiarity with quantitative research environments or systematic trading systems.
  • Contributions to open-source data or ML infrastructure projects.

What Success Looks Like

  • The Data Lake and surrounding platform become the trusted foundation of AlgoQuant’s research environment.
  • Data pipelines and libraries are robust, scalable, and observable, enabling fast iteration for researchers.
  • Researchers and traders have reliable, self-service access to clean, validated data.
  • Collaboration between research, trading, and infrastructure teams results in accelerated innovation and strategy deployment.

Why Join AlgoQuant

  • Help build the data backbone of a cutting-edge digital asset investment platform.
  • Collaborate directly with world-class engineers, researchers, and quants.
  • Be part of a fully remote, high-performance culture that values innovation, autonomy, and continuous learning.
  • Shape the future of data infrastructure in a company where technology drives alpha.
Location: England, United Kingdom  |  Salary: £80,000 - £100,000  |  Job Type: Full-time  |  Category: IT & Technology