Senior Data Engineer (Kafka) (Remote - Europe)
This role offers the opportunity to own and shape a cutting-edge real-time data streaming platform, ensuring reliable, scalable event-driven data flows across complex systems. You will design and operate Kafka pipelines, enforce governance and data quality standards, and collaborate with cross-functional teams in a fast-paced, international, and fully remote environment.
Accountabilities
- Designing and maintaining Kafka-based pipelines (Confluent Cloud or self-managed) to support scalable, reliable event flows
- Architecting topics, partitions, and retention policies while implementing ingestion solutions with Kafka Connect/Debezium
- Enforcing data quality and governance with schema validation (Avro/Protobuf), compliance handling (GDPR/CCPA), and monitoring tools
- Ensuring high availability, disaster recovery, and secure operations through CI/CD, infrastructure as code, and cost optimization
- Collaborating with data, analytics, product, and marketing teams to define event-driven interfaces and service-level agreements
- Continuously improving latency, throughput, and reliability while evaluating new streaming tools and emerging best practices
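To give a flavour of the schema-validation work described above: before producing to a topic, events are checked against a registered schema so malformed records never enter the pipeline. In production this would use Confluent Schema Registry with Avro or Protobuf; the minimal Python sketch below uses a plain dict schema and a hypothetical `validate_event` helper purely for illustration.

```python
# Simplified stand-in for schema validation before producing to Kafka.
# Real pipelines use Confluent Schema Registry with Avro/Protobuf serializers;
# ORDER_EVENT_SCHEMA and validate_event() here are illustrative only.

ORDER_EVENT_SCHEMA = {  # hypothetical event schema: field -> expected type
    "order_id": str,
    "amount_cents": int,
    "currency": str,
}

def validate_event(event: dict, schema: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the event is valid."""
    errors = []
    for field, expected_type in schema.items():
        if field not in event:
            errors.append(f"missing field: {field}")
        elif not isinstance(event[field], expected_type):
            errors.append(f"bad type for {field}: expected {expected_type.__name__}")
    return errors

good = {"order_id": "o-1", "amount_cents": 1999, "currency": "EUR"}
bad = {"order_id": "o-2", "amount_cents": "1999"}  # wrong type, missing currency

assert validate_event(good, ORDER_EVENT_SCHEMA) == []
assert len(validate_event(bad, ORDER_EVENT_SCHEMA)) == 2
```

In a real deployment the equivalent check is enforced centrally by the registry's compatibility rules rather than in application code.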
Requirements
- 6+ years of experience in data engineering, with strong expertise in event-driven systems
- At least 3 years of hands-on production experience with Kafka (Confluent Platform/Cloud), including Kafka Connect, Schema Registry, and Kafka Streams/ksqlDB (or Flink/Spark alternatives)
- Proficiency in Python for building services, tooling, and test frameworks
- Strong knowledge of event modeling, data formats (Avro, Protobuf, JSON), idempotency, and DLQ handling
- Excellent communication skills in English (C1 level) and experience working in consulting, agency, or client-facing environments
- Familiarity with cloud platforms (AWS, Azure, or GCP), container orchestration (Docker/Kubernetes), CI/CD pipelines, and Git
- Experience with observability tools (Prometheus, Grafana, Datadog) and incident response processes is highly valued
- Additional experience with Kotlin for Kafka Streams, Jupyter for analysis, IaC (Terraform/CloudFormation), CDPs (mParticle), and tag management tools (Tealium) is a strong plus
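The idempotency and DLQ handling mentioned in the requirements can be sketched as follows. This is a simplified illustration, not a production pattern: plain Python collections stand in for Kafka topics and offset tracking, and `process_batch` and its parameters are hypothetical names.

```python
# Sketch of idempotent consumption with dead-letter-queue (DLQ) routing.
# In production this logic sits inside a Kafka consumer loop with committed
# offsets and a real DLQ topic; lists and sets stand in for those here.

def process_batch(records, seen_ids, handler):
    """Process each record at most once per event_id; failures go to the DLQ."""
    processed, dlq = [], []
    for record in records:
        event_id = record.get("event_id")
        if event_id in seen_ids:            # idempotency: skip duplicates
            continue
        try:
            handler(record)
            seen_ids.add(event_id)
            processed.append(record)
        except Exception as exc:            # route poison messages to the DLQ
            dlq.append({"record": record, "error": str(exc)})
    return processed, dlq

def handler(record):
    if "payload" not in record:
        raise ValueError("missing payload")

records = [
    {"event_id": "e1", "payload": 1},
    {"event_id": "e1", "payload": 1},   # duplicate, skipped
    {"event_id": "e2"},                 # poison message, routed to DLQ
]
processed, dlq = process_batch(records, set(), handler)
assert len(processed) == 1 and len(dlq) == 1
```

Keeping the DLQ entry alongside the error string preserves enough context to replay or triage failed events later.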
Benefits
- 100% remote work setup within Europe, with flexibility across time zones
- A fast-moving, innovative environment where learning and growth are constant
- Career progression paths with opportunities for advancement
- International exposure, working with teams across Europe, the US, and beyond
- A supportive and collaborative work culture that values initiative and creativity
- Competitive compensation package with room for growth
- Regular virtual and in-person team events to foster strong connections
- Location: United Kingdom
- Job Type: Full-time
- Category: IT & Technology