Senior Software Engineer - Data Infrastructure (Kafka, IaC)


London, UK

Snyk is the leader in secure AI software development, helping millions of developers develop fast and stay secure as AI transforms how software is built. Our AI-native Developer Security Platform integrates seamlessly into development and security workflows, making it easy to find, fix, and prevent vulnerabilities — from code and dependencies to containers and cloud.

Our mission is to empower every developer to innovate securely in the AI era — boosting productivity while reducing business risk. We’re not your average security company: we build Snyk on our values of One Team, Care Deeply, Customer Centric, and Forward Thinking.

It’s how we stay driven, supportive, and always one step ahead as AI reshapes our world.

Overview

As a Senior Software Engineer, you’ll play a key role in shaping how modern organizations build and secure software at scale. You’ll join a collaborative, forward-thinking team and help drive our mission to embed security into every part of the AI-native development lifecycle.

You’ll join a team of exceptionally strong and talented engineers who are the backbone of our data infrastructure. You’ll take ownership of high-scale Kafka clusters on Confluent and AWS MSK and our data warehousing in Snowflake, empowering our organization with real-time data and shaping our foundational platform.

Within our Data Infrastructure team, you’ll maintain and improve our Confluent and AWS MSK Kafka clusters and their integration with our data warehouse in Snowflake. This is a true platform role where you’ll not only build but also empower other engineering teams to leverage these systems effectively.

Responsibilities

  • Designing, implementing, and managing large-scale Confluent and AWS MSK Kafka clusters, ensuring high availability, performance, and scalability.
  • Developing and maintaining Infrastructure as Code solutions for Kafka and Snowflake platforms using Terraform.
  • Crafting and architecting complex data pipelines within cloud-based distributed systems.
  • Defining and enforcing best practices, governance, and self-service capabilities for data platforms to improve developer experience.
  • Writing code primarily in Go, and occasionally Python or TypeScript, and deploying solutions through a CI/CD workflow.
  • Conducting performance tuning, monitoring, and troubleshooting of Kafka clusters and data pipelines to maintain operational excellence.
  • Collaborating with engineering teams to understand data needs and enable them to build reliable, high-throughput applications on top of our platforms.
  • Leading technical projects, mentoring junior engineers, and contributing to the technical roadmap of the team.
  • Staying ahead of industry trends and evaluating new technologies to continually improve our data infrastructure.

What You’ll Need

  • At least 5 years of commercial experience as a Software or Platform Engineer, with a strong focus on data infrastructure.
  • Hands-on production experience with Kafka, including cluster administration, security (SSL, SASL, ACLs), and performance optimization.
  • Extensive experience with Infrastructure as Code, particularly Terraform.
  • Understanding of distributed systems architecture, including fault tolerance, message ordering, and partition strategies.
  • Proficiency in at least one core programming language (Go or TypeScript), and willingness to learn new languages and technologies.
  • Experience with cloud platforms (AWS, GCP, or Azure), especially data-related services.
  • Experience with data warehousing concepts and tools, specifically Snowflake.
  • A track record of building and operating reliable, scalable back-end systems.
  • Strong commitment to code quality, testing, and operational excellence.
  • Excellent communication and collaboration skills, with the ability to work with diverse technical teams.

Nice to Have

  • Experience with Kubernetes in production.
  • Experience with other Kafka ecosystem tools like Schema Registry and Kafka Connect.
  • Passion for data and solving large-scale technical problems.
  • Strong sense of ownership and drive to build sustainable, long-term solutions.

We care deeply about an inclusive environment and welcome applications from candidates typically underrepresented in tech. If you like the sound of this role but are not totally sure whether you’re the right person, please apply anyway.

About Snyk

Snyk is committed to creating an inclusive environment where our employees can thrive as we rally behind our mission to make the digital world a safer place. We aim to support our employees along their entire journeys here at Snyk.

Benefits & Programs

Prioritize health, wellness, financial security, and life balance with programs tailored to your location and role.

  • Flexible working hours, work-from-home allowances, in-office perks, and time off for learning and self-development
  • Generous vacation and wellness time off, country-specific holidays, and 100% paid parental leave for all caregivers
  • Health benefits, employee assistance plans, and annual wellness allowance
  • Country-specific life insurance, disability benefits, and retirement/pension programs, plus mobile phone and education allowances

We care deeply about safeguarding applicant privacy and use data in accordance with our Recruitment Privacy Statement. If you have questions about data processing, contact privacy@snyk.io.
