Job Description
Team: IT
This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Senior Software Engineer, AI Data in the United Kingdom.
This role offers the opportunity to design and build scalable, high-performance systems that power next-generation AI data platforms. You will work on mission-critical pipelines that support large-scale model training and evaluation, handling millions of inference calls and many hours of processed data. The role combines software engineering rigor with AI-focused infrastructure, giving you the chance to shape technical execution, optimize data workflows, and drive innovation in a fast-paced, high-impact environment. You will collaborate closely with researchers, platform engineers, and other stakeholders to deliver reliable, maintainable, and cost-efficient systems that accelerate AI model development. This is an ideal position for engineers who thrive in a startup-like culture where ownership, technical excellence, and measurable impact are paramount.
Accountabilities:
- Architect and implement scalable AI data infrastructure to support model training and evaluation at scale
- Build efficient, self-serve data processing pipelines leveraging cloud services and distributed systems
- Design cost-effective storage, monitoring, and resource management solutions to maximize efficiency
- Lead adoption of cutting-edge ML/AI tools and frameworks to enhance team velocity and system reliability
- Streamline workflows, introduce new tooling, and maintain high-quality documentation for engineering processes
- Troubleshoot and resolve complex technical issues while improving system performance, quality, and cost-efficiency
- Participate in on-call rotations to ensure operational reliability of AI data platforms
Requirements:
- 5+ years of professional software engineering experience with strong Python and SQL proficiency
- Solid understanding of software engineering fundamentals: data structures, algorithms, system design, architectural patterns, and testing strategies
- Experience with RESTful APIs, distributed systems, and containerization (Docker) in cloud environments
- Proven ability to deliver high-quality, maintainable code in collaborative team settings
- Strong communication and stakeholder management skills, with the ability to explain technical concepts clearly
- Startup mindset: able to navigate changing priorities, rapid iteration, and pragmatic decision-making
Preferred Qualifications:
- Experience with GCP services (BigQuery, GCS, Cloud Run, GKE)
- Familiarity with distributed processing frameworks (Apache Beam, PySpark)
- Knowledge of workflow orchestration tools (Airflow, Prefect, Dagster)
- Background in ML/AI infrastructure, monitoring tools (Datadog), or data engineering roles
- Experience collaborating directly with researchers
Benefits:
- Competitive salary with equity grants and location-adjusted compensation
- Fully remote work with flexible hours and autonomy over work-life balance
- Comprehensive employer-paid health benefits
- Access to cutting-edge AI tools and frameworks, fostering skill growth and innovation
- Collaborative, high-impact environment with opportunities to shape technical strategy
- Professional development opportunities including mentorship, training, and learning resources
Date Posted: 04/08/2026