Data Engineer - Databricks / AI
Jacobs
Software Engineering, Data Science
Manchester, UK
At Jacobs, we're challenging today to reinvent tomorrow by solving the world's most critical problems for thriving cities, resilient environments, mission-critical outcomes, operational advancement, scientific discovery and cutting-edge manufacturing, turning abstract ideas into realities that transform the world for good.
If you enjoy designing elegant data systems, shipping production-grade code, and seeing your work make a measurable difference, then this role is for you!
Our team builds data platforms and AI solutions that power critical infrastructure, transform operations, and move entire industries.
In this role you will work with a broad set of clients on high-scale problems, backed by a global organisation investing heavily in Azure, Databricks, and applied AI. You will:
- Work primarily on Azure + Databricks (Spark, Delta Lake, Unity Catalog)
- Ship modern ELT/ETL, streaming and batch data products, and ML/AI pipelines
- Operate at serious scale across water, transport, energy, and more
- Join a collaborative, engineering-led culture with real investment in platforms and tooling
You'll utilise a stack including:
- Azure: ADLS Gen2, Event Hubs, ADF/Azure Data Factory or Synapse pipelines, Functions, Key Vault, VNets
- Databricks: Spark, Delta Lake, Unity Catalog, Workflows, MLflow (experiments, model registry)
- Languages: Python (PySpark), SQL (Delta SQL), optional Scala
- Engineering: Git, pull requests, code review, unit/integration tests, dbx, notebooks as code
- Platform & Ops: Azure DevOps/GitHub, CI/CD, Terraform or Bicep, monitoring/alerting
Your remit and responsibilities will include:
- Design & build robust data platforms and pipelines on Azure and Databricks (batch + streaming) using Python/SQL, Spark, Delta Lake, and Data Lakehouse patterns.
- Develop AI-enabling foundations: feature stores, ML-ready datasets, and automated model-serving pathways (MLflow, model registries, CI/CD).
- Own quality & reliability: testing (dbx/pytest), observability (metrics, logging, lineage), and cost/performance optimisation.
- Harden for enterprise: security-by-design, access patterns with Unity Catalog, data governance, and reproducible environments.
- Automate the boring stuff: IaC (Terraform/Bicep), CI/CD (Azure DevOps/GitHub Actions), and templated project scaffolding.
- Partner with clients: translate business problems into technical plans, run workshops, and present trade-offs with clarity.
- Ship value continuously: iterate, review, and release frequently; measure outcomes, not just outputs.
Our team would be delighted to hear from candidates with a good mix of the following skills, experience and attributes:
- Strong SQL and Python for building reliable data pipelines.
- Hands-on with Spark (preferably Databricks) and modern data modelling (e.g., Kimball/Inmon/Data Vault, lakehouse).
- Experience running on a cloud data platform (ideally Azure).
- Sound software delivery practices: Git, CI/CD, testing, Agile ways of working.
- Streaming/event-driven designs (Event Hubs, Kafka, Structured Streaming).
- MPP/Data Warehouses (Synapse, Snowflake, Redshift) and NoSQL (Cosmos DB).
- ML enablement: feature engineering at scale, MLflow, basic model lifecycle know‑how.
- Infrastructure-as-code (Terraform/Bicep) and platform hardening.
Don’t meet every single bullet? We’d still love to hear from you. We hire for mindset and potential as much as current skills!
We value collaboration and believe that in-person interactions are crucial for both our culture and client delivery. We empower employees with our hybrid working policy, allowing them to split their work week between Jacobs offices/projects and remote locations enabling them to deliver their best work.
Your application experience is important to us, and we’re keen to adapt to make every interaction even better. If you require further support or reasonable adjustments with regards to the recruitment process (for example, you require the application form in a different format), please contact the team via Careers Support.
| City | State | Country |
|---|---|---|
| London | Greater London | United Kingdom |
| Glasgow | Lanarkshire | United Kingdom |
| Manchester | Greater Manchester | United Kingdom |


