We’re seeking experienced Data Engineers to join a major, long-term public sector data transformation programme. This is an opportunity to take ownership of end-to-end data engineering solutions: designing, building, and optimising complex data pipelines while solving challenging data quality and integration issues.
You’ll play a key role in delivering modern data architectures using Azure and Databricks, collaborating with cross-functional teams to ensure performance, scalability, and security across the data estate.
What you’ll need
✅ Proven experience designing, implementing, and maintaining complex data engineering solutions
✅ Deep technical expertise in SQL, Azure, Databricks, and Python
✅ Strong understanding of data pipelines, data quality frameworks, and ETL orchestration (a brief illustrative pipeline sketch follows this list)
✅ Familiarity with tools such as Delta Lake, Unity Catalog, and Power BI is advantageous
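To give a flavour of the day-to-day work, here is a minimal, illustrative sketch of a quality-gated Delta Lake load in PySpark, the kind of pipeline and data quality work described above. It assumes a Databricks or local Spark environment with the delta-spark package available; the file paths, column names (request_id, created_date), and quality rules are hypothetical and for illustration only, not a description of the programme’s actual codebase.

```python
# Minimal sketch of a quality-gated Delta Lake load.
# On Databricks the `spark` session is already configured for Delta; the
# builder below is only needed when running locally with delta-spark installed.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("illustrative_quality_pipeline")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Hypothetical landing-zone file; path, columns, and rules are illustrative.
raw = spark.read.option("header", "true").csv("/mnt/raw/service_requests.csv")

# Simple data quality rule: keep rows with a non-null id and a parseable date.
clean = (
    raw
    .withColumn("created_date", F.to_date("created_date"))
    .filter(F.col("request_id").isNotNull() & F.col("created_date").isNotNull())
)

rejected_rows = raw.count() - clean.count()
print(f"Rows failing quality checks: {rejected_rows}")

# Write the curated output as a Delta table to a hypothetical curated zone.
clean.write.format("delta").mode("overwrite").save("/mnt/curated/service_requests")
```

In practice the same pattern is extended to incremental, orchestrated loads with shared data quality frameworks and governance through tools such as Unity Catalog.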
What’s on offer