Job Description
We are seeking an experienced Modern Data Engineer to join a major UK government programme on a contract basis. This is a remote-first, long-term engagement supporting critical national data platforms.
Role Overview
You will design, build, and maintain scalable data platforms and pipelines using modern lakehouse and cloud technologies, ensuring high standards of security, performance, and reliability.
Key Responsibilities
- Design and develop data pipelines using Python, Spark, and Databricks
- Implement lakehouse solutions following the Medallion architecture (Bronze, Silver, Gold layers)
- Build and optimise Delta Lake solutions
- Develop reusable accelerators and automation frameworks
- Apply best practices in data modelling and analytics engineering
- Integrate AI and automation into data workflows
- Ensure compliance with government security and data governance standards
- Collaborate with technical and business stakeholders
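As context for the Medallion architecture mentioned above, the sketch below illustrates the Bronze (raw landing), Silver (cleaned and deduplicated), and Gold (business aggregate) layering. It is a toy illustration in plain Python: in the actual role these layers would be Delta Lake tables built with Spark/Databricks, and all field names (`order_id`, `region`, `amount`) are invented for the example.

```python
# Toy sketch of Medallion layering (Bronze -> Silver -> Gold).
# Real implementations use Delta Lake tables and Spark transformations;
# plain dicts stand in here purely to show the shape of each layer.
from collections import defaultdict

def bronze_ingest(raw_rows):
    """Bronze: land raw records unchanged, adding only lineage metadata."""
    return [dict(row, _source="landing") for row in raw_rows]

def silver_clean(bronze_rows):
    """Silver: validate, normalise, and deduplicate on a business key."""
    seen, out = set(), []
    for row in bronze_rows:
        if row.get("amount") is None:   # drop invalid records
            continue
        key = row["order_id"]
        if key in seen:                 # keep first occurrence only
            continue
        seen.add(key)
        out.append({"order_id": key,
                    "region": row["region"].strip().upper(),
                    "amount": float(row["amount"])})
    return out

def gold_aggregate(silver_rows):
    """Gold: business-level aggregate (revenue per region)."""
    totals = defaultdict(float)
    for row in silver_rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

raw = [
    {"order_id": 1, "region": " uk ", "amount": "10.0"},
    {"order_id": 1, "region": "uk",   "amount": "10.0"},  # duplicate
    {"order_id": 2, "region": "uk",   "amount": None},    # invalid
    {"order_id": 3, "region": "fr",   "amount": "5.5"},
]
gold = gold_aggregate(silver_clean(bronze_ingest(raw)))
# gold == {"UK": 10.0, "FR": 5.5}
```

Each layer is a pure function of the previous one, mirroring how Medallion pipelines progressively refine data while keeping the raw Bronze layer replayable.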
Required Skills & Experience
- Active SC Clearance (non-negotiable)
- Strong experience with Python, Spark, and Databricks
- Hands-on experience with Delta Lake and Lakehouse platforms
- Solid SQL and data modelling expertise
- Proven experience building enterprise-grade data pipelines
- Experience in secure or regulated environments
Desirable Skills
- Cloud experience (Azure / AWS / GCP)
- Exposure to AI/ML pipelines
- Public sector or government project experience