Be Part Of A High-Performing Team
This opportunity supports a global financial services institution operating at the intersection of banking, risk management, and enterprise data governance. The organization is known for building highly scalable, secure platforms that support mission-critical financial operations and regulatory reporting.
The team focuses on modernizing enterprise data platforms using cloud-native architectures, Databricks Lakehouse frameworks, and advanced data engineering practices. Working within a highly regulated banking environment, the group collaborates closely with risk, compliance, and technology leaders to deliver secure, auditable, and high-performance data and application platforms. The work environment is collaborative and technically driven, with a strong emphasis on building enterprise-grade solutions that meet stringent regulatory and governance requirements.
What’s In Store For You
- Engagement: W2 only (no C2C/1099)
- Opportunity to contribute to enterprise data platform modernization initiatives within a leading financial institution.
- Work on cutting-edge Databricks Lakehouse architecture and data engineering frameworks.
- Hybrid work model providing a balance of in-office collaboration and remote flexibility.
- Exposure to enterprise-scale governance, risk, and compliance platforms within the banking industry.
How You Will Make An Impact
- Design and implement Databricks Lakehouse platforms using Medallion Architecture and Unity Catalog.
- Build and deploy Databricks applications and services using advanced Python.
- Develop scalable batch and streaming data pipelines on Azure Databricks.
- Create full-stack enterprise applications using Python frameworks including Django and Flask.
- Design and deploy Streamlit-based user interfaces supporting enterprise workflows and data exploration.
- Deploy Python-based services and applications through Kubernetes and cloud infrastructure services.
- Implement BPMN workflow orchestration within Django applications for enterprise process automation.
- Integrate enterprise data platforms with Collibra for governance, lineage, and certification.
- Ensure solutions meet enterprise-grade security, regulatory compliance, and auditability requirements.
- Collaborate with data engineering, risk, and compliance stakeholders to deliver highly scalable and governed platforms.
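To make the Medallion Architecture responsibility above concrete, here is a minimal, hedged sketch of Bronze → Silver → Gold layering in plain Python. In an actual Databricks deployment these layers would be Delta Lake tables processed with PySpark under Unity Catalog governance; the record fields and cleaning rules below are illustrative assumptions, not the client's schema.

```python
# Conceptual sketch of Medallion layering (Bronze -> Silver -> Gold).
# Production code would use PySpark and Delta Lake on Databricks; the
# field names and data-quality rules here are illustrative assumptions.

def to_silver(bronze_records):
    """Silver layer: validate and standardize raw Bronze records."""
    silver = []
    for rec in bronze_records:
        # Data-quality gate: drop records missing required fields.
        if rec.get("account_id") is None or rec.get("amount") is None:
            continue
        silver.append({
            "account_id": str(rec["account_id"]).strip(),
            "amount": float(rec["amount"]),
            "currency": str(rec.get("currency", "USD")).upper(),
        })
    return silver

def to_gold(silver_records):
    """Gold layer: aggregate cleaned records for reporting."""
    totals = {}
    for rec in silver_records:
        totals[rec["account_id"]] = totals.get(rec["account_id"], 0.0) + rec["amount"]
    return totals

bronze = [
    {"account_id": " A1 ", "amount": "100.5", "currency": "usd"},
    {"account_id": "A2", "amount": None},  # rejected at the Silver gate
    {"account_id": "A1", "amount": 49.5},
]
print(to_gold(to_silver(bronze)))  # {'A1': 150.0}
```

The same shape carries over to PySpark: each layer is a transformation from one Delta table to the next, with quality checks enforced at the Bronze-to-Silver boundary.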
Are you a proven Databricks and Python Full Stack professional ready to make an impact?
- 15+ years of experience in software development and enterprise data platforms.
- Deep hands-on expertise with Databricks platform development and deployment.
- Strong experience implementing Lakehouse or Lakebase architectures using Delta Lake.
- Proven implementation of Medallion Architecture (Bronze, Silver, Gold layers).
- Advanced Python development experience building scalable enterprise applications.
- Backend framework expertise with Django and/or Flask.
- Experience building data-driven applications directly on Databricks environments.
- UI development experience with Streamlit.
- Hands-on experience deploying applications using Kubernetes and Azure cloud services.
- Experience implementing CI/CD pipelines using GitHub Actions or similar DevOps platforms.
- Knowledge of Azure identity, security, and access control integration.
- Experience with BPMN workflow orchestration within Python-based applications.
- Data governance and lineage experience using Collibra.
- Banking or financial services industry experience (mandatory).
- Prior work supporting Risk, Finance, Compliance, or Regulatory Reporting platforms.
Required certifications:
- Databricks Certification
- Azure Certification
- Collibra Certification