Be Part Of A High-Performing Team
This role supports a large-scale digital transformation within a global financial services organization focused on modernizing its technology stack and strengthening a data-driven operating model. The team sits within the Data Strategy function supporting capital markets and securities businesses across the Americas, working closely with global stakeholders to build a strategic enterprise data platform.
The environment is collaborative and delivery-focused, with strong emphasis on engineering standards, cloud-first architecture, and scalable data solutions. This team is responsible for foundational data platforms that support critical reference, pricing, and market data domains, enabling analytics, reporting, and downstream business systems.
What’s In Store For You
This engagement offers hands-on exposure to enterprise-scale Azure data platforms within a regulated financial services environment. The role provides the opportunity to work on strategic, long-term data initiatives, collaborate with distributed global teams, and contribute to the buildout of core data capabilities that support capital markets operations.
How You Will Make An Impact
- Design, build, and enhance data pipelines and services on Azure to support a strategic enterprise data platform
- Develop and maintain reference and pricing data solutions used across capital markets and securities teams
- Collaborate with data strategy, engineering, and business stakeholders to deliver scalable cloud-based solutions
- Implement and support ETL/ELT processes aligned with enterprise data standards
- Build and expose data services and APIs to enable downstream consumption
- Contribute to cloud modernization efforts following internal development, security, and DevOps standards
Are you an experienced Data Engineer ready to make an impact?
- 5–7 years of experience as a Data Engineer or Software Engineer in enterprise environments
- Strong hands-on experience with Azure cloud services, including Azure Data Factory and Azure Data Lake Storage Gen2
- Experience working with Azure Databricks for data processing and transformation
- Proficiency in Python, including API development using FastAPI or similar frameworks
- Strong SQL skills with experience across relational and NoSQL databases
- Experience with Azure-native components such as Azure Functions, Azure API Management, and Azure database services
- Solid understanding of ETL/ELT concepts and data pipeline design
- Familiarity with CI/CD and DevOps practices (Git, Jenkins, automated deployments)
- Exposure to financial services, capital markets, financial instruments, or market data is a plus
- Strong communication skills and ability to work with distributed, cross-functional teams