Be Part Of A High-Performing Team
This opportunity supports a global financial services institution recognized for its strong presence in banking, capital markets, and financial technology innovation. The organization invests heavily in modern data platforms and advanced analytics to enable faster decision-making, operational transparency, and scalable reporting across the enterprise.
The technology organization operates in a collaborative, fast-paced environment focused on modernizing data infrastructure and enabling enterprise-wide insights. Teams work closely with business stakeholders, data analysts, and engineering groups to build scalable data solutions and high-quality reporting platforms. Current initiatives include expanding cloud-based analytics capabilities, improving data quality frameworks, and implementing modern lakehouse architectures to support real-time and batch data processing.
What's In Store For You
- Engagement: W2 only (no C2C/1099)
- Opportunity to work on modern Azure-based data platforms and analytics ecosystems
- Exposure to enterprise data engineering practices in a highly regulated financial environment
- Collaborative Agile environment leveraging modern DevOps and CI/CD practices
- Opportunity to contribute to large-scale data infrastructure modernization initiatives
How You Will Make An Impact
- Design and implement scalable data pipelines and ETL workflows that integrate data from multiple enterprise sources.
- Build and maintain modern cloud-based data pipelines using Azure Data Factory, Databricks, and Spark.
- Develop and support data warehouse and reporting solutions on SQL Server environments.
- Optimize data ingestion and transformation processes to improve performance, reliability, and scalability.
- Implement data models, schemas, and data dictionaries that support efficient data storage and reporting.
- Perform data analysis, data profiling, and data quality validation to ensure data integrity.
- Develop Power BI dashboards, reports, and visualizations to support business intelligence initiatives.
- Monitor and tune database performance through query optimization and indexing strategies.
- Maintain documentation for data pipelines, schemas, and reporting solutions.
- Collaborate within Agile teams using Azure DevOps, Jira, and Confluence to deliver incremental features and improvements.
Are you a proven data engineering professional ready to build modern enterprise data platforms?
- 10+ years of experience in data engineering, data warehousing, or data platform development
- Strong experience designing and implementing ETL pipelines and data integration workflows
- Hands-on expertise with Azure Data Factory, Databricks, and Spark, using Python or Scala
- Experience working with Azure Data Lake Storage and Azure SQL databases
- Experience implementing Delta Lake, Medallion Architecture, and Unity Catalog
- Strong SQL skills and experience with on-prem SQL Server environments and SSIS
- Experience developing Power BI dashboards, reports, and data models
- Experience with data profiling, cleansing, and quality validation
- Experience with DevOps tools such as Azure DevOps, GitHub, or Jenkins
- Experience working in Agile environments using Jira and Confluence
- Strong analytical, troubleshooting, and communication skills
- Bachelor’s degree in Computer Science, Information Systems, or a related field
Preferred Qualifications
- Experience with API/REST integrations and JSON data processing
- Experience in banking or financial services environments
- Cloud certifications (Azure, AWS, or GCP)
- Familiarity with programming languages such as Java, JavaScript, or C/C++
- Experience with Docker, Kubernetes, or containerized workloads
- Experience with Power Apps, Power Automate, or Collibra data governance tools