District Partners has been retained by a well-established financial investment organization based in Cleveland, OH, that is actively modernizing and restructuring its enterprise data platform. As part of a broader Azure platform evolution initiative, the firm is expanding its data engineering capability.
This is a foundational role on a data team being built from the ground up. The current platform includes thousands of SQL tables and legacy workflows that require stabilization, rationalization, and architectural normalization. The first phase of this position will focus heavily on cleaning up and stabilizing the existing data architecture to restore consistency, reliability, and governance across the environment.
This is not a build-and-transition role. Stability, performance, and data integrity are core expectations. The right candidate takes ownership of production pipeline health and plays a key role in transforming the data platform from reactive operations into a scalable, well-architected Azure ecosystem.
Strategic Context
The data function underpins several enterprise-critical initiatives, including:
- Salesforce roadmap execution and CRM expansion.
- Complex ERP integration across core financial systems.
- Development of a client-facing data portal.
- Formalization of governance and steering processes.
- Evolution toward standardized Azure data lake architecture.
The role begins with operational stabilization and transitions into architectural refinement and scalability.
Core Responsibilities
- Design, develop, and maintain scalable ETL/ELT pipelines using Azure Data Factory and related Azure services.
- Support and optimize Azure SQL, Synapse, and Azure Data Lake environments.
- Write, tune, and optimize complex SQL queries and stored procedures.
- Develop data transformation logic using Python.
- Stabilize and normalize thousands of SQL tables and production workflows.
- Own monitoring, logging, alerting, and incident response across live data pipelines.
- Conduct structured root-cause analysis and implement durable remediation.
- Implement validation frameworks and data quality controls.
- Maintain documentation, change history, and deployment traceability.
- Contribute to CI/CD workflows and Azure DevOps processes.
- Partner with analytics stakeholders to ensure datasets are structured for downstream reporting and visualization.
Required Qualifications
- 5–7+ years of experience building and supporting data integrations in production environments.
- Advanced SQL expertise with demonstrated performance tuning experience.
- Hands-on Azure Data Factory experience.
- Strong Python development experience.
- Proven ownership of live production data platforms.
- Familiarity with data visualization tools (Power BI, Tableau, or comparable).
- Demonstrated experience operating effectively in lean, evolving, or non-mature data environments.
- Required industry experience in financial services, fintech, insurance, or portfolio management data environments.
- Familiarity with financial datasets such as portfolio systems, broker-dealer platforms, or retirement/IRA data structures.
- U.S. Citizen or U.S. Permanent Resident status required.
- Strong documentation and stakeholder communication skills.
Additional Experience Valued
- Experience with Azure Synapse or comparable analytics platforms.
- ERP and CRM integration experience.
- CI/CD and infrastructure-as-code exposure.
- Azure-related certifications.
Growth Trajectory
Clear progression from Mid-Level Engineer to Senior Engineer and Architecture-level contributor as the data platform stabilizes and scales.
Location: Hybrid, Cleveland, OH (Remote Considered)
Employment Type: Full-Time or Contract-to-Hire