Principal Data Engineer (Mortgage Services)
We are looking for a Principal Data Engineer to transform our mortgage data into a high-performance competitive advantage. In this role, you will own the end-to-end health of our data ecosystem—optimizing our current SQL Server and SSIS foundations on Windows Server while leading our strategic transition to agile Azure cloud platforms. You will be a hands-on leader focused on ensuring our Power BI dashboards provide the real-time accuracy required for mission-critical risk management and loan origination.
What You’ll Do
· Data Architecture & Strategy: Lead the development and maintenance of our SQL Server and cloud-based Data Warehouses (Snowflake/AWS), ensuring a focus on data governance, security, and performance optimization.
· Pipeline Engineering: Design and operate automated ETL/ELT pipelines to ingest data from Loan Origination Systems (LOS) like Encompass. You will build and optimize these processes using tools like SSIS, Snowpipe, dbt, or Azure Data Factory.
· Mortgage Data Modeling: Create and optimize Star and Snowflake schemas for complex mortgage domains (Staging, Funding, Servicing). You will ensure these structures provide fast, accurate data for Power BI dashboards and downstream applications.
· Governance & Compliance: Implement automated data-quality checks for sensitive financial data, maintaining full compliance with HMDA and CCPA regulations.
· Process Innovation: Identify and implement internal process improvements, such as automating manual tasks and re-designing infrastructure for greater scalability.
· Mentorship & Collaboration: Partner with financial analysts to translate complex business rules (e.g., DTI/LTV ratio calculations) into technical logic. You will also provide hands-on mentorship and lead code reviews for the engineering team.
What You’ll Bring
· 5–7+ years of hands-on experience as a SQL Server Developer or Data Engineer, with a deep background in high-volume mortgage or financial datasets.
· Expert SQL: Proven mastery of T-SQL, including stored procedures, indexing, and query optimization (using tools like SSMS, Profiler, and Redgate).
· Programming: Proficiency in Python, PowerShell, or R for data scripting and automation.
· Cloud & Big Data: 2+ years of experience with Snowflake, AWS RDS, or Azure. Familiarity with event streaming (Kafka) and containerization (Docker) is preferred.
· Hands-on experience with orchestration and ETL tools such as dbt, SSIS, Fivetran, or Apache Airflow.
· Demonstrated experience in cloud security methodologies and role-based access controls to meet CISO and financial compliance requirements.
Preferred Industry Expertise
· Strong understanding of the mortgage industry lifecycle (Origination -> Underwriting -> Closing -> Servicing).
· Familiarity with LLMs, generative AI workflows, and data lake architectures to power future decision-making models.
· Experience working within Agile software development methodologies to support business continuity.