IT Data Architect (Modern BI / Data Platform)
- Hands-on Data Architect with deep experience in Property & Casualty (P&C) insurance, designing enterprise data models across Claims, Policy, Billing, and Underwriting domains
- Strong domain expertise in translating insurance concepts such as loss ratio, combined ratio, reserves, premium calculations, and underwriting profitability into scalable, analytics-ready data structures
- Experience building data models that support key insurance use cases including claims analytics, fraud detection, policy lifecycle reporting, and underwriting profitability marts
- Proven ability to modernize legacy BI and ETL ecosystems (DataStage, Oracle, Informatica) into modern, cloud-native data platforms
- Extensive experience with Snowflake, dbt, Airflow, and Python, designing scalable, high-performance data pipelines and analytics layers
- Leads end-to-end architecture design, including current-state analysis, future-state definition, and incremental migration strategies that reduce risk and ensure business continuity
- Strong background in enterprise data modeling techniques, including dimensional modeling, normalized models, and hybrid/data mesh approaches
- Designs and enforces conformed data layers and governed data products, enabling consistency, reuse, and cross-domain analytics at scale
- Builds component-based, reusable data engineering frameworks to eliminate redundant pipelines and standardize ingestion, transformation, and delivery patterns
- Drives federated yet governed data architecture, balancing domain autonomy with enterprise standards and governance
- Owns and operationalizes DataOps best practices, including data-as-code, Git-based version control, CI/CD pipelines, automated testing, and deployment frameworks
- Implements data quality, lineage, monitoring, and observability solutions to ensure trust, reliability, and transparency across the data ecosystem
- Deep expertise in Snowflake architecture, including RBAC, workload isolation, warehouse sizing, performance tuning, environment strategy, and cost governance
- Establishes standardized dbt development frameworks, including project structure, reusable macros, testing strategies, and documentation standards
- Defines and implements orchestration patterns using tools such as Airflow, Dagster, or Prefect for scalable, maintainable pipeline execution
- Partners with engineering and platform teams to integrate modern ingestion tools (e.g., Fivetran, Airbyte) and data catalog/observability platforms (e.g., Alation, Monte Carlo)
- Acts as a player-coach leader, mentoring engineers, leading workshops, and embedding with teams to drive adoption of modern data engineering and analytics best practices
- Collaborates closely with front-line managers, business stakeholders, and technical teams to align data architecture with business priorities and regulatory requirements
- Experienced in insurance regulatory and compliance frameworks, ensuring data models and pipelines support reporting and audit requirements
- Designs data platforms with a forward-looking approach to support AI/ML initiatives, including enabling structured data layers for RAG-based solutions, analytics assistants, and intelligent data products
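The loss ratio and combined ratio concepts cited above can be sketched as a minimal Python calculation. This is an illustrative sketch only: the class name, field names, and figures are hypothetical, and the formulas are the simplified textbook versions (incurred losses over earned premium, plus the expense ratio).

```python
from dataclasses import dataclass


@dataclass
class UnderwritingPeriod:
    """Simplified P&C underwriting metrics for one reporting period (illustrative)."""
    earned_premium: float
    incurred_losses: float       # paid losses plus change in reserves
    underwriting_expenses: float

    def loss_ratio(self) -> float:
        # Incurred losses as a share of earned premium
        return self.incurred_losses / self.earned_premium

    def expense_ratio(self) -> float:
        return self.underwriting_expenses / self.earned_premium

    def combined_ratio(self) -> float:
        # A combined ratio under 1.0 indicates an underwriting profit
        return self.loss_ratio() + self.expense_ratio()


period = UnderwritingPeriod(earned_premium=1_000_000,
                            incurred_losses=650_000,
                            underwriting_expenses=280_000)
print(f"loss ratio:     {period.loss_ratio():.2f}")      # 0.65
print(f"combined ratio: {period.combined_ratio():.2f}")  # 0.93
```

In an analytics-ready model these measures would typically be derived in a profitability mart rather than computed ad hoc, so every consumer sees the same definitions.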
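The data quality checks mentioned above can be illustrated with a minimal, framework-free sketch. The check names, the sample `claims` rows, and the `run_checks` helper are all hypothetical, standing in for what tools such as dbt tests or Monte Carlo monitors provide in practice.

```python
from typing import Callable

Row = dict  # one record from any source table


def check_not_null(rows: list[Row], column: str) -> bool:
    """True if no row has a NULL in the given column."""
    return all(r.get(column) is not None for r in rows)


def check_unique(rows: list[Row], column: str) -> bool:
    """True if the column's values are distinct across rows."""
    values = [r[column] for r in rows]
    return len(values) == len(set(values))


def run_checks(rows: list[Row],
               checks: dict[str, Callable[[list[Row]], bool]]) -> dict[str, bool]:
    # Evaluate each named check; a real framework would also emit
    # metrics, alerts, and lineage metadata alongside pass/fail.
    return {name: check(rows) for name, check in checks.items()}


claims = [
    {"claim_id": "C-1", "policy_id": "P-9", "paid_amount": 1200.0},
    {"claim_id": "C-2", "policy_id": "P-9", "paid_amount": None},
]
results = run_checks(claims, {
    "claim_id_not_null":    lambda rows: check_not_null(rows, "claim_id"),
    "claim_id_unique":      lambda rows: check_unique(rows, "claim_id"),
    "paid_amount_not_null": lambda rows: check_not_null(rows, "paid_amount"),
})
print(results)  # paid_amount_not_null fails on the NULL payment
```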
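The dependency-ordered execution that orchestrators such as Airflow, Dagster, or Prefect guarantee can be sketched with Python's standard-library `graphlib`. The task names and the `pipeline` graph are hypothetical; real tasks would invoke dbt runs, ingestion jobs, and publishing steps.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each key depends on the tasks in its value set
pipeline = {
    "ingest_policies":    set(),
    "ingest_claims":      set(),
    "stage_claims":       {"ingest_claims"},
    "build_claims_mart":  {"stage_claims", "ingest_policies"},
    "publish_dashboards": {"build_claims_mart"},
}


def run(graph: dict[str, set[str]]) -> list[str]:
    """Execute tasks in dependency order, as an orchestrator would."""
    order = list(TopologicalSorter(graph).static_order())
    for task in order:
        print(f"running {task}")  # a real task would call out to dbt, Python jobs, etc.
    return order


order = run(pipeline)
```

The point of the pattern is that dependencies are declared, not hard-coded as a call sequence, so the orchestrator can parallelize independent branches and retry failed tasks in isolation.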