Senior Data Engineer (AWS / Python / SQL / Snowflake) - 3372348
Charlotte, NC

*Remote Contract-to-Hire Opportunity*

 

Lighthouse Technology Services is partnering with our client to fill their Senior Data Engineer (AWS / Python / SQL / Snowflake) position! This is a 6+ month contract-to-hire opportunity. The role can be performed remotely within the United States, with quarterly travel to Charlotte, NC. You will be a W2 employee of Lighthouse Technology Services.


Position Overview


We are seeking a Senior Data Engineer with 10+ years of experience to design, build, and optimize scalable data ingestion pipelines within a modern cloud data platform. This role will focus heavily on Python development, advanced SQL, and Snowflake data warehouse administration, ensuring data pipelines are performant, reliable, and cost-optimized.


The ideal candidate has deep hands-on experience building enterprise-scale data ingestion pipelines, integrating multiple source systems (databases and APIs), and implementing data quality validation. This individual will play a key role in improving existing ingestion frameworks while helping design and implement future data architecture solutions.


What You'll Be Doing:


Build & Optimize Data Pipelines

  • Design, develop, and maintain scalable data ingestion pipelines that move data from various source systems into Snowflake.
  • Build ingestion pipelines using Python, SQL, and modern ingestion tools such as Informatica IDMC and Fivetran.
  • Connect to external APIs using Python, designing custom ingestion frameworks where pre-built tools are not sufficient.
  • Implement data validation logic and quality checks within ingestion pipelines to ensure the integrity of replicated data (a minimal sketch of this pattern follows this list).
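
For illustration, here is a minimal sketch of the kind of validation-aware ingestion step these bullets describe. The endpoint, response shape, field names, and target table are hypothetical placeholders, not details of the client's environment; the load step assumes a snowflake-connector-python cursor, which uses pyformat parameter binding.

import requests

REQUIRED_FIELDS = {"id", "updated_at", "amount"}  # hypothetical schema

def fetch_page(base_url: str, page: int) -> list[dict]:
    """Pull one page of records from a (hypothetical) REST endpoint."""
    resp = requests.get(base_url, params={"page": page}, timeout=30)
    resp.raise_for_status()
    return resp.json()["results"]  # assumed response envelope

def validate(rows: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split rows into loadable records and quarantined rejects."""
    good, bad = [], []
    for row in rows:
        # Reject rows with missing required fields or a null amount.
        if REQUIRED_FIELDS - row.keys() or row.get("amount") is None:
            bad.append(row)  # quarantine for inspection rather than drop silently
        else:
            good.append(row)
    return good, bad

def load(cursor, rows: list[dict]) -> None:
    """Batch-insert validated rows into a (hypothetical) raw landing table."""
    cursor.executemany(
        "INSERT INTO raw.api_orders (id, updated_at, amount) "
        "VALUES (%(id)s, %(updated_at)s, %(amount)s)",
        rows,
    )

Quarantining rejects instead of dropping them keeps the replicated data auditable, which is the point of the validation bullet above.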


Snowflake Data Warehouse Management

  • Administer and optimize the Snowflake data warehouse environment, including performance tuning and cost optimization.
  • Analyze and optimize query performance and data models.
  • Apply Snowflake scaling best practices, including when to scale vertically (larger warehouse sizes) versus horizontally (multi-cluster warehouses); a sketch of both levers follows this list.
  • Monitor and improve warehouse performance to ensure pipelines run efficiently with minimal runtime and cost.
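
For reference, the two scaling levers mentioned above map to ordinary ALTER WAREHOUSE statements. This is a minimal sketch with a hypothetical warehouse name; connection handling is omitted and the cursor is assumed to come from snowflake-connector-python.

# Vertical scaling: a larger warehouse size gives each cluster more compute,
# speeding up individual heavy queries at a higher per-second credit rate.
VERTICAL_SCALE_UP = "ALTER WAREHOUSE ingest_wh SET WAREHOUSE_SIZE = 'LARGE'"

# Horizontal scaling: a multi-cluster warehouse adds clusters under
# concurrency pressure and suspends them when idle, containing cost.
HORIZONTAL_SCALE_OUT = (
    "ALTER WAREHOUSE ingest_wh SET "
    "MIN_CLUSTER_COUNT = 1 MAX_CLUSTER_COUNT = 4 SCALING_POLICY = 'STANDARD'"
)

def scale_for_backfill(cursor) -> None:
    # One-off heavy load: scale up, run the job, then scale back down.
    cursor.execute(VERTICAL_SCALE_UP)

def scale_for_concurrency(cursor) -> None:
    # Many small concurrent queries: scale out instead of up.
    cursor.execute(HORIZONTAL_SCALE_OUT)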


Data Replication & Integration

  • Build solutions that replicate data from SQL Server and other enterprise source systems into Snowflake (one common incremental pattern is sketched after this list).
  • Design ingestion frameworks to replicate data from external APIs into Snowflake using Python and Snowflake external network access.
  • Leverage ingestion tools such as Fivetran where appropriate while also developing custom Python-based pipelines when needed.
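
One common pattern for the SQL Server replication described above is a high-watermark incremental extract. This is a minimal sketch; the table and column names are hypothetical, and pyodbc is assumed as the SQL Server driver.

import pyodbc  # assumed driver; any DB-API SQL Server driver works the same way

def extract_incremental(conn: pyodbc.Connection, last_watermark) -> list:
    """Fetch only rows changed since the previous successful run."""
    cur = conn.cursor()
    cur.execute(
        "SELECT id, updated_at, amount FROM dbo.orders WHERE updated_at > ?",
        last_watermark,
    )
    return cur.fetchall()

# The caller persists max(updated_at) from each loaded batch as the next
# watermark, so every run replicates only the delta into Snowflake.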


Cloud Data Engineering

  • Build and orchestrate data workflows using AWS services, including:
      • Amazon Managed Workflows for Apache Airflow (MWAA) for pipeline orchestration (a minimal DAG sketch follows this list)
      • S3 for storage
      • CloudWatch for monitoring and debugging
  • Implement robust deployment practices and integrate pipelines with CI/CD processes.
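
As a shape reference, here is a minimal Airflow 2.x DAG of the kind MWAA schedules. The DAG id, cadence, and task bodies are placeholders, not the client's actual jobs.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract() -> None:
    ...  # placeholder: pull from source API / database

def load() -> None:
    ...  # placeholder: write validated rows into Snowflake

with DAG(
    dag_id="api_to_snowflake",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",          # placeholder cadence
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task    # linear dependency: extract, then load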


Engineering Best Practices

  • Write clean, maintainable, well-documented Python code with clear functions, comments, and modular design.
  • Ensure pipelines are scalable, performant, and fault tolerant.
  • Document all pipeline designs, ingestion logic, and system integrations.


What You’ll Need to Have


  • 10+ years of professional experience in Data Engineering
  • Strong Python development expertise with the ability to design, build, and optimize ingestion frameworks
  • Advanced SQL skills, including writing efficient and performant queries
  • Deep experience working with Snowflake, including administration, performance tuning, and cost optimization
  • Hands-on experience building scalable data ingestion pipelines


Data Platform & Integration

  • Experience with Informatica IDMC for data ingestion
  • Experience using Fivetran or similar ingestion tools (must be able to explain hands-on usage)
  • Experience replicating data from SQL Server or similar relational systems into Snowflake
  • Experience ingesting data from external APIs using Python


Cloud & Pipeline Orchestration

  • Experience working within AWS cloud environments
  • Experience using AWS-managed Airflow (MWAA) for scheduling and orchestration
  • Familiarity with AWS CloudWatch for monitoring and debugging
  • Experience with CI/CD pipeline implementation and deployment strategies


Data Engineering Expertise

  • Strong understanding of ETL workflows and pipeline design
  • Expertise in performance tuning for pipelines and data warehouses
  • Knowledge of Snowflake scaling strategies (vertical vs. horizontal scaling)
  • Experience implementing data quality validation within ingestion pipelines
  • Experience documenting engineering solutions and maintaining technical documentation


Preferred / Nice to Have

  • Hands-on experience with Fivetran ingestion architecture
  • Experience with AWS Lambda or other AWS services
  • Experience improving existing ingestion frameworks and platform architecture


Important: The environment is AWS-based. This role does not use Azure or Hadoop-based ecosystems; those backgrounds alone will not translate directly to this role.


Pay Range: $84-$89/hr+


Questions about any of our jobs? Email us at recruiting@lhtservices.com 

 

View all of our open jobs here: jobs.lhtservices.com 
