GHX

Data Engineer III

Posted 4/20/2026 · Full-time · Remote (🇺🇸 United States) · Mid-Level / Senior · 💰 $98,000 - $130,500 per year

Tech Stack

Tools & technologies
Airflow, Amazon Redshift, AWS, Cloud, ETL, PySpark, Python, SQL, Tableau

About the role

Key responsibilities & impact
  • Design and build ETL/ELT pipelines and dimensional data models using dbt, Airflow, Python, PySpark, and AWS services (S3, Glue, Lambda)
  • Create executive dashboards and perform complex SQL analysis to drive strategic decisions (Tableau, Sigma, SAP BO)
  • Optimize SQL queries, data structures, and warehouse resources for performance and cost efficiency at scale (Snowflake, Redshift)
  • Partner with stakeholders to translate business requirements into self-service analytics capabilities
  • Implement infrastructure-as-code (CloudFormation/CDK) and contribute to CI/CD automation
  • Troubleshoot production issues across data pipelines, queries, and APIs; perform root cause analysis
  • Provide technical mentorship, establish development standards, and drive data engineering best practices
  • Document solutions and communicate designs to cross-functional teams in Confluence/JIRA
  • Apply data governance, security, and monitoring/alerting best practices
  • Leverage AI-assisted development tools (GitHub Copilot, Claude, etc.) to increase productivity and accelerate delivery

Requirements

What you’ll need
  • Bachelor's degree in Computer Science, Data Science, Mathematics, Statistics, or related quantitative field
  • 6+ years of data engineering experience building BI applications and data platforms
  • 5+ years of ETL/ELT development in cloud data warehouses (AWS, Snowflake, Redshift, or similar)
  • 4+ years creating dashboards and visualizations in enterprise BI tools (Tableau, Sigma, SAP BO, Power BI, or Looker)
  • Proven track record delivering production data solutions in Agile environments (Scrum/Kanban)
  • Expert-level SQL and Python proficiency
  • Proven experience designing dimensional data models (star/snowflake schema) optimized for analytics
  • Demonstrated SQL optimization and performance tuning in large-scale production environments
  • Strong business acumen with ability to translate technical solutions into business value
  • Excellent communication skills for presenting to executive and non-technical audiences
  • Deep analytical and troubleshooting skills with root cause analysis capabilities
  • Must be located in the United States (remote position)

Benefits

Comp & perks
  • Health, vision, and dental insurance
  • Accident and life insurance
  • 401(k) matching
  • Paid time off
  • Education reimbursement

ATS Keywords

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
ETL, ELT, Python, PySpark, SQL, dimensional data models, data governance, infrastructure-as-code, CI/CD automation, data optimization
Soft Skills
Technical mentorship, communication skills, analytical skills, troubleshooting skills, business acumen, stakeholder collaboration, problem-solving, presentation skills, cross-functional teamwork, root cause analysis
Certifications
Bachelor's degree in Computer Science, Bachelor's degree in Data Science, Bachelor's degree in Mathematics, Bachelor's degree in Statistics