Angi

Senior Data Engineer

Full-time

Location: 🇺🇸 United States • Colorado, New York

Salary

💰 $110,000 - $185,000 per year

Job Level

Senior

Tech Stack

Airflow • Amazon Redshift • Apache • AWS • Cloud • Docker • ETL • Go • Kotlin • Python • SQL • Terraform • Vault

About the role

  • Design and implement scalable data models and pipelines using industry-standard techniques (Kimball, Data Vault, etc.).
  • Build, optimize, and maintain ELT workflows leveraging dbt and cloud-based ETL/ELT platforms (Fivetran, Stitch, etc.).
  • Develop orchestration workflows using tools like Airflow or Dagster.
  • Write efficient SQL and develop in programming languages such as Python, Go, or Kotlin.
  • Implement DevOps practices including containerization (Docker), CI/CD pipelines (GitLab), and infrastructure-as-code (Terraform, Helm, EKS).
  • Work with AWS services (S3, IAM, AWS CLI, Glue) and data lake/lakehouse frameworks (Apache Iceberg, Glue Catalog).
  • Integrate with and optimize cloud data warehouse solutions (Snowflake, Redshift, Trino).
  • Develop and maintain integrations with BI tools such as Looker, including LookML development and embedded analytics.
  • Collaborate closely with stakeholders to understand requirements and deliver solutions that meet business needs.
  • Operate autonomously, creating work items, prioritizing tasks, and tracking progress independently.
  • Communicate effectively with both technical and non-technical audiences.
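To make the modeling responsibility above concrete, here is a minimal sketch of the Kimball-style star schema work the role describes. All table and column names are invented for illustration, and Python's built-in sqlite3 stands in for a warehouse such as Redshift or Snowflake (where this DDL would typically live in dbt models):

```python
import sqlite3

# Hypothetical star schema: one fact table joined to a date dimension.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key   INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20240113
    full_date  TEXT NOT NULL,
    is_weekend INTEGER NOT NULL
);
CREATE TABLE fact_bookings (
    booking_id INTEGER PRIMARY KEY,
    date_key   INTEGER NOT NULL REFERENCES dim_date(date_key),
    amount_usd REAL NOT NULL
);
""")
conn.execute("INSERT INTO dim_date VALUES (20240113, '2024-01-13', 1)")
conn.executemany(
    "INSERT INTO fact_bookings VALUES (?, ?, ?)",
    [(1, 20240113, 120.0), (2, 20240113, 80.0)],
)

# A typical dimensional query: aggregate facts by a dimension attribute.
total = conn.execute("""
    SELECT SUM(f.amount_usd)
    FROM fact_bookings f
    JOIN dim_date d USING (date_key)
    WHERE d.is_weekend = 1
""").fetchone()[0]
print(total)  # 200.0
```

The surrogate key on the dimension is what lets facts stay narrow while slicing happens on dimension attributes, which is the core of the Kimball approach the posting references.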

Requirements

  • Proven experience in data modeling and warehouse design.
  • Deep expertise with dbt and modern ELT patterns.
  • Strong SQL skills.
  • Programming experience in Python, Go, or Kotlin.
  • Experience with ELT platforms (Fivetran, Stitch) and cloud ETL/ELT workflows.
  • Experience developing orchestration workflows using Airflow or Dagster.
  • Experience with containerized deployments (Docker), CI/CD pipelines (GitLab), and infrastructure-as-code (Terraform, Helm, EKS).
  • Proficiency with AWS services (S3, IAM, AWS CLI, Glue).
  • Hands-on experience with modern data lake/lakehouse architectures (Apache Iceberg, Glue Catalog).
  • Experience integrating and optimizing cloud data warehouses (Snowflake, Redshift, Trino).
  • Knowledge of developing within BI tools such as Looker, including LookML and embedded analytics.
  • Strong problem-solving skills and ability to work independently.
  • Excellent communication and stakeholder management skills.
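As a rough sketch of the "modern ELT patterns" the requirements call for, the snippet below shows an idempotent incremental upsert, which is the pattern a dbt incremental model compiles to (a MERGE on Snowflake or Redshift). The table and data are invented, and stdlib sqlite3 again stands in for the warehouse:

```python
import sqlite3

# Hypothetical incremental-load target table.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE dim_customer (
    customer_id INTEGER PRIMARY KEY,
    email       TEXT NOT NULL,
    updated_at  TEXT NOT NULL
)""")

def upsert(rows):
    # Re-running the same batch leaves the table unchanged: the load is
    # idempotent, so a failed orchestrator run can simply be retried.
    conn.executemany("""
        INSERT INTO dim_customer (customer_id, email, updated_at)
        VALUES (?, ?, ?)
        ON CONFLICT(customer_id) DO UPDATE SET
            email      = excluded.email,
            updated_at = excluded.updated_at
    """, rows)

upsert([(1, "a@example.com", "2024-01-01"),
        (2, "b@example.com", "2024-01-01")])
upsert([(1, "a.new@example.com", "2024-02-01")])  # late-arriving update

rows = conn.execute(
    "SELECT customer_id, email FROM dim_customer ORDER BY customer_id"
).fetchall()
print(rows)  # [(1, 'a.new@example.com'), (2, 'b@example.com')]
```

Keying the upsert on a natural or surrogate ID rather than truncating and reloading is what keeps incremental models cheap on large fact tables.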