Tech Stack
Airflow, AWS, ETL, Postgres, Python, SQL
About the role
- Design and maintain ETL pipelines from AWS Postgres databases to the Snowflake data warehouse (a minimal pipeline sketch follows this list).
- Audit and optimize SQL queries, drawing on PostgreSQL internals and Snowflake performance tuning, for scalability and efficiency.
- Implement monitoring, alerting, and automation using tools like Datadog, CloudWatch, and Python scripting.
- Support data modeling, schema design, and orchestration with platforms such as Airflow and dbt.
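For a concrete flavor of the first responsibility, here is a minimal sketch of such a pipeline using Airflow's TaskFlow API. The connection IDs, schedule, and table names are hypothetical placeholders rather than details from this posting; a production pipeline would add incremental bookkeeping, retries, and staged loading.

```python
# A minimal sketch of a daily Postgres -> Snowflake load using Airflow's
# TaskFlow API. The connection IDs ("postgres_app_db", "snowflake_dw")
# and the "orders" tables are hypothetical placeholders.
from datetime import datetime

from airflow.decorators import dag, task
from airflow.providers.postgres.hooks.postgres import PostgresHook
from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def postgres_to_snowflake():
    @task
    def extract() -> list:
        # Pull the last day's rows from the AWS Postgres source.
        hook = PostgresHook(postgres_conn_id="postgres_app_db")
        return hook.get_records(
            "SELECT id, amount, created_at FROM orders "
            "WHERE created_at >= CURRENT_DATE - INTERVAL '1 day'"
        )

    @task
    def load(rows: list) -> None:
        # Land the extracted batch in the Snowflake warehouse.
        hook = SnowflakeHook(snowflake_conn_id="snowflake_dw")
        hook.insert_rows(table="raw.orders", rows=rows)

    load(extract())


postgres_to_snowflake()
```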
Requirements
- Hands-on experience with ETL processes from AWS Postgres to Snowflake.
- Strong ability to audit and optimize SQL queries for performance and scalability (see the query-plan sketch after this list).
- Deep understanding of PostgreSQL internals (indexes, query planner, VACUUM, etc.).
- Expertise in Snowflake performance tuning and warehouse management.
- Experience with monitoring and alerting tools (e.g., Datadog, CloudWatch).
- Skills in data modeling and schema design.
- Proficiency in Python or another scripting language for workflow automation.
- Familiarity with orchestration tools such as Airflow and dbt.
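As an illustration of the query-auditing skill above, a minimal sketch that pulls a live execution plan from Postgres with psycopg2. The DSN and the orders table are made-up examples; the plan output is where sequential scans, missing indexes, and bloat from deferred VACUUMs typically show up.

```python
# A minimal sketch of auditing a slow query's execution plan with
# psycopg2. The DSN and the "orders" table are hypothetical examples.
import psycopg2

with psycopg2.connect("dbname=app user=auditor") as conn:
    with conn.cursor() as cur:
        # EXPLAIN (ANALYZE, BUFFERS) runs the query and reports actual
        # timings plus shared-buffer hits vs. disk reads, which is the
        # usual evidence for missing indexes or table bloat.
        cur.execute(
            "EXPLAIN (ANALYZE, BUFFERS) "
            "SELECT * FROM orders WHERE customer_id = %s",
            (42,),
        )
        for (line,) in cur.fetchall():
            print(line)
```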
Benefits
- 100% Remote Work: Enjoy the freedom to work from the location that helps you thrive. All it takes is a laptop and a reliable internet connection.
- Highly Competitive USD Pay: Earn excellent compensation in USD that goes beyond typical market offerings.
- Paid Time Off: We value your well-being. Our paid time off policies ensure you have the chance to unwind and recharge when needed.
- Work with Autonomy: Enjoy the freedom to manage your time as long as the work gets done. Focus on results, not the clock.
- Work with Top American Companies: Grow your expertise working on innovative, high-impact projects with industry-leading U.S. companies.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
ETL processes, SQL optimization, PostgreSQL internals, Snowflake performance tuning, data modeling, schema design, Python scripting, workflow automation, orchestration tools, dbt