Tech Stack
Airflow, AWS, Python, SQL
About the role
- Design and optimize ingestion, storage, and transformation pipelines using Python, SQL, Snowflake, and Snowpark (see the sketch after this list)
- Build and enhance real-time data pipelines with AWS Lambda and Snowpipe
- Collaborate with data scientists and analysts to deliver business-ready datasets
- Create internal and external data views (logical, materialized, and secure)
- Test and evaluate new features in Snowflake, Airflow, and AWS for proofs of concept
- Participate in peer code reviews and provide constructive feedback
- Maintain clean, efficient, and scalable code
- Join sprint ceremonies (planning, stand-ups, reviews, retrospectives)
- Ensure alignment with stakeholders on deliverables and timelines
- Coordinate deployments with PMs and IT
- Ensure smooth release cycles with minimal downtime
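To give a feel for the Snowpark-style transformation work in the first bullet, here is a minimal sketch: read a raw table, aggregate it, and persist a business-ready table. The table names (RAW.ORDERS, ANALYTICS.DAILY_REVENUE), column names, and connection placeholders are hypothetical illustrations, not taken from this posting.

```python
# Minimal Snowpark sketch (hypothetical names throughout): raw orders
# rolled up into a daily-revenue table that analysts can query directly.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

# In practice credentials come from a secrets manager, not literals.
session = Session.builder.configs({
    "account": "<account>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}).create()

# Transform: keep completed orders, aggregate revenue per day.
daily_revenue = (
    session.table("RAW.ORDERS")
    .filter(col("STATUS") == "COMPLETE")
    .group_by(col("ORDER_DATE"))
    .agg(sum_(col("AMOUNT")).alias("REVENUE"))
)

# Persist the result as a business-ready table.
daily_revenue.write.mode("overwrite").save_as_table("ANALYTICS.DAILY_REVENUE")
```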
Requirements
- 5+ years of experience as a Data Engineer
- Strong expertise with Snowflake and orchestration tools like Airflow (see the DAG sketch after this list)
- Advanced Python and SQL programming skills
- Hands-on experience with AWS services: Lambda, S3, and real-time data streaming
- Solid understanding of ELT pipelines, data modeling, and efficient storage strategies
- Great communication and collaboration skills
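For context on the orchestration expertise mentioned above, a minimal Airflow DAG sketch follows. It assumes Airflow 2.x with the apache-airflow-providers-snowflake package installed; the DAG id, connection id, schedule, and SQL are hypothetical placeholders.

```python
# Minimal Airflow 2.x DAG sketch: one in-warehouse ELT step on a daily
# schedule. Connection id and table names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="daily_revenue_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # use schedule_interval on Airflow < 2.4
    catchup=False,
) as dag:
    # ELT: the transformation runs inside Snowflake, not in Airflow.
    build_daily_revenue = SnowflakeOperator(
        task_id="build_daily_revenue",
        snowflake_conn_id="snowflake_default",
        sql="""
            CREATE OR REPLACE TABLE ANALYTICS.DAILY_REVENUE AS
            SELECT ORDER_DATE, SUM(AMOUNT) AS REVENUE
            FROM RAW.ORDERS
            WHERE STATUS = 'COMPLETE'
            GROUP BY ORDER_DATE;
        """,
    )
```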
Benefits
- Contractor agreement with payment in USD
- 100% remote work
- Argentina's public holidays
- English classes
- Referral program
- Access to learning platforms
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
Python, SQL, Snowflake, Snowpark, AWS Lambda, Snowpipe, Airflow, ELT pipelines, data modeling, efficient storage strategies
Soft skills
communication, collaboration, constructive feedback, stakeholder alignment, teamwork