Tech Stack
Airflow, Apache, AWS, Python, SQL, Terraform
About the role
- Develop and maintain efficient and secure data pipelines using Snowflake;
- Design and optimize data structures to support analytics and reporting;
- Integrate data from various sources (APIs, relational databases, files, etc.);
- Implement best practices for data governance, security, and data quality;
- Automate data ingestion, transformation, and loading processes;
- Collaborate with business stakeholders;
- Provide maintenance and technical support, ensuring data availability and reliability;
- Monitor Snowflake platform performance and costs;
- Document data processes and architecture.
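As a rough illustration of the pipeline work listed above, the sketch below shows a minimal Apache Airflow DAG that pulls records from a hypothetical API and loads them into a Snowflake table with Python. Every name in it (the endpoint, connection parameters, and target table) is a placeholder assumption, not a detail from this posting.

```python
# Minimal sketch: daily ingestion from an API into Snowflake with Airflow.
# All endpoints, credentials, and table names below are hypothetical.
from datetime import datetime

import requests
import snowflake.connector
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def orders_to_snowflake():
    @task
    def extract() -> list[dict]:
        # Hypothetical source API; a real DAG would use an Airflow connection.
        resp = requests.get("https://api.example.com/orders", timeout=30)
        resp.raise_for_status()
        return resp.json()

    @task
    def load(rows: list[dict]) -> None:
        # Credentials would normally come from an Airflow connection or secrets backend.
        conn = snowflake.connector.connect(
            account="my_account", user="etl_user", password="***",
            warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
        )
        with conn.cursor() as cur:
            cur.executemany(
                "INSERT INTO RAW.ORDERS (ID, AMOUNT) VALUES (%(id)s, %(amount)s)",
                rows,
            )
        conn.close()

    load(extract())


orders_to_snowflake()
```

In practice the in-warehouse transformations mentioned below would typically be handled by dbt, with Airflow orchestrating the ingestion and dbt runs.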
Requirements
- Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field;
- Experience with Snowflake (modeling, ingestion, optimization);
- Advanced knowledge of SQL and data engineering best practices;
- Experience with Python for data manipulation;
- Experience with data engineering tools (e.g., dbt, Apache Airflow);
- Familiarity with cloud environments (preferably AWS);
- Knowledge of code versioning (Git) and agile methodologies;
- Clear communication skills and the ability to collaborate with multidisciplinary teams.
Preferred
- Knowledge of Terraform (IaC) to manage and automate cloud resources;
- Hands-on experience developing, integrating, and orchestrating autonomous agents and AI applications.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
data pipelines, data structures, data governance, data quality, SQL, Python, dbt, Apache Airflow, Terraform, AI applications
Soft skills
clear communication, collaboration
Certifications
Bachelor's degree in Computer Science, Bachelor's degree in Engineering, Bachelor's degree in Information Systems