
Data Engineer – Snowflake, DBT, Airflow
Miratech
Full-time
Location Type: Remote
Location: India
About the role
- Design and develop robust ETL/ELT pipelines using Python, Airflow, and DBT
- Build and optimize Snowflake data models for performance, scalability, and cost efficiency
- Implement ingestion pipelines for internal and external financial datasets (Market, Securities, Pricing, ESG, Ratings)
- Develop DBT models using best practices (sources, staging, marts)
- Ensure consistent data transformations aligned with EDP standards
- Create and manage Airflow DAGs with dependency handling, retries, and alerting (see the sketch after this list)
- Monitor pipeline health and troubleshoot failures proactively
- Implement validation, reconciliation, and completeness checks
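The Airflow responsibilities above translate roughly into a DAG like the one below. This is a minimal sketch assuming Airflow 2.x; the pipeline name, pricing dataset, and alerting callback are hypothetical placeholders, not the team's actual setup.

```python
# Minimal Airflow 2.x sketch: dependency handling, retries, alerting,
# and a completeness check. All names are hypothetical illustrations.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def notify_on_failure(context):
    # Hypothetical alert hook; in practice this might post to Slack or
    # PagerDuty using the task details carried in `context`.
    print(f"Task {context['task_instance'].task_id} failed")


def extract_pricing_data():
    # Placeholder extract step: pull a raw pricing file into staging.
    pass


def run_completeness_check():
    # Placeholder validation: compare source and target row counts and
    # fail the task (triggering retries and alerting) on a mismatch.
    source_rows, target_rows = 100, 100  # hypothetical counts
    if source_rows != target_rows:
        raise ValueError("row count mismatch between source and target")


default_args = {
    "owner": "data-engineering",
    "retries": 3,                              # automatic retries
    "retry_delay": timedelta(minutes=5),
    "on_failure_callback": notify_on_failure,  # alerting
}

with DAG(
    dag_id="pricing_ingestion",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    extract = PythonOperator(
        task_id="extract_pricing",
        python_callable=extract_pricing_data,
    )
    validate = PythonOperator(
        task_id="completeness_check",
        python_callable=run_completeness_check,
    )

    # Dependency handling: validation runs only after extraction.
    extract >> validate
```

Retries and the failure callback live in default_args so every task inherits them, while the >> operator expresses the task ordering the posting mentions.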
Requirements
- 3–5 years of experience in Data Engineering
- Strong hands-on experience with Python, Snowflake, DBT, and Apache Airflow
- Solid understanding of SQL, data warehousing concepts, and ELT patterns (a short ELT sketch follows this list)
- Experience working with large-scale, structured financial datasets
- Familiarity with Git, CI/CD, and agile delivery practices
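The ELT pattern named above can be sketched in a few lines of Python: raw data is loaded first, and transformation happens inside the warehouse afterwards. This sketch assumes the snowflake-connector-python package; the credentials, stage, and table names are hypothetical.

```python
# Minimal ELT sketch with snowflake-connector-python: load first,
# transform in-warehouse. All identifiers here are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical credentials
    user="etl_user",
    password="...",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="RAW",
)

cur = conn.cursor()
try:
    # Load: copy staged files into a raw landing table as-is.
    cur.execute(
        "COPY INTO raw_prices FROM @prices_stage FILE_FORMAT = (TYPE = CSV)"
    )

    # Transform: build a cleaned staging table inside Snowflake,
    # mirroring the sources -> staging -> marts layering used in DBT.
    cur.execute("""
        CREATE OR REPLACE TABLE staging.stg_prices AS
        SELECT
            security_id,
            TRY_TO_DECIMAL(price, 18, 4) AS price,
            TO_DATE(price_date)          AS price_date
        FROM raw.raw_prices
        WHERE security_id IS NOT NULL
    """)
finally:
    cur.close()
    conn.close()
```

In production this transform layer would normally live in DBT models rather than inline SQL, but the division of labor is the same: the warehouse, not the pipeline, does the heavy transformation work.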
Benefits
- Health insurance
- Relocation program
- Professional development opportunities
- Certification programs
- Mentorship programs
- Internal mobility opportunities
- Internship opportunities
- Flexible work arrangements
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
Python, ETL, ELT, DBT, Snowflake, SQL, data warehousing, data modeling, Apache Airflow, data transformation
Soft Skills
troubleshooting, problem-solving, communication, collaboration, attention to detail