
Lead Data Engineer, Snowflake/dbt
Tide
Full-time
Location Type: Hybrid
Location: Delhi NCR • 🇮🇳 India
Job Level
Senior
Tech Stack
Airflow, Apache, AWS, ETL, Python, SQL
About the role
- Build and run data pipelines and services to support business functions, reports and dashboards
- Develop end-to-end ETL/ELT pipelines in collaboration with Data Analysts
- Design, develop and implement scalable automated processes for data extraction, processing and analysis in a Data Mesh architecture
- Mentor and support junior engineers on the team
- Serve as a go-to expert for data technologies and solutions
- Troubleshoot and resolve technical, architecture and design challenges
- Continuously improve how data pipelines are delivered by the department
- Translate business requirements into technical requirements (entities, dbt models, timings, tests, reports)
- Own delivery of data models and reports end-to-end
- Perform exploratory data analysis to identify data quality issues and implement preventive tests
- Ensure data feeds are optimised and available at the required times, using Change Data Capture (CDC)/delta loading (see the sketch after this list)
- Discover, transform, test, deploy and document data sources
- Apply and champion data warehouse governance: data quality, testing, coding best practices and peer review
- Build Looker dashboards for business use cases where required
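As an illustration of the CDC/delta-loading work described above, a minimal sketch of an incremental dbt model on Snowflake might look like the following; the model, source, and column names are hypothetical, not Tide's actual schema:

```sql
-- models/staging/stg_transactions.sql  (hypothetical names throughout)
-- Incremental materialisation: only new/changed rows are processed per run.
{{ config(
    materialized='incremental',
    unique_key='transaction_id'
) }}

select
    transaction_id,
    account_id,
    amount,
    updated_at
from {{ source('core_banking', 'transactions') }}

{% if is_incremental() %}
  -- On incremental runs, pull only rows changed since the last load.
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```

Paired with dbt tests (e.g. `unique` and `not_null` on `transaction_id`), this is the kind of model-plus-tests delivery the responsibilities above describe.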
Requirements
- 7+ years of development experience with Snowflake or a similar data warehouse technology
- Working experience with dbt and modern data stack technologies (Snowflake, Apache Airflow, Fivetran, AWS, git, Looker)
- Experience with agile processes, such as Scrum
- Extensive experience writing advanced SQL and performance-tuning it (see the example after this list)
- Experience with data ingestion techniques using custom or SaaS tools such as Fivetran
- Experience in data modelling and ability to optimise existing/new data models
- Experience in data mining, data warehouse solutions, and ETL with large-scale complex datasets
- Experience architecting analytical databases (Data Mesh architecture) is an advantage
- Experience working in agile cross-functional delivery teams
- High development standards: code quality, code reviews, unit testing, CI/CD
- Strong technical documentation skills and the ability to communicate clearly with business users
- Business-level English and good communication skills
- Basic understanding of various systems across the AWS platform (good to have)
- Preferably experience in a digitally native company, ideally fintech
- Experience with Python and governance tools (e.g., Atlan, Alation, Collibra) or data quality tools (e.g., Great Expectations, Monte Carlo, Soda) is an advantage
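As a concrete, hypothetical example of the advanced SQL and performance tuning called for above, Snowflake's QUALIFY clause lets you filter on a window function directly, avoiding the self-join or wrapping subquery usually needed to deduplicate a large table; the table and column names below are illustrative only:

```sql
-- Latest record per account, deduplicated in a single pass.
-- QUALIFY filters on the window function without an extra subquery.
select
    account_id,
    balance,
    updated_at
from analytics.finance.account_balances   -- hypothetical table
qualify row_number() over (
    partition by account_id
    order by updated_at desc
) = 1;
```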
Benefits
- Competitive salary
- Self & Family Health Insurance
- Term & Life Insurance
- OPD Benefits
- Mental wellbeing through Plumm
- Learning & Development Budget
- WFH Setup allowance
- 15 days of Privilege leaves
- 12 days of Casual leaves
- 12 days of Sick leaves
- 3 paid days off for volunteering or L&D activities
- Stock Options
- Flexible working arrangements (Working Out of Office policy): work remotely from home or anywhere in your assigned Indian state, or from a different country or Indian state for 90 days of the year
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
ETL, ELT, SQL, data modelling, data mining, data warehouse governance, data quality, performance tuning, data ingestion, data mesh architecture
Soft skills
mentoring, troubleshooting, communication, technical documentation, collaboration, agile processes, business requirements translation, problem-solving, peer review, continuous improvement