Building and running the data pipelines and services that support business functions, reports and dashboards
Developing end-to-end ETL/ELT pipelines in collaboration with Data Analysts from business functions
Designing, developing, and implementing scalable, automated processes for data extraction, processing, and analysis in a Data Mesh architecture
Mentoring junior engineers on the team
Acting as a go-to expert for data technologies and solutions
Providing on-the-ground troubleshooting and diagnosis of architecture and design challenges
Troubleshooting and resolving technical issues as they arise
Improving how data pipelines are delivered by the department
Translating business requirements into technical requirements, including entities to be modelled, dbt models, timings, tests and reports (a minimal dbt sketch follows this list)
Owning the delivery of data models and reports end to end
Performing exploratory data analysis to identify data quality issues and implementing tests to prevent them
Working with Data Analysts to ensure data feeds are optimised and available at the required times (including Change Data Capture (CDC) and other delta-loading approaches)
Discovering, transforming, testing, deploying and documenting data sources
Applying, helping define, and championing data warehouse governance: data quality, testing, coding best practices, and peer review
Building Looker dashboards for business use cases where required
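To make the dbt and delta-loading responsibilities above concrete, here is a minimal sketch of an incremental dbt model of the kind this role would own; the model, table, and column names are hypothetical:

```sql
-- models/marts/fct_payments.sql  (hypothetical model, table, and column names)
-- A minimal incremental dbt model: on each run after the first, only rows
-- newer than what is already loaded are processed, a common delta-loading
-- pattern for CDC-style feeds.
{{ config(materialized='incremental', unique_key='payment_id') }}

select
    payment_id,
    customer_id,
    amount,
    updated_at
from {{ ref('stg_payments') }}

{% if is_incremental() %}
  -- on incremental runs, pick up only new or changed rows
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```

In practice such a model would be paired with dbt schema tests (for example, unique and not_null on payment_id) and documentation, matching the testing and documentation duties listed above.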
Requirements
7+ years of development experience with Snowflake or a similar data warehouse technology
Working experience with dbt and other modern data stack technologies, such as Snowflake, Apache Airflow, Fivetran, AWS, Git, and Looker
Experience with agile processes, such as Scrum
Extensive experience in writing advanced SQL statements and performance-tuning them (see the short example after this list)
Experience with data ingestion techniques using custom or SaaS tools such as Fivetran
Experience in data modelling and the ability to optimise existing and new data models
Experience in data mining, data warehouse solutions, and ETL, and in using databases with large-scale, complex datasets in a business environment
Experience architecting analytical databases (in a Data Mesh architecture) is an added advantage
Experience working in an agile, cross-functional delivery team
High development standards, especially for code quality, code reviews, unit testing, continuous integration and deployment
Strong technical documentation skills and the ability to be clear and precise with business users
Business-level English and good communication skills
Basic understanding of various systems across the AWS platform (good to have)
Preferably, experience working in a digitally native company, ideally a fintech
Experience with Python, a governance tool (e.g. Atlan, Alation, Collibra), or a data quality tool (e.g. Great Expectations, Monte Carlo, Soda) is an added advantage
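As an illustration of the advanced SQL and performance-tuning skills above, here is a short Snowflake sketch that deduplicates a hypothetical CDC feed; the table and column names are assumptions, not part of the role's actual schema:

```sql
-- Deduplicate a raw CDC feed, keeping only the latest version of each payment.
-- raw.payments_cdc, payment_id, and updated_at are hypothetical names.
select *
from raw.payments_cdc
-- restrict the scan to recent data so Snowflake can prune micro-partitions,
-- a typical first performance-tuning step
where updated_at >= dateadd(day, -7, current_date)
qualify row_number() over (
    partition by payment_id
    order by updated_at desc
) = 1;
```

Snowflake's QUALIFY clause filters on the window-function result without a wrapping subquery, which keeps the statement compact while leaving the full query visible to the optimiser.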
Benefits
Make work, work for you! We are embracing new ways of working and support flexible working arrangements. With our Working Out of Office (WOO) policy, our colleagues can work remotely from home or anywhere in their assigned Indian state; additionally, you can work from a different country or Indian state for 90 days of the year.
Competitive salary
Self & Family Health Insurance
Term & Life Insurance
OPD Benefits
Mental wellbeing through Plumm
Learning & Development Budget
WFH Setup allowance
15 days of Privilege leave
12 days of Casual leave
12 days of Sick leave
3 paid days off for volunteering or L&D activities
Stock Options