
Senior Data Engineer
Forbes
Full-time
Location Type: Remote
Location: Remote • 🇮🇳 India
Job Level
Senior
Tech Stack
Airflow, AWS, BigQuery, Google Cloud Platform, Kafka, Python, Spark, SQL
About the role
- Build, maintain, and optimize data pipelines using Spark, Kafka, Airflow, and Python
- Orchestrate workflows across GCP (GCS, BigQuery, Composer) and AWS-based systems (see the DAG sketch after this list)
- Model data using dbt, with an emphasis on quality, reuse, and documentation
- Ingest, clean, and normalize data from third-party sources such as Google Ads, Meta, Taboola, Outbrain, and Google Analytics
- Write high-performance SQL and support analytics and reporting teams in self-serve data access
- Monitor and improve data quality, lineage, and governance across critical workflows
- Collaborate with engineers, analysts, and business partners across the US, UK, and India
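To make the orchestration work concrete, here is a minimal sketch of the kind of daily ingestion DAG this role would own, assuming Airflow 2.x on Cloud Composer with the Google provider installed. The DAG name, bucket, dataset, and table IDs are illustrative placeholders, not Forbes specifics.

```python
# Minimal sketch: load a day's worth of raw ad data from GCS into BigQuery,
# then run a lightweight data-quality gate. Names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)


def validate_row_counts(**context):
    # Placeholder data-quality gate; a real check might compare loaded
    # row counts or freshness against source-side expectations.
    pass


with DAG(
    dag_id="ads_ingest_daily",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    load_to_bq = GCSToBigQueryOperator(
        task_id="load_gcs_to_bigquery",
        bucket="example-raw-bucket",                    # placeholder bucket
        source_objects=["google_ads/{{ ds }}/*.json"],  # partitioned by run date
        destination_project_dataset_table="analytics.raw_google_ads",
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
    )

    quality_check = PythonOperator(
        task_id="validate_row_counts",
        python_callable=validate_row_counts,
    )

    load_to_bq >> quality_check
```

In practice the quality gate would sit upstream of the dbt models mentioned above, so bad loads never reach reporting tables.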
Requirements
- 4+ years of data engineering experience, ideally in a global, distributed team
- Strong Python development skills and hands-on experience
- Expert in SQL for data transformation, analysis, and debugging
- Deep knowledge of Airflow and orchestration best practices
- Proficient in dbt (data modeling, testing, release workflows)
- Experience with GCP (BigQuery, GCS, Composer); AWS familiarity is a plus (see the query sketch after this list)
- Strong grasp of data governance, observability, and privacy standards
- Excellent written and verbal communication skills
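As a small illustration of the SQL-on-BigQuery side of these requirements, the sketch below runs a parameterized aggregation through the google-cloud-bigquery Python client. The dataset, table, and column names are hypothetical.

```python
# Minimal sketch: parameterized transformation query against BigQuery
# from Python, assuming Application Default Credentials are configured.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT campaign_id,
           SUM(spend)  AS total_spend,
           SUM(clicks) AS total_clicks
    FROM `analytics.raw_google_ads`
    WHERE _PARTITIONDATE = @run_date
    GROUP BY campaign_id
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("run_date", "DATE", "2024-01-01"),
    ]
)

for row in client.query(sql, job_config=job_config).result():
    print(row.campaign_id, row.total_spend, row.total_clicks)
```

Query parameters keep run dates out of string-formatted SQL, which is the kind of debugging-friendly habit the SQL requirement implies.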
Benefits
- Day off on the 3rd Friday of every month (one long weekend each month)
- Monthly Wellness Reimbursement Program to promote health and well-being
- Monthly Office Commutation Reimbursement Program
- Paid maternity and paternity leave
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
Python, SQL, Spark, Kafka, Airflow, dbt, data modeling, data transformation, data governance, data quality
Soft skills
communication, collaboration, problem-solving, attention to detail, organizational skills