Kobie

Senior Data Engineer

Full-time

Location Type: Hybrid

Location: Bengaluru, India

About the role

  • Lead the design and implementation of scalable, reusable ETL/ELT frameworks and data pipelines across multiple client environments.
  • Define and enforce engineering best practices for data modeling, quality assurance, and performance optimization.
  • Mentor and support Data Engineers, conducting peer reviews and ensuring alignment with architectural standards.
  • Oversee the operational stability of production data pipelines, including monitoring, troubleshooting, and capacity planning.
  • Partner with data architects and product teams to design end-to-end data solutions aligned to business goals.
  • Conduct source data analysis and profiling to guide schema and model design decisions.
  • Implement and optimize data models including slowly changing dimensions, transactional, accumulating snapshot, and periodic snapshot fact tables.
  • Collaborate with platform engineers to evolve event-driven and streaming architectures supporting real-time analytics.
  • Drive automation initiatives for testing, auditing, and metadata capture to ensure transparency and reproducibility.

Requirements

  • 5–7 years of Data Engineering experience, with at least 3 years operating in Snowflake.
  • Proven success designing and deploying large-scale data pipelines and dimensional models using modern cloud data platforms.
  • Deep understanding of Kimball and Data Vault methodologies, and how they integrate within Lakehouse and event-driven architectures.
  • Advanced proficiency with SQL and Python for pipeline development, data validation, and automation.
  • Experience leading data engineering projects through the full SDLC, from requirements through deployment and monitoring.
  • Expertise in integrating batch and streaming data sources, including Kafka, APIs, and message queues.
  • Experience with orchestration and workflow management tools such as Airflow, Matillion, or dbt.
  • Cloud experience with Azure and/or AWS, including cost optimization and data security best practices.
  • Strong communication and leadership skills to guide cross-functional teams and present technical concepts clearly to non-technical stakeholders.
  • Value-add: familiarity with vector and graph databases, Apache NiFi, and modern data sharing or clean-room technologies.
Benefits

  • Highly competitive benefits
  • Comprehensive health coverage
  • Well-being perks that support our teammates and their dependents
  • Flexible time off that prioritizes work-life balance

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
ETL, ELT, data modeling, SQL, Python, data validation, data pipelines, dimensional models, batch data integration, streaming data integration
Soft Skills
leadership, communication, mentoring, collaboration, troubleshooting, capacity planning, peer reviews, guiding cross-functional teams, presenting technical concepts, supporting alignment with architectural standards