HUMAN&HUMAN

Senior Data Engineer

Full-time

Location Type: Hybrid

Location: Tel Aviv • 🇮🇱 Israel

Job Level

Senior

Tech Stack

Airflow • Apache • AWS • Cloud • Docker • ETL • Kafka • Python • Spark • SQL

About the role

  • Lead the Design and Development of Scalable Data Pipelines: You'll be instrumental in architecting, building, and optimizing data ingestion and processing pipelines from the ground up. This includes designing robust ETL/ELT processes to handle massive datasets, potentially reaching dozens of terabytes daily.
  • Collaborate Closely with Data Consumers: You'll partner with data analysts, data scientists, and engineering teams to understand their data requirements, translate them into technical solutions, and optimize data delivery for their specific use cases.
  • Champion Data Modeling and Data Quality: You'll apply your deep understanding of data modeling principles to design efficient and well-structured data schemas that facilitate insightful analysis and reporting. You'll also be a strong advocate for data quality and implement processes to ensure data accuracy and integrity.
  • Build and Maintain a Modern Data Platform: You'll contribute to the development and evolution of our cloud-based data platform, ensuring its reliability, performance, and scalability to meet the growing needs of our data consumers.
  • Drive Innovation in Data Processing: You'll continuously explore and implement new technologies and methodologies to enhance our data platform's capabilities and efficiency.

Requirements

  • 5+ Years of Data Engineering: building and maintaining high-scale data pipelines in a cloud environment.
  • SQL Expert: You go beyond writing queries. You understand database internals, cost-based optimizations, and how to debug bottlenecks in distributed SQL engines.
  • Coding: You write clean, modular, and maintainable Python code that goes beyond ad-hoc scripting. You are comfortable using version control (Git) for collaboration, participating in code reviews, and have a working understanding of containerization (e.g., Docker) and CI/CD basics.
  • Modern Data Stack: Deep experience with modern transformation and orchestration tools (dbt, Airflow) and an understanding of the "why" behind their architecture.
  • Business-Aligned Mindset: You treat data as a product. You have a strong interest in understanding the business domain and actively partner with stakeholders to ensure your technical designs drive meaningful impact for the product and the company.
  • Clear Communicator: You can articulate technical challenges and architectural decisions clearly, ensuring that both technical peers and non-technical partners understand the rationale behind your technical choices.
  • Bonus Points: Experience with stream processing technologies (e.g., Apache Kafka, Flink, Spark Streaming, AWS Kinesis); familiarity with infrastructure-as-code (IaC) principles and tools; and experience with data governance and data quality frameworks.

Benefits

  • Well-being and learning stipends
  • Flexible work options
  • Dedicated time off

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
data engineering • ETL • ELT • SQL • Python • dbt • Airflow • containerization • stream processing • data governance
Soft skills
collaboration • data quality advocacy • business-aligned mindset • clear communication • problem-solving