Tech Stack
Airflow, AWS, Cloud, ETL, Python, Terraform
About the role
- Design, build, and maintain data pipelines and ETL processes
- Develop automations to reduce friction and speed up workflows
- Integrate data into everyday team processes, helping colleagues bridge cultural and knowledge gaps
- Collaborate closely with stakeholders to gather requirements, flesh out plans, and deliver end-to-end solutions
- Make strategic decisions about architecture, tools, and integrations (buy vs. build)
- Ensure stability and scalability while fostering adoption across the company
- Play a pivotal role in unlocking data for the team and embedding it in day-to-day processes and culture
- Blend of hands-on technical work (70%) and process-oriented, customer-facing collaboration (30%)
Requirements
- Strong background in data engineering (pipelines, ETL, data modeling)
- Experience with ML/NLP, ideally with exposure to AI/LLM applications
- Python proficiency and comfort working with APIs and system integrations
- Familiarity with tools such as Snowflake, Pipedream, and AWS
- Understanding of process modeling and how data connects to business needs
- Excellent communication skills in English, with the ability to translate technical solutions into business value
- Cloud-native data processing experience with AWS services (S3, etc.)
- Workflow orchestration experience (Dagster, Airflow, Prefect, or similar)
- Experience with deploying and scaling ML/LLM workloads
- Ability to design reproducible environments with Infrastructure as Code (Terraform or similar)
- Strong understanding of observability, monitoring, and restart-resilient workflows
- Skilled at building data accessibility layers for non-technical teams (dashboards, APIs, Airtable/Snowflake integrations)
- Berlin-based (a plus)
- German language skills (written and spoken) (a plus)
- Familiarity with investment environments (VC, PE, or similar) (a plus)
- Exposure to start-ups or other high-growth environments (a plus)