Ledgy

Data Engineer

Full-time

Location Type: Remote

Location: Remote • 🇩🇪 Germany

Job Level

Junior, Mid-Level

Tech Stack

BigQuery, Cloud, ETL, Google Cloud Platform, Pandas, Python, SQL

About the role

  • Manage and optimize data infrastructure and ETL pipelines using Fivetran, Airbyte, and Google Cloud Platform, ensuring reliable data flow from multiple sources into our analytics ecosystem
  • Develop, test, and maintain DBT models that transform raw data into analytics-ready datasets following best practices
  • Create and manage LookML models in Looker to enable self-service analytics for stakeholders across the company
  • Drive continuous improvement of our data engineering practices, tooling, and infrastructure as a key member of the Operations team

Requirements

  • 2-3+ years of experience building production data pipelines and analytics infrastructure using DBT, SQL, and Python (Pandas, etc.)
  • Experience implementing and managing ETL/ELT tools such as Fivetran or Airbyte
  • Ideally, hands-on experience with GCP (BigQuery)
  • Proficiency in Looker, including LookML development
  • Strong plus if you have experience using n8n or similar automation tools
  • Experience with SaaS data sources (HubSpot, Stripe, Vitally, Intercom)
  • Familiarity with AI-powered development tools (Cursor, DBT Copilot) and a strong interest in leveraging cutting-edge tools to improve workflow
  • Strong problem-solving skills and ability to debug complex data issues
  • Excellent communication skills, with the ability to explain technical concepts to non-technical stakeholders

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
DBT, SQL, Python, ETL, ELT, LookML, data pipelines, data infrastructure, analytics, data transformation
Soft skills
problem-solving, communication