AND Digital

Lead Data Engineer

Full-time

Location Type: Hybrid

Location: London, United Kingdom


About the role

  • Lead New Data Pipeline Delivery: Implement new Python-based ETL pipelines that move data from Luna’s microservice DynamoDB tables into the RDS transactional database (Postgres OLTP) and on to the RDS data warehouse (Postgres OLAP).
  • Platform Optimization: Apply expertise from previous data setups to optimize the new Postgres OLTP environment for transactional performance and the OLAP environment for analytics effectiveness, while keeping ongoing costs under control in both.
  • Standard Setting: Establish and enforce DataOps standards, including version control (Git), automated CI/CD deployment, and schema migration management using tools like Liquibase or Drizzle ORM.
  • Hands-on Coaching: Actively mentor team members, elevating their skills in Python, Git, and engineering workflows through code reviews and workshops.
  • Code Quality: Conduct rigorous code reviews, providing detailed, educational feedback that explains best practices and clearly outlines required changes to elevate team standards.
  • AI Engineering patterns: Support the definition and implementation of AI tooling and practices to augment the Data engineering team.
  • Initiative Planning: Break down larger reporting initiatives into manageable epics and stories within our Agile framework (a mix of Scrum for larger work items, and Kanban for smaller continuous flow items).
  • Stakeholder Management: Network with business domains to capture requirements and provide strategic guidance on the data platform migration plan.
  • Product Partnership: Support the Data Product Owner by providing the technical context necessary to prioritize the team's backlog effectively.
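
To give a flavour of the pipeline work described above, here is a minimal, hypothetical sketch of one ETL step: flattening a DynamoDB low-level item (with type descriptors such as `{"S": ...}` and `{"N": ...}`) into a plain Python dict ready for insertion into a Postgres OLTP table. All table, column, and function names are illustrative, not part of the actual Luna codebase.

```python
# Hypothetical ETL transform sketch: DynamoDB low-level item -> flat row.
# Attribute names and the overall shape are illustrative assumptions.
from decimal import Decimal


def deserialize_attr(attr: dict):
    """Convert one DynamoDB attribute value into a native Python value."""
    (type_tag, value), = attr.items()  # each attribute is a one-pair dict
    if type_tag == "S":
        return value
    if type_tag == "N":
        return Decimal(value)  # DynamoDB numbers arrive as strings
    if type_tag == "BOOL":
        return value
    if type_tag == "NULL":
        return None
    if type_tag == "L":
        return [deserialize_attr(v) for v in value]
    if type_tag == "M":
        return {k: deserialize_attr(v) for k, v in value.items()}
    raise ValueError(f"Unhandled DynamoDB type: {type_tag}")


def item_to_row(item: dict) -> dict:
    """Flatten a whole DynamoDB item into a column -> value mapping."""
    return {key: deserialize_attr(attr) for key, attr in item.items()}
```

In a real pipeline this transform would sit between a DynamoDB scan/stream reader and a batched Postgres insert, with orchestration handled by a tool such as Airflow or Prefect.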

Requirements

  • Data Engineering - 7 to 10 years’ experience building and supporting full-stack data pipelines, from source to reporting.
  • AWS Mastery - 5+ years deep experience with AWS, ideally administration and optimization of RDS / Aurora (Postgres) and Redshift.
  • Python - 5+ years expert-level Python, with specific experience working on ETL pipelines, including libraries like Pandas and orchestration tools like Airflow or Prefect.
  • PostgreSQL - 5+ years expert-level SQL and database architecture (partitioning, indexing) for both OLTP and OLAP.
  • DataOps / Tools - 3+ years expertise in Git, with solid understanding of Git branching and workflow, and schema management (e.g., Liquibase, Flyway, or Drizzle).
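
As one concrete example of the Postgres partitioning skill listed above, here is a small, hypothetical sketch that generates monthly range-partition DDL for an OLAP fact table. The table name and date bounds are illustrative assumptions, not a prescribed schema.

```python
# Hypothetical sketch: generate CREATE TABLE ... PARTITION OF DDL for one
# calendar month of a range-partitioned Postgres fact table.
from datetime import date


def monthly_partition_ddl(table: str, month: date) -> str:
    """Return DDL for the partition covering the month containing `month`."""
    start = month.replace(day=1)
    # First day of the following month is the exclusive upper bound.
    if start.month == 12:
        end = start.replace(year=start.year + 1, month=1)
    else:
        end = start.replace(month=start.month + 1)
    name = f"{table}_{start:%Y_%m}"
    return (
        f"CREATE TABLE {name} PARTITION OF {table}\n"
        f"FOR VALUES FROM ('{start}') TO ('{end}');"
    )
```

In practice, statements like this would be committed as versioned changesets in a schema migration tool such as Liquibase or Flyway rather than run ad hoc.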
Benefits
  • 25 days holiday allowance + bank holidays
  • Share scheme
  • A £1,000 flexifund to spend on a personalised list of benefits such as gym membership, the Cycle to Work scheme, and a health, dental and optical cash plan
  • 6% pension
  • PLUS many more
Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
Python, ETL, PostgreSQL, SQL, Data Engineering, DataOps, AWS, Git, Data Pipeline, AI Engineering
Soft Skills
mentoring, code reviews, stakeholder management, strategic guidance, initiative planning