Data Engineer

UBDS

full-time

Location: 🇬🇧 United Kingdom

Job Level

Mid-Level / Senior

Tech Stack

AWS, Azure, Cloud, Cyber Security, ETL, PySpark, Python, Spark, SQL, Unity, Vault

About the role

  • We are seeking a talented Data Engineer to join UBDS Group, supporting the growth of our Data and AI services by developing client data platforms and ETL/ELT pipelines
  • Develop data platform solutions, including data storage and ETL pipelines that serve data for analytics or ML
  • Develop and maintain data models, data integrations and pipeline monitoring, and support clients in implementing best practices and architectures appropriate to their needs
  • Work on greenfield projects as well as migration and modernisation projects
  • Identify opportunities for re-usable components and lead their development, maintenance and sharing
  • Collaborate with stakeholders including client teams, Cloud Architect, Scrum Master, BA, and partner technical representatives (Databricks, Microsoft, AWS) to ensure scalable, secure and compliant data solutions
  • Support technical pre-sales by advising on client requirements and the technical approach for proposals
  • Advance technology vendor partnerships, maintain relevant certifications and build relationships with product teams and solution architects
  • Contribute to marketing events and thought leadership, and present at conferences and webinars as appropriate

Requirements

  • Continuously seeks to understand and apply industry best practice and technologies relevant to the client
  • Knows their limits and when to ask for help or collaborate with other SMEs
  • Confident leading the design, development and maintenance of data platforms and ETL solutions in a highly regulated industry
  • Knowledge of all stages of the data lifecycle, and of applying DataOps principles to manage pipeline testing, deployment, integration and monitoring
  • Data modelling expertise to develop low-level designs and implement models against business requirements, using design patterns such as Inmon, Kimball and Data Vault
  • Excellent Databricks, Python, PySpark and Spark SQL knowledge, including writing, testing and quality-assuring code; knowledge of Unity Catalog best practice for governing data assets; and Azure and/or AWS experience
  • Experienced in working with sensitive data (e.g. health or financial); understands and applies security, ethics and privacy best practices
  • Consulting experience