Tech Stack
Apache, AWS, Azure, BigQuery, Cloud, ETL, Google Cloud Platform, Java, Python, Spark, SQL
About the role
- Build and operate the Data and Analytics platform for Rentokil Initial
- Define the data principles, architecture and governance for the data platform
- Deliver data quality assessments and improvement plans
- Deliver key reports and analytical insight to a wide variety of stakeholders
- Support the data agenda with platform reporting and strategy
- Develop and maintain data integration processes to ensure data quality and accuracy (a minimal sketch follows this list)
- Deliver quality data engineering solutions of low to moderate complexity, even when requirements are not fully defined
- Transform data to support data analysts and business leaders
- Design and build a scalable, extensible data architecture
- Develop and maintain data processing platforms and frameworks
- Build infrastructure and data pipelines, and productionise analytical models
- Research industry innovation and encourage continuous learning
- Design and implement data warehousing solutions to support reporting and analytics
- Identify and troubleshoot data issues and provide solutions
- Work closely with business teams to understand data needs
- Collaborate with other data engineers to ensure data consistency and integrity
- Continuously monitor and optimise data performance and scalability
- Stay up-to-date with new technologies and best practices in data engineering
- Work with the Information Security team to ensure data platform security standards are met
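To give a flavour of the hands-on work described above, here is a minimal, purely illustrative sketch of a batch integration step with a basic data-quality gate, written with PySpark (part of the advertised stack). All paths, table names and columns are hypothetical, not Rentokil Initial's actual pipelines.

```python
# Illustrative only: a minimal PySpark batch job with a basic data-quality
# check. All paths, tables and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("customer_ingest").getOrCreate()

# Extract: read a raw daily extract (hypothetical bucket and layout).
raw = spark.read.option("header", True).csv("gs://raw-bucket/customers/2024-01-01/")

# Quality gate: reject the batch if the business key is missing or duplicated.
null_keys = raw.filter(F.col("customer_id").isNull()).count()
dupe_keys = raw.groupBy("customer_id").count().filter(F.col("count") > 1).count()
if null_keys or dupe_keys:
    raise ValueError(f"Quality check failed: {null_keys} null keys, {dupe_keys} duplicates")

# Transform and load: normalise a field and write to the curated zone.
curated = raw.withColumn("country_code", F.upper(F.trim(F.col("country_code"))))
curated.write.mode("overwrite").parquet("gs://curated-bucket/customers/")
```

In practice a check like this would sit inside an orchestrated, scheduled framework rather than a standalone script, but the shape of the work is the same.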
Requirements
- A good understanding of, and track record in, delivering complex data solutions using Agile methods such as Scrum and SAFe
- Excellent communication skills: able to talk to people across IT and the business, and to stakeholders at all levels of the company
- Hands-on approach; proactive and self-starting
- Desire to deliver the best quality and meet the client’s needs
- Advanced experience in designing and creating data models
- Strong SQL skills for data interrogation and transformation, a robust understanding of relational data, and the ability to manipulate fact data along multiple dimensions (see the BigQuery sketch after this list)
- Experience deploying solutions in the cloud (Azure, AWS or GCP), ideally GCP
- Broad business intelligence knowledge
- Experience using ETL tools to deliver data integration for batch and streaming use cases
- Willingness to self-study and learn new skills to handle any upcoming tasks
- Hands-on experience of modern software CI/CD techniques to automate the build and deployment of data solutions
- Use of source code version control (e.g. Git, Bitbucket)
- Desirable: experience with real-time processing frameworks (e.g., Apache Spark or Apache Beam); see the streaming sketch after this list
- Desirable: experience working with BigQuery, Java and/or Python
- Experience working with and adhering to Information Security standards, support procedures and incident response
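As an illustration of the SQL and BigQuery skills listed above, the following sketch interrogates a hypothetical star schema (a sales fact table joined to branch and service dimensions) through BigQuery's Python client. The dataset, tables and field names are invented for the example.

```python
# Illustrative only: slicing fact data along multiple dimensions with
# BigQuery's Python client. The star schema (fact_sales, dim_branch,
# dim_service) is hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

sql = """
    SELECT
        b.region,
        s.service_line,
        DATE_TRUNC(f.visit_date, MONTH) AS month,
        SUM(f.revenue) AS total_revenue
    FROM analytics.fact_sales AS f
    JOIN analytics.dim_branch AS b ON f.branch_key = b.branch_key
    JOIN analytics.dim_service AS s ON f.service_key = s.service_key
    GROUP BY b.region, s.service_line, month
    ORDER BY month, total_revenue DESC
"""

for row in client.query(sql).result():
    print(row.region, row.service_line, row.month, row.total_revenue)
```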
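And for the desirable streaming experience, a minimal Apache Beam sketch: events are read from Pub/Sub, parsed, windowed into one-minute fixed windows and appended to a BigQuery table. The topic, table and schema are again hypothetical.

```python
# Illustrative only: a minimal Apache Beam streaming pipeline. The Pub/Sub
# topic, BigQuery table and schema are hypothetical.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromPubSub(topic="projects/my-project/topics/visits")
        | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "Window" >> beam.WindowInto(beam.window.FixedWindows(60))  # 1-minute windows
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:analytics.visit_events",
            schema="visit_id:STRING,branch:STRING,ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```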