
Data Engineer
BT Group
Employment Type: Full-time
Location Type: Office
Location: Bengaluru, India
About the role
- Deliver the development, testing, and deployment of ETL/ELT data pipelines for ingesting, transforming, and curating data, ensuring adherence to data security and privacy standards (a minimal pipeline sketch follows this list).
- Contribute to designing and enhancing data warehouse architecture and reusable engineering patterns for processing and storing large-scale structured and unstructured datasets.
- Support major engineering initiatives by shaping technical approaches and contributing innovative solutions to complex or ambiguous problems.
- Produce business-relevant, high-quality data outputs that strengthen the organisation’s data assets and overall data engineering capability.
- Apply engineering best practices and Agile delivery methodologies to ensure consistent, high-quality, and iterative delivery of data engineering solutions.
- Execute initiatives to build and improve data and analytics infrastructure, enabling scalable and reusable data engineering capabilities.
- Identify and implement improvements to data engineering processes by analysing data, optimising workflows, and driving efficiency gains.
- Conduct thorough quality assurance and testing of data pipelines, models, and engineering frameworks to ensure accuracy, reliability, and fitness for business use.
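For context, the pipeline responsibility above might look something like this minimal Python ELT sketch using the BigQuery client library. All project, bucket, and table names are hypothetical placeholders rather than BT Group systems, and a production pipeline would typically be orchestrated with Cloud Composer rather than run as a standalone script.

```python
"""Minimal GCS-to-BigQuery ELT sketch; all names are hypothetical placeholders."""
from google.cloud import bigquery

PROJECT = "example-project"                    # hypothetical project id
RAW_URI = "gs://example-bucket/orders/*.csv"   # hypothetical source files
STAGING_TABLE = f"{PROJECT}.staging.orders"
CURATED_TABLE = f"{PROJECT}.curated.orders"


def load_raw(client: bigquery.Client) -> None:
    """Ingest: load raw CSV files from Cloud Storage into a staging table."""
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )
    client.load_table_from_uri(RAW_URI, STAGING_TABLE, job_config=job_config).result()


def curate(client: bigquery.Client) -> None:
    """Transform/curate inside BigQuery (the "T" in ELT): dedupe and type-cast."""
    sql = f"""
        CREATE OR REPLACE TABLE `{CURATED_TABLE}` AS
        SELECT DISTINCT order_id,
               CAST(amount AS NUMERIC) AS amount,
               DATE(order_date) AS order_date
        FROM `{STAGING_TABLE}`
        WHERE order_id IS NOT NULL
    """
    client.query(sql).result()


if __name__ == "__main__":
    client = bigquery.Client(project=PROJECT)
    load_raw(client)
    curate(client)
```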
Requirements
- Hands-on experience with key Google Cloud Platform services such as BigQuery, Cloud Storage, Cloud Composer, Dataflow, and Pub/Sub.
- Proficiency in SQL, including writing complex queries, optimising performance, handling large datasets, and implementing data validation and quality checks across analytical platforms (see the validation sketch after this list).
- Experience in designing and implementing data warehouse solutions across different platforms, including schema design, ETL/ELT patterns, and performance tuning.
- Experience in Python for building data pipelines and automation using cloud services, APIs, and SDKs, with an understanding of modular coding, error handling, and reusable components.
- Experience with Terraform for infrastructure as code; willingness and ability to learn quickly are essential.
- Understanding of DevOps principles, with hands-on experience in version control, CI/CD pipelines, automation, and defect management.
- Mandatory experience with GitLab for code management and deployments.
- Working knowledge of Agile delivery practices and tools such as Jira and Confluence, with the ability to collaborate in fast-paced, iterative development cycles.
- Strong analytical and communication skills, with the ability to interpret complex data and present insights in clear, user-friendly formats for technical and non-technical stakeholders.
- Experience managing deadlines and delivering high-quality work under pressure in dynamic and fast-moving environments.
- Experience creating processes and documentation, including developing standard operating procedures, architectural diagrams, and technical documentation.
- Ability to work effectively with multiple stakeholders, collaborating across engineering, product, business, and operations teams to drive aligned outcomes.
- Experience in data quality frameworks, including implementing data validation, monitoring, and continuous improvement techniques.
- Familiarity with incident and service management tools such as ServiceNow for handling production issues, change management, and operational workflows.
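As a small illustration of the data validation and quality-check requirements above, the sketch below runs a null-rate check against a BigQuery table from Python. The table, column, and zero-null threshold are invented for the example; a real data quality framework would manage such rules declaratively and route failures to monitoring rather than an assert.

```python
"""Minimal data-quality check sketch; table, column, and threshold are hypothetical."""
from google.cloud import bigquery


def null_rate(client: bigquery.Client, table: str, column: str) -> float:
    """Return the fraction of rows in `table` where `column` is NULL."""
    sql = f"""
        SELECT SAFE_DIVIDE(COUNTIF({column} IS NULL), COUNT(*)) AS null_rate
        FROM `{table}`
    """
    row = next(iter(client.query(sql).result()))
    return row.null_rate or 0.0  # SAFE_DIVIDE yields NULL on an empty table


if __name__ == "__main__":
    client = bigquery.Client(project="example-project")  # hypothetical project
    rate = null_rate(client, "example-project.curated.orders", "order_id")
    assert rate == 0.0, f"order_id null rate {rate:.2%} breaches the threshold"
    print("null-rate check passed")
```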
Benefits
- Flexible working hours
- Professional development opportunities
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
ETL, ELT, SQL, Python, Terraform, Data Warehouse, data validation, data quality frameworks, DevOps, Agile
Soft Skills
analytical skills, communication skills, collaboration, problem-solving, time management, documentation, stakeholder management, adaptability, attention to detail, creativity