CI&T

Data DevOps Engineer, Infrastructure

full-time

Origin: 🇵🇭 Philippines

Salary

💰 ₱120,000 - ₱150,000 per month

Job Level

Mid-Level / Senior

Tech Stack

Apache, AWS, Azure, Cloud, ETL, Google Cloud Platform, Python, Shell Scripting, Terraform

About the role

  • Design, implement, and maintain automated deployment pipelines for Microsoft Fabric, Azure Data Factory, and related Azure services.
  • Develop and manage Infrastructure as Code (IaC) using Terraform (or equivalent) to provision, configure, and manage Azure resources.
  • Collaborate with architects, developers, and data engineering teams to integrate DevOps practices into end-to-end data platform delivery.
  • Optimise CI/CD workflows for data pipelines, semantic models, and associated infrastructure.
  • Implement environment configuration management and governance to ensure compliance with enterprise standards, security, and performance requirements.
  • Monitor, troubleshoot, and improve deployment processes, proactively identifying and resolving issues affecting delivery or stability.
  • Maintain technical documentation for DevOps processes, IaC configurations, and deployment standards.
  • Support cross-team collaboration by providing guidance on branching strategies, release management, and deployment best practices.
  • Ensure operational readiness of deployed solutions through post-deployment validation, performance checks, and integration testing (a minimal validation sketch follows this list).
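
For a concrete flavour of that last point, here is a minimal Python sketch of a post-deployment smoke test that triggers an Azure Data Factory pipeline and waits for a terminal status. It assumes the azure-identity and azure-mgmt-datafactory packages, and every resource identifier in it is a hypothetical placeholder, not a CI&T value:

```python
"""Minimal post-deployment smoke test: trigger an ADF pipeline and poll its status.

All resource identifiers below are hypothetical placeholders.
"""
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder
RESOURCE_GROUP = "rg-data-platform"                       # placeholder
FACTORY_NAME = "adf-data-platform"                        # placeholder
PIPELINE_NAME = "pl_smoke_test"                           # placeholder


def run_smoke_test() -> str:
    """Trigger the pipeline and block until it reaches a terminal state."""
    client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
    run = client.pipelines.create_run(RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME)
    while True:
        status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
        if status not in ("Queued", "InProgress", "Canceling"):
            return status  # terminal: Succeeded, Failed, or Cancelled
        time.sleep(30)  # poll interval; tune to the pipeline's typical duration


if __name__ == "__main__":
    final_status = run_smoke_test()
    print(f"Pipeline finished with status: {final_status}")
    if final_status != "Succeeded":
        raise SystemExit(1)  # fail the CI/CD stage on any non-success outcome
```

A step like this would typically run as the final stage of a release pipeline, gating promotion to the next environment.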

Requirements

  • Proven experience in data platform DevOps, including automation of deployment pipelines for Azure-based data solutions.
  • Strong proficiency in Azure DevOps for CI/CD pipeline creation, management, and optimisation.
  • Hands-on expertise with Infrastructure as Code (IaC) tools such as Terraform for provisioning, configuring, and managing Azure resources.
  • Experience deploying and managing components in Microsoft Fabric, Azure Data Factory, and related Azure data services.
  • Familiarity with modern data file formats such as Delta Tables, Apache Iceberg, and Parquet.
  • Proficiency in Python and shell scripting to support automation, deployment, and monitoring processes (a small illustrative validation script follows this list).
  • Working knowledge of data modelling principles and common practices in enterprise data platforms.
  • Experience with cloud data services within the Microsoft Intelligent Data Platform ecosystem; exposure to AWS or GCP is an advantage.
  • Understanding of data pipeline orchestration patterns and best practices for automated deployment of ETL/ELT workflows.
  • Solid grasp of data governance, security, and compliance considerations for cloud-hosted data solutions.
  • Knowledge of design patterns, clean architecture, and coding best practices for maintainable deployment scripts and automation workflows.
  • Familiarity with unit, integration, and end-to-end testing strategies within a DevOps framework to ensure reliable platform delivery.
  • Mentorship: ability to mentor junior team members and share knowledge.
  • Consulting mindset: adaptability across diverse situations and changing priorities.
  • Strong communication skills and ability to present ideas effectively to clients and teams.
  • Interest in AI technologies such as Generative AI.
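
To make the scripting and file-format requirements above concrete, here is a small sketch of the kind of validation script they imply: it reads a Parquet output with pyarrow and fails the build on a missing column or an empty dataset. The dataset path and expected columns are assumptions for illustration only:

```python
"""Illustrative check of a Parquet output, e.g. as a CI/CD quality gate."""
import sys

import pyarrow.parquet as pq

# Hypothetical output path and expected schema -- adjust per pipeline.
DATASET_PATH = "output/sales.parquet"
EXPECTED_COLUMNS = {"order_id", "customer_id", "amount", "order_date"}


def main() -> int:
    table = pq.read_table(DATASET_PATH)

    # Schema check: every expected column must be present.
    missing = EXPECTED_COLUMNS - set(table.schema.names)
    if missing:
        print(f"Missing columns: {sorted(missing)}")
        return 1

    # Sanity check: an empty output usually signals an upstream failure.
    if table.num_rows == 0:
        print("Dataset is empty")
        return 1

    print(f"OK: {table.num_rows} rows, schema validated")
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

The same pattern can be pointed at Delta tables, for instance by loading them through the deltalake package before running the checks.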