
Data Platform DevOps
Siam Makro Public Company Limited
Full-time
Location Type: Hybrid
Location: Bangkok • 🇹🇭 Thailand
Job Level
Junior
Tech Stack
Airflow, Apache, AWS, Azure, Cloud, Docker, ETL, Google Cloud Platform, Kafka, Kubernetes, Open Source, Python, Spark, SQL, Terraform, Unity
About the role
**Job Summary:**

We are seeking an experienced Data Platform DevOps engineer to drive the implementation and maintenance of our data infrastructure and pipelines. The ideal candidate will have a strong background in data engineering and DevOps principles, with a passion for automation, quality, and governance. You will act as a bridge between technical and business teams, ensuring our data platform is not only efficient and scalable but also reliable and compliant. This role is crucial for enabling data-driven decisions and accelerating the development life-cycle for all data initiatives.

**Key Responsibilities:**
- - Design & Implement Data Platforms: Design, develop, and maintain robust, scalable data pipelines and ETL processes, with a focus on automation and operational excellence.
- - Ensure Data Quality and Governance: Implement automated data validation, quality checks, and monitoring systems to ensure data accuracy, consistency, and reliability.
- - Manage CI/CD for Data: Own and optimize the CI/CD pipelines for data engineering workflows, including automated testing and deployment of data transformations and schema changes.
- - Architect & Implement IaC: Use Infrastructure as Code (IaC) with Terraform to manage data infrastructure across various cloud platforms (Azure, AWS, GCP).
- - Performance & Optimization: Proactively monitor and optimize query performance, data storage, and resource utilization to manage costs and enhance efficiency.
- - Collaborate with Stakeholders: Manage communication with technical and business teams to understand requirements, assess technical and business impact, and deliver effective data solutions.
- - Strategic Design: Possess the ability to see the big picture in architectural design, conduct thorough risk assessments, and plan for future scalability and growth.
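
To make the data-quality responsibility concrete, here is a minimal sketch in Python of the kind of automated validation a DataOps engineer might wire into a pipeline. The table, column names, and rules are hypothetical illustrations, not details of Makro's actual platform.

```python
import pandas as pd

def validate_orders(df: pd.DataFrame) -> list[str]:
    """Run basic data-quality checks and return a list of failure messages."""
    failures = []

    # Completeness: key columns must not contain nulls.
    for col in ("order_id", "amount", "order_date"):  # hypothetical columns
        nulls = int(df[col].isnull().sum())
        if nulls:
            failures.append(f"{col}: {nulls} null value(s)")

    # Uniqueness: the primary key must not repeat.
    if df["order_id"].duplicated().any():
        failures.append("order_id: duplicate keys found")

    # Validity: order amounts must be non-negative.
    if (df["amount"] < 0).any():
        failures.append("amount: negative values found")

    return failures

if __name__ == "__main__":
    sample = pd.DataFrame({
        "order_id": [1, 2, 2],
        "amount": [100.0, -5.0, 30.0],
        "order_date": ["2024-01-01", None, "2024-01-03"],
    })
    for failure in validate_orders(sample):
        print("FAILED:", failure)
```

In practice, a check like this would run as a task in the orchestration layer and gate downstream deployments in CI/CD.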
Requirements
- Experience: 1-3 years of experience in data engineering, data warehousing, and ETL processes, with a significant portion of that time focused on DataOps or a similar operational role.
- Platform Expertise: Strong experience with data platforms such as Databricks and exposure to multiple cloud environments (Azure, AWS, or GCP).
- Data Processing: Extensive experience with Apache Spark for large-scale data processing.
- Orchestration: Experience with data orchestration tools such as Azure Data Factory (ADF) or Apache Airflow (a minimal DAG sketch follows this list).
- CI/CD & Version Control: Solid knowledge of version control (Git) and experience with CI/CD pipelines (GitLab CI/CD, GitHub Actions).
- IaC: Hands-on experience with Terraform.
- Programming: Strong programming skills in Python and advanced proficiency in SQL.
- Soft Skills: Strong stakeholder management, communication, and collaboration skills; the ability to explain complex technical concepts to non-technical audiences is a must.
- Problem-Solving: Strong problem-solving skills with the ability to analyze technical challenges and their business impact.
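
As a sketch of the orchestration requirement above, a minimal Apache Airflow DAG might look like the following. The DAG id, schedule, and task bodies are illustrative placeholders, and the `schedule` argument assumes Airflow 2.4+.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull raw data from a source system.
    print("extracting")

def transform():
    # Placeholder: clean and reshape the extracted data.
    print("transforming")

with DAG(
    dag_id="example_daily_etl",       # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task    # run extract before transform
```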
**Preferred Qualifications:**
- Data Modeling: Experience with data modeling tools and methodologies, specifically dbt (data build tool).
- AI & ML: Experience with AI-related technologies such as Retrieval-Augmented Generation (RAG) and frameworks such as LangChain.
- Data Observability: Hands-on experience with data quality and observability tools such as Great Expectations, Monte Carlo, or Soda Core.
- Data Governance: Familiarity with data governance principles, compliance requirements, and data catalogs (e.g., Unity Catalog).
- Streaming Technologies: Experience with stream-processing technologies such as Kafka or Flink (a minimal consumer sketch follows this list).
- Containerization: Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).
- Open Source: Contributions to open-source projects or relevant certifications.
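
As a rough sketch of the streaming item above, a minimal Kafka consumer using the kafka-python client could look like this; the topic name and broker address are hypothetical placeholders.

```python
from kafka import KafkaConsumer  # pip install kafka-python

# The topic and broker below are hypothetical placeholders.
consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",                    # start from the oldest retained message
    value_deserializer=lambda raw: raw.decode("utf-8"),
)

# Iterating over the consumer blocks and yields messages as they arrive.
for message in consumer:
    print(f"{message.topic}[{message.partition}]@{message.offset}: {message.value}")
```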
Benefits
- Health Insurance – At Lotus's, we care about your health! Group insurance from a top insurance company is included in your benefits, covering OPD, IPD, and emergency OPD.
- Provident Fund – Lotus's cares about your long-term plans! We offer a 3% provident fund contribution.
- Year-end Bonus – We offer variable, performance-based bonuses for our employees.
- Attractive Vacation Days – Enjoy our generous annual leave: the minimum is 16 days!
- No Overtime – We work 5 days a week and set our own goals and deadlines.
- Free Car Parking – No more stress or extra cost if you drive to work; we offer free parking for our employees.
- Best Culture
  - Clear focus
  - Diverse workplace (our members are from around the world!)
  - Non-hierarchical and agile environment
  - Growth opportunities and a career path
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
data engineering, ETL processes, data warehousing, Apache Spark, Python, SQL, Terraform, data modeling, data orchestration, data quality
Soft skills
stakeholder management, communication, collaboration, problem-solving, technical articulation