
Senior DevOps, Data
qode.world
full-time
Location Type: Hybrid
Location: Ho Chi Minh City • 🇻🇳 Vietnam
Job Level
Senior
Tech Stack
Airflow, Ansible, AWS, Chef, DynamoDB, EC2, Jenkins, Puppet, Python, Spark, Terraform
About the role
- Manage, capacity plan, and operate workloads on EC2 clusters via Databricks/EMR to ensure efficient and reliable data processing
- Collaborate with stakeholders to design and implement a Data Mesh architecture across multiple closely related but separate enterprise entities
- Utilize Infrastructure as Code (IaC) tools such as CloudFormation or Terraform to define and manage data platform user access to data and compute resources
- Implement role-based access control (RBAC) mechanisms using IaC templates to enforce least-privilege principles and ensure secure access to data and compute resources (a minimal sketch follows this list)
- Collaborate with cross-functional teams to design, implement, and optimize data pipelines and workflows
- Utilize distributed engines such as Spark to process and analyze large volumes of data efficiently when required
- Develop and maintain operational best practices for Spark and other data processing and warehousing tools to ensure system stability and performance
- Implement and manage storage technologies to efficiently store and retrieve data as per business requirements
- Troubleshoot and resolve platform-related issues in a timely manner to minimize downtime and disruptions
- Stay updated on emerging technologies and industry trends to continuously enhance the data platform infrastructure
- Document processes, configurations, and changes to ensure comprehensive system documentation
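As an illustration of the RBAC-via-IaC responsibility above, here is a minimal sketch using the AWS CDK for Python, which synthesizes to CloudFormation; the stack, role name, and bucket ARNs are hypothetical placeholders, not details from this posting.

# Minimal sketch: least-privilege, role-based access defined as code with the AWS CDK (Python).
# All names and ARNs below are hypothetical placeholders.
import aws_cdk as cdk
from aws_cdk import Stack, aws_iam as iam
from constructs import Construct


class DataPlatformAccessStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Role assumed within this account; it has no permissions beyond what is attached below.
        analyst_role = iam.Role(
            self,
            "AnalystReadOnlyRole",
            assumed_by=iam.AccountRootPrincipal(),
            description="Read-only access to the curated data prefix",
        )

        # Least privilege: read-only actions, scoped to a single bucket and prefix.
        analyst_role.add_to_policy(
            iam.PolicyStatement(
                actions=["s3:GetObject", "s3:ListBucket"],
                resources=[
                    "arn:aws:s3:::example-curated-data",
                    "arn:aws:s3:::example-curated-data/sales/*",
                ],
            )
        )


app = cdk.App()
DataPlatformAccessStack(app, "DataPlatformAccess")
app.synth()

The same access definition could equally be expressed in raw CloudFormation or Terraform; the point is that grants live in version-controlled templates rather than being clicked together in the console.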
Requirements
- Knowledge of one or more IaC tools such as AWS CloudFormation or Terraform for infrastructure provisioning
- Knowledge of source control and related concepts (GitLab/Git flow, trunk-based development, branching, etc.)
- Familiarity with at least one programming language (Python, Bash, etc.).
- Familiarity with a distributed compute engine such as Spark
- Familiarity with a data platform or data orchestration tool such as Databricks or Airflow (see the pipeline sketch after this list)
- In-depth working knowledge of and hands-on experience with AWS IAM, VPC, EC2, RDS, DynamoDB, DMS, and S3
- Experience with CI/CD tools (such as Jenkins, TeamCity, AWS CodePipeline, or CodeDeploy) or configuration management tools (such as Ansible, Chef, or Puppet)
- A DevOps mindset focused on automation and operational excellence
- Good English skills and the ability to communicate effectively with business and technical teams
- Good logical thinking and problem-solving skills
- Curiosity and a self-learning attitude
- Big plus: AWS Data Engineer Associate or AWS DevOps Engineer Professional certifications
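To make the orchestration requirement above concrete, here is a minimal sketch of an Airflow DAG that submits a Spark job once a day; the DAG id, script path, and schedule are hypothetical placeholders rather than details from this posting.

# Minimal Airflow DAG sketch: one task that submits a (hypothetical) Spark job daily.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_sales_aggregation",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+; older releases use schedule_interval
    catchup=False,
) as dag:
    # Submit a PySpark script to the cluster; {{ ds }} passes the logical run date.
    aggregate_sales = BashOperator(
        task_id="aggregate_sales",
        bash_command="spark-submit --master yarn /opt/jobs/aggregate_sales.py {{ ds }}",
    )

In a Databricks-centred setup, the same task would more likely trigger a Databricks job via the provider operator instead of calling spark-submit directly.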
Benefits
- Meal and parking allowances.
- Full benefits and salary during probation.
- Insurance as per Vietnamese labor law and premium health care for you and your family.
- SMART goals and clear career opportunities (technical seminars, conferences, and career talks) - we focus on your development.
- Values-driven, international working environment, and agile culture.
- Overseas travel opportunities for training and work.
- Internal Hackathons and company events (team building, coffee run, etc.).
- Pro-rated and performance bonuses.
- 15 days of annual leave plus 3 days of sick leave per year from the company.
- Work-life balance: 40-hour week, Monday to Friday.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
AWS CloudFormation, Terraform, Python, Bash, Spark, Databricks, Airflow, AWS IAM, AWS VPC, AWS EC2
Soft skills
communication, logical thinking, problem-solving, curiosity, self-learning
Certifications
AWS Data Engineer Associate, AWS DevOps Professional