Senior Consultant – Data Engineering

GRADION

Full-time

Location Type: Hybrid

Location: Ho Chi Minh City, Vietnam

About the role

  • Advise senior client stakeholders on modern data architecture, cloud migration strategies, and the secure, compliant use of data for business value.
  • Define clear roadmaps for clients to transition from legacy data warehouses to scalable cloud-native data platforms (e.g., Data Lakes, Lakehouses).
  • Design data pipelines and structures that enable clients to monetize data assets and derive actionable insights.
  • Conduct data maturity assessments and define target-state architectures and roadmaps.
  • Communicate complex data and AI topics in clear business language to executives and stakeholders.
  • Lead the design and implementation of robust, scalable, and cost-efficient data infrastructure (data lakehouse, data mesh, or centralized warehouse) on major hyperscaler platforms (AWS, Azure, GCP).
  • Develop and optimize high-throughput data pipelines using modern ELT/ETL tools (e.g., Spark, Flink, Kafka) to handle large data volumes and integrate disparate data sources.
  • Build, deploy, and manage production-ready ML pipelines (MLOps), including feature stores, model registries, training workflows (e.g., MLflow, Vertex AI, SageMaker, Azure ML), serving, and monitoring.
  • Explore and implement the data engineering infrastructure required to develop and deploy Small Language Models and other applied AI solutions for specific use cases within client environments.
  • Establish data governance, lineage, and compliance controls to ensure trustworthy AI and regulatory readiness.
  • Contribute to Gradion’s internal frameworks for data platform modernization and AI readiness.
  • Define and implement data governance frameworks aligned with ISO, FINMA, GDPR, or MedTech requirements.
  • Embed data security, masking, and access control into pipelines and platform layers.
  • Help clients design policy-as-code and automated compliance guardrails for data and AI systems.
  • Conduct technical and architectural assessments (data platform health checks) to identify bottlenecks, security gaps, and cost inefficiencies.

Requirements

  • 7+ years of experience in data engineering, data architecture, or ML platform engineering roles.
  • Strong background in ETL/ELT, data lake/warehouse architecture, and distributed data processing.
  • Hands-on experience with one or more cloud data ecosystems (AWS, Azure, GCP, Snowflake, Databricks, BigQuery, Synapse, etc.).
  • Proficiency in Python and SQL; experience with modern frameworks such as Spark, Airflow, dbt, Kafka.
  • Familiarity with containerization and orchestration (Docker, Kubernetes).
  • Experience designing and implementing MLOps principles and tooling (e.g., Kubeflow, MLflow, SageMaker, Azure ML), or integrating data pipelines with AI/ML workloads.
  • Understanding of data security, compliance, and governance frameworks (ISO, GDPR, SOC2).
  • Consulting mindset: able to translate technical depth into client value and communicate clearly with business stakeholders.
  • Excellent communication, presentation, and stakeholder management skills; comfortable working with both technical teams and C-level executives (English proficiency required; German a plus).
Desired
  • Experience with data monetization, data products, or real-time analytics.
  • Familiarity with LLM/SLM architectures, vector databases, or retrieval-augmented generation (RAG) patterns.
  • Experience with Databricks, Snowflake, or other modern data warehousing/lakehouse platforms.
  • Familiarity with distributed processing frameworks (e.g., Spark).
  • A Master's degree in Computer Science, Data Science, or a related quantitative field.
  • Experience in regulated industries (finance, healthcare, MedTech) or with cross-border data environments (EU/US/APAC).
  • Certifications such as AWS Data Analytics, Azure Data Engineer, or GCP Professional Data Engineer are a plus.
Benefits
  • Laptop provided
  • Community Tech activities
  • A fun & dynamic environment with freedom to be creative
  • Modern office with flexible and relaxing spaces
  • Performance bonus (up to 2 months' salary)
  • Performance reviews twice a year
  • Extra Premium Healthcare & Annual Health-check
  • 15 days of annual leave
  • Working time: Monday – Friday (9 AM – 6 PM)
Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
data engineering, data architecture, ML platform engineering, ETL, ELT, data lake architecture, data warehouse architecture, distributed data processing, Python, SQL
Soft Skills
consulting mindset, communication, presentation, stakeholder management
Certifications
AWS Data Analytics, Azure Data Engineer, GCP Professional Data Engineer