
Senior Data Engineer – MLOps, Architecture
Pmweb
Full-time
Location Type: Remote
Location: Brazil
About the role
- Develop ETL/ELT workflows using Python and managed cloud services
- Orchestrate data ingestion from multiple sources into the Data Lake/Data Warehouse
- Implement CI/CD pipelines for Machine Learning
- Automate model retraining and deployments using native cloud tools
- Provision and manage resources on AWS and GCP using Terraform
- Monitor and optimize query and processing performance
- Implement automated data quality checks
- Ensure observability of pipelines
Requirements
- Senior Data Engineer with a focus on MLOps and Architecture (Customer Intelligence & Personalization)
- Advanced proficiency in Python (Required)
- Strong experience with PySpark for distributed processing
- Familiarity with AWS services: S3, Lambda, Glue, EMR, Kinesis, and SageMaker
- Proficiency in GCP: BigQuery, Cloud Functions, Dataflow, and Vertex AI
- Experience with workflow orchestrators: Apache Airflow (or Cloud Composer/MWAA)
- ML lifecycle tools: MLflow, DVC, or the cloud providers' native tooling
- Experience deploying real-time recommendation models via scalable APIs
- Advanced SQL for Data Warehousing
- Docker (containerization of scripts and models)
- Terraform (Infrastructure as Code)
- CI/CD knowledge (GitHub Actions, GitLab CI or CodePipeline)
- English and/or Spanish
Benefits
- Health insurance
- Dental insurance
- Life insurance
- Childcare assistance
- Birthday day off
- Wellhub
- Transportation allowance
- Meal and/or food allowance
- Language learning support
- Education support
- Certification support
- 7 different work schedule options
- Home office allowance
- Structured evaluation cycle and continuous feedback
- Company profit sharing
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
Python, PySpark, SQL, Docker, Terraform, CI/CD, Machine Learning, Data Warehousing, ETL, ELT
Soft skills
communication, leadership