Salary
💰 $114,600 - $168,500 per year
Tech Stack
Airflow · Apache · AWS · Azure · Cloud · Docker · ETL · Google Cloud Platform · Java · Kubernetes · Linux · Python · PyTorch · Scikit-Learn · Spark · SQL · TensorFlow
About the role
- Build and optimize batch and streaming data pipelines processing billions of interaction events per day.
- Implement reusable model-training and evaluation workflows on Twilio’s internal ML platform.
- Deploy, monitor, and troubleshoot low-latency inference services in Kubernetes and serverless environments.
- Automate data-quality checks, feature logging, and lineage tracking to guarantee trustworthy datasets.
- Collaborate with product, data-science, and DevOps partners to translate business goals into technical roadmaps.
- Contribute to design reviews, code reviews, and documentation to elevate engineering standards.
- Instrument systems with metrics, alerts, and dashboards that uphold reliability objectives.
- Participate in on-call rotations and continually improve CI/CD pipelines.
Requirements
- 1–3 years of professional experience in software, data, or ML infrastructure engineering.
- Proficiency in Python or Java and in SQL for data manipulation and analysis.
- Hands-on experience building ETL/ELT pipelines with tools such as Apache Spark, Flink, or Airflow.
- Familiarity with training and deploying ML models using scikit-learn, TensorFlow, or PyTorch.
- Working knowledge of containerization (Docker) and at least one major cloud platform (AWS, GCP, or Azure).
- Comfort with Linux, Git, and CI/CD workflows; ability to write clean, tested, maintainable code.
- Clear verbal and written communication skills and a demonstrated sense of ownership.
- Bachelor’s degree in Computer Science or a related field, or equivalent practical experience.