Tech Stack
Airflow, Apache, Docker, Kubernetes, NoSQL, Python, SQL
About the role
- Design, implement, and maintain complex data workflows
- Collaborate with quantitative analysts and data scientists
- Develop and maintain a resilient, portable, and infrastructure-agnostic data architecture
- Implement robust CI/CD pipelines
- Seamlessly integrate data solutions with host-managed security
Requirements
- 5+ years of progressive experience in large-scale data engineering
- Expert proficiency in Python
- Deep expertise in Apache Beam
- Advanced proficiency in Apache Airflow
- Expert SQL development and optimization skills
- Hands-on experience with both relational and NoSQL databases
- Expert-level experience with Docker and orchestration tools like Kubernetes
Benefits
- 100% remote work
- Equipment allowances
- Professional development opportunities
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
Python, Apache Beam, Apache Airflow, SQL, Docker, Kubernetes, relational databases, NoSQL databases, CI/CD pipelines, data architecture
Soft skills
collaboration, communication