
About the role
- Design and maintain efficient data pipelines
- Centralize and structure data from various sources
- Prepare clean, structured data for AI models and business reporting
- Manage dependencies and data flows using orchestration patterns
- Implement validation frameworks to ensure data quality
- Collaborate to optimize infrastructure for scalability and cost-efficiency
- Apply governance frameworks and security standards to ensure compliance
Requirements
- Professional experience with Python and SQL
- Understanding of CI/CD pipelines, version control, and modular code design
- Comfortable working within cloud environments (AWS, GCP, or Azure)
- Familiarity with workflow orchestration and automating manual data tasks
- Exposure to container technologies for deploying applications
- Interest in continuous improvement through training, certifications, or knowledge sharing
Benefits
- Flexible working environment
- Continuous learning with internal tech talks and hands-on training
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
Python, SQL, CI/CD pipelines, version control, modular code design, cloud environments, AWS, GCP, Azure, container technologies
Soft Skills
collaboration, continuous improvement