
About the role
- Design, build, and scale robust ETL pipelines to support complex data workflows while ensuring high performance, reliability, and adaptability to evolving business needs.
- Automate and manage data ingestion from diverse sources (databases, APIs, cloud platforms), ensuring system resilience, fault tolerance, and failover readiness.
- Optimize data storage, processing, and retrieval layers to balance performance, scalability, and cost efficiency across the data platform.
- Modernize and enhance legacy data systems by identifying gaps, implementing architectural improvements, and aligning solutions with future business requirements.
- Lead technical excellence within the data engineering function through mentorship, code reviews, best-practice enforcement, and adoption of advanced tools and frameworks.
- Ensure end-to-end data quality, integrity, and governance by implementing validation, monitoring, testing, and compliance-focused data controls.
- Collaborate cross-functionally with analytics, product, engineering, DevOps, and business stakeholders to translate requirements into scalable data models and transformations.
- Drive a data-driven culture and long-term data strategy by enabling self-service analytics, maintaining clear documentation, leading training initiatives, and contributing to architecture roadmaps and governance policies.
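To make the responsibilities above concrete, here is a minimal, illustrative sketch of an ETL pipeline with a data-quality gate, of the kind this role would build and scale. All function and table names are hypothetical examples, not part of any actual system at the company; SQLite stands in for a real warehouse target.

```python
import sqlite3

def extract(records):
    """Extract step: in practice this would pull from a database, API, or cloud bucket."""
    return list(records)

def validate(rows):
    """Data-quality gate: drop rows with a missing/non-integer id or an empty email."""
    clean = []
    for row in rows:
        if not isinstance(row.get("id"), int):
            continue
        if not row.get("email"):
            continue
        clean.append(row)
    return clean

def transform(rows):
    """Normalize fields before loading."""
    return [{"id": r["id"], "email": r["email"].strip().lower()} for r in rows]

def load(rows, conn):
    """Load into a target table (SQLite here stands in for the real warehouse)."""
    conn.execute("CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, email TEXT)")
    conn.executemany("INSERT OR REPLACE INTO users (id, email) VALUES (:id, :email)", rows)
    conn.commit()

def run_pipeline(records, conn):
    """End-to-end run: extract -> validate -> transform -> load; returns rows loaded."""
    loaded = transform(validate(extract(records)))
    load(loaded, conn)
    return len(loaded)
```

In a production setting each stage would add monitoring, retries, and failover handling, as the bullets above describe; this sketch only shows the overall shape.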
Requirements
- 7+ years of experience in IT with 5+ years of hands-on experience in Data Engineering.
- Bachelor’s degree in Data Engineering, Computer Science, Data Analytics, or a related field is required.
- Master’s degree preferred.
- Advanced proficiency in Python and SQL, with proven experience in ETL pipeline development.
- Experience with cloud data platforms such as AWS, GCP, or Azure, including cloud-native data engineering tools and services.
- Strong understanding of modern data architecture patterns, including batch processing, streaming, and event-driven systems, along with industry best practices.
- Demonstrated ability to optimize data workflows, troubleshoot complex data issues, and ensure high performance, scalability, and reliability of data systems.
- Strong project management skills, with the ability to work independently, manage priorities, and deliver high-quality outcomes in a fast-paced environment.
Benefits
- Comprehensive compensation package including health insurance and relocation support.
- Flexibility to work remotely.
- Access to certification programs, mentorship, internal mobility, and continuous learning opportunities.
- Inclusive, collaborative, and supportive workplace with regular team-building activities.
- Commitment to IT education, community empowerment, fair practices, environmental sustainability, and gender equality.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
ETL pipeline development, Python, SQL, data ingestion, data storage optimization, data processing, data retrieval, data quality, data governance, data modeling
Soft Skills
mentorship, code reviews, best-practice enforcement, collaboration, project management, independent work, prioritization, communication, training initiatives, documentation
Education & Certifications
Bachelor’s degree in Data Engineering, Bachelor’s degree in Computer Science, Bachelor’s degree in Data Analytics, Master’s degree in a related field