Tech Stack
Tools & technologies: AWS, PySpark, Python
About the role
Key responsibilities & impact
- Gather, analyse and document business requirements for dashboards and analytical use cases
- Translate business needs into data models, transformation logic and dashboard designs
- Design and implement data transformations across Data Lake layers (RAW → TRUSTED → REFINED)
- Develop and maintain data pipelines using PySpark and Python
- Ensure data quality, consistency and compliance with data governance standards
- Build, test and deploy datasets, analyses and dashboards in AWS QuickSight
- Collaborate with Data Stewards, IT teams and business stakeholders
- Support UAT, user training and adoption of self-service analytics
Requirements
What you’ll need
- Proven experience in data engineering and/or data analytics roles
- Strong knowledge of AWS Data Lake architectures
- Hands-on experience with PySpark and Python for data processing
- Experience building dashboards using AWS QuickSight
- Good understanding of BI concepts, KPIs and dimensional data modelling
- Strong communication skills, with the ability to interact with both technical and non-technical stakeholders
Benefits
Comp & perks
- Remote work
ATS Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
data engineering, data analytics, data transformations, data models, PySpark, Python, data pipelines, dashboards, data governance, dimensional data modelling
Soft Skills
communication, collaboration, user training, stakeholder interaction
