Tech Stack
Amazon Redshift, AWS, Azure, Python, SQL, Tableau
About the role
- Structure, organize, and standardize data from multiple sources, applying consistent business rules.
- Consolidate information into the corporate Data Warehouse, ensuring integrity and scalability.
- Refactor manual report and metric generation processes to promote automation and data centralization.
- Implement best practices for data governance, versioning, and data quality, ensuring a single source of truth.
- Support the design and implementation of the new data architecture on AWS while maintaining compatibility with existing systems on Azure.
- Create and maintain efficient data pipelines in Databricks and Synapse, ensuring performance and reliability (a minimal pipeline sketch follows this list).
- Collaborate with business teams to translate operational rules into reliable indicators and metrics.
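For context only, here is a minimal PySpark sketch of the kind of Raw-to-Trusted standardization described above. The table names (raw.orders, trusted.orders), columns, and business rules are hypothetical placeholders, not the team's actual pipeline.

```python
# Minimal sketch: standardize raw order data into a Trusted-layer table on Databricks.
# All table and column names are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Read from a hypothetical Raw-layer table.
raw_orders = spark.read.table("raw.orders")

# Apply consistent business rules: normalize types, drop invalid rows, deduplicate.
trusted_orders = (
    raw_orders
    .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .filter(F.col("amount") > 0)
    .dropDuplicates(["order_id"])
)

# Persist to the Trusted layer for downstream consolidation into the warehouse.
trusted_orders.write.mode("overwrite").saveAsTable("trusted.orders")
```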
Requirements
- Strong knowledge of dimensional and relational modeling (Star Schema, Snowflake); a brief star schema sketch appears at the end of this section.
- Understanding of Data Warehouse layers (Raw, Trusted, Refined) and their purposes.
- Advanced SQL and Python for data manipulation and integration.
- Experience with Azure Synapse and Databricks.
- Familiarity with AWS services (Athena, Redshift, Glue, S3).
- Experience with visualization tools such as Tableau.
- Understanding of data quality, reliability, and versioning concepts.
- Analytical skills and the ability to translate business rules into indicators and metrics.
- Previous experience in migration projects or hybrid architecture (Azure → AWS) is desirable.
Don’t meet all the requirements for this role? That’s okay! At Compass UOL, we encourage continuous development of new talent and turn challenges into opportunities.
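To illustrate the dimensional modeling requirement, below is a minimal star schema sketch in Spark SQL, run from Python for consistency with the pipeline example above. The refined schema and all table and column names are assumptions made for illustration.

```python
# Minimal star schema sketch: one fact table keyed to two dimensions, as typically
# modeled in a Refined layer. Names and columns are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sql("""
    CREATE TABLE IF NOT EXISTS refined.dim_customer (
        customer_key BIGINT,
        customer_name STRING,
        segment STRING
    )
""")

spark.sql("""
    CREATE TABLE IF NOT EXISTS refined.dim_date (
        date_key INT,          -- e.g. 20240131
        full_date DATE,
        year INT,
        month INT
    )
""")

spark.sql("""
    CREATE TABLE IF NOT EXISTS refined.fact_sales (
        customer_key BIGINT,   -- foreign key to dim_customer
        date_key INT,          -- foreign key to dim_date
        amount DECIMAL(18,2),
        quantity INT
    )
""")
```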
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
dimensional modeling, relational modeling, SQL, Python, data manipulation, data integration, data quality, data governance, data pipelines, data architecture
Soft skills
analytical skills, collaboration, communication