Tech Stack
AWS, Azure, Cloud, ETL, Google Cloud Platform, SQL, Vault
About the role
- Design and maintain scalable, business-aligned data models (e.g., Data Vault, dimensional models)
- Ensure high data quality through robust modeling practices, validation checks, and documentation
- Optimize data structures for performance, usability, and maintainability across cloud data platforms like Snowflake
- Build and manage reliable data transformation workflows, preferably using dbt or similar tools
- Support integration of cloud data solutions within the broader data ecosystem (Azure, AWS, or GCP environments)
- Design, develop, and maintain ETL processes to extract, transform, and load data from various sources into our data warehouse
- Proactively identify and address data-related issues, ensuring data accuracy and consistency
- Collaborate with data scientists, analysts, and other teams to understand their data requirements and deliver effective solutions
- Clearly communicate complex technical concepts to non-technical stakeholders
- Maintain thorough documentation for all data engineering processes, ensuring knowledge transfer and best practices
Requirements
- Bachelor's degree in Computer Science, Information Technology, or a related field
- Proven experience in SQL database design and optimization
- Experience in data modeling and designing scalable, efficient data structures
- Strong ETL development skills
- Excellent analytical and problem-solving skills
- Proactive mindset with the ability to work both independently and collaboratively
- Open to feedback and a clear communicator
- Familiarity with cloud platforms (Azure, AWS, GCP) and cloud data warehouse solutions (e.g., Snowflake, Databricks) is preferred
- Experience building and managing scalable data pipelines using Azure Data Factory is a plus
- Some experience with modern data transformation tools such as dbt, or an interest in learning them
- Certifications in relevant technologies are a plus