
Data Engineer
Minor Hotels Europe and Americas
full-time
Location Type: Hybrid
Location: Raleigh • North Carolina • United States
Salary
💰 $90,000 - $96,900 per year
About the role
- Design, develop and maintain scalable and reliable data engineering pipelines to support analytics and business intelligence needs
- Architect, design and implement solutions that meet stakeholders' needs
- Architect, implement and optimize Snowflake data warehouse solutions including data modeling, performance tuning and cost optimization
- Lead and support data migration initiatives, including migration from on-premises or legacy data platforms to Snowflake or other cloud-based solutions
- Develop and manage ELT/ETL processes using modern data integration tools and frameworks
- Participate actively in requirements gathering, data modeling and design sessions
- Prepare high-level and detailed technical specifications for the projects in accordance with security and architecture documentation objectives
- Develop detailed plans and accurate estimates for the completion of the build, system testing and implementation phases of a project
- Collaborate with data architects, analytics teams and business stakeholders to translate requirements into technical solutions
- Run and optimize SQL queries on RDBMSs such as MS SQL Server, MySQL and MariaDB
- Ensure data quality, security, governance and compliance throughout the data lifecycle
- Develop code; document and execute unit, system integration and acceptance tests, and testing tools, for functions of high complexity
- Troubleshoot and resolve performance, data integrity and pipeline reliability issues
- Document architecture, data flows and operational procedures
Requirements
- Strong hands-on experience in data engineering, including data pipeline development and large-scale data processing
- Deep expertise in Snowflake architecture, including virtual warehouses, micro-partitioning, clustering and performance optimization, and security and access control
- Proven experience with data migration projects, including assessment, planning, execution and validation
- Minimum 5 years of experience in software engineering or analytics in creating enterprise data architectures, distributed and microservice software architectures and design patterns
- Strong SQL skills and experience with data modeling (dimensional and/or data vault)
- Core SQL database concepts, including creating DDL and DML scripts, normalization, and running and optimizing SQL queries on RDBMSs
- Experience working with cloud platforms (AWS, Azure or GCP)
- 2 years of application development experience with Hadoop and NoSQL databases such as MongoDB, Cassandra or HBase
- Familiarity with orchestration and data integration tools (e.g., Airflow, dbt, Informatica, Fivetran or similar)
- Prior experience with Liquibase, Git and code repositories (e.g., GitHub)
- Bachelor's degree in Information Technology or Computer Science
- Strong problem-solving skills and the ability to work independently in a fast-paced environment
Benefits
- Paid time off based on employee grade (A-F), as defined by policy: 12-25 vacation days depending on grade
- Company paid holidays
- Personal Days
- Sick Leave
- Medical, dental, and vision coverage (or provincial healthcare coordination in Canada)
- Retirement savings plans (e.g., 401(k) in the U.S., RRSP in Canada)
- Life and disability insurance
- Employee assistance programs
- Other benefits as provided by local policy and eligibility
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
data engineering, data pipeline development, Snowflake architecture, SQL, data modeling, dimensional modeling, data vault, cloud platforms, Hadoop, NoSQL databases
Soft Skills
problem-solving, independent work, collaboration, requirements gathering, communication
Certifications
Bachelor's degree in Information Technology, Bachelor's degree in Computer Science