Tech Stack
Apache, AWS, Azure, ETL, Google Cloud Platform, Informatica, Python, SQL
About the role
- Design, develop, and maintain scalable data pipelines using Snowflake
- Write and optimize complex SQL queries for data transformation and analysis
- Migrate and ingest data from REST APIs into Snowflake (a minimal ingestion sketch follows this list)
- Work with various ETL tools to orchestrate and automate data workflows
- Manage and process unstructured data (JSON, XML, log files, text)
- Collaborate with data analysts, data scientists, and business teams to ensure data availability and accuracy
- Ensure data quality, performance, and governance across the data pipeline
- Promote and enforce information security practices and assess security risks
- Evaluate how data is stored, processed, or transmitted to ensure compliance with data privacy/protection standards
- Ensure data protection measures are integrated throughout the information lifecycle
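For illustration only, below is a minimal sketch of the kind of REST-API-to-Snowflake ingestion this role involves, written in Python with the requests and snowflake-connector-python packages. The endpoint URL, table, schema, and connection parameters are placeholders assumed for the example, not details from this posting.

    # Minimal sketch: pull JSON records from a REST API and land them in a
    # Snowflake VARIANT staging table. All names and credentials are placeholders.
    import json
    import requests
    import snowflake.connector

    API_URL = "https://api.example.com/v1/records"  # hypothetical endpoint

    def fetch_records():
        resp = requests.get(API_URL, timeout=30)
        resp.raise_for_status()
        return resp.json()  # assumes the API returns a JSON array of objects

    def load_to_snowflake(records):
        conn = snowflake.connector.connect(
            account="my_account",   # placeholder connection parameters
            user="etl_user",
            password="***",
            warehouse="ETL_WH",
            database="RAW",
            schema="STAGING",
        )
        try:
            cur = conn.cursor()
            # Land raw payloads in a VARIANT column; transform downstream in SQL.
            cur.execute(
                "CREATE TABLE IF NOT EXISTS api_records_raw ("
                "payload VARIANT, "
                "loaded_at TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP())"
            )
            # Row-by-row insert keeps the sketch simple; a production pipeline
            # would typically stage files and use COPY INTO or Snowpipe instead.
            for record in records:
                cur.execute(
                    "INSERT INTO api_records_raw (payload) SELECT PARSE_JSON(%s)",
                    (json.dumps(record),),
                )
            conn.commit()
        finally:
            conn.close()

    if __name__ == "__main__":
        load_to_snowflake(fetch_records())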
Requirements
- Bachelor's degree in Computer Science, Information Technology, or a related field
- 6+ years of experience as a Data Engineer or in a similar role
- Snowflake expertise (Snowflake SQL and architecture) is a must
- Proven ability to write and debug complex SQL queries
- Experience with ETL processes and tools (Informatica, Talend, Apache NiFi, or dbt)
- Experience migrating and ingesting data from REST APIs into Snowflake
- Familiarity with managing and analyzing unstructured data formats (JSON, XML, logs, text); see the flattening query sketch after this list
- Strong problem-solving and communication skills
- Understanding of data privacy and protection standards (GDPR, CCPA, etc.)
- Must be legally eligible to work in Canada (will consider sponsorship for eligible candidates)
- Nice to have: experience with AWS, Azure, or GCP; exposure to CI/CD pipelines for data engineering; knowledge of Python or other scripting languages
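As a companion to the ingestion sketch above, the snippet below shows the kind of Snowflake SQL used to flatten semi-structured JSON landed in a VARIANT column, run here through the Python connector. The table, payload fields, and connection parameters are again placeholder assumptions made for illustration.

    # Minimal sketch: flatten semi-structured JSON from a VARIANT column
    # using Snowflake's LATERAL FLATTEN. Names below are placeholders and
    # assume a payload shaped like {"id": ..., "status": ..., "items": [...]}.
    import snowflake.connector

    FLATTEN_SQL = """
    SELECT
        payload:id::STRING                  AS record_id,
        payload:created_at::TIMESTAMP_NTZ   AS created_at,
        item.value:name::STRING             AS item_name,
        item.value:qty::NUMBER              AS item_qty
    FROM RAW.STAGING.api_records_raw,
         LATERAL FLATTEN(input => payload:items) AS item
    WHERE payload:status::STRING = 'active'
    """

    def run_flatten_query():
        conn = snowflake.connector.connect(
            account="my_account",   # placeholder connection parameters
            user="analyst_user",
            password="***",
            warehouse="ANALYTICS_WH",
        )
        try:
            cur = conn.cursor()
            cur.execute(FLATTEN_SQL)
            return cur.fetchall()
        finally:
            conn.close()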