Data Engineering Intern

Super.com

internship

Origin: 🇨🇦 Canada

Salary

💰 CA$35 - CA$40 per hour

Job Level

Entry Level

Tech Stack

Airflow, AWS, ETL, Kubernetes

About the role

  • Build highly reliable and scalable ELT pipelines for large datasets.
  • Build and maintain pipelines that ingest large amounts of data from various sources.
  • Set up Reverse-ETL syncs to power operational analytics.
  • Write various styles of automated tests to ensure operational reliability and data integrity.
  • Implement improvements in platform operations and ensure high data quality and uptime.
  • Work directly with analysts on a full-stack data team and collaborate cross-functionally.
  • Work on data-intensive applications and a proprietary event logging platform processing terabytes weekly.
  • Operate within a distributed ELT infrastructure hosted using Kubernetes and AWS.

Requirements

  • You have 6-18 months of Data Engineering or Software Engineering work experience, preferably with exposure to a dynamic startup environment.
  • Experience building scalable data pipelines and powerful reporting tools.
  • Experience with ELT pipelines and data-intensive applications.
  • Experience setting up Reverse-ETL syncs.
  • Experience writing automated tests to ensure operational reliability and data integrity.
  • Familiarity with the modern data stack: Airflow, Fivetran, Snowflake, dbt, Looker, Hightouch.
  • Experience with distributed ELT infrastructure, Kubernetes, and AWS.
  • Ability to learn quickly across languages and technologies.
  • Ability to break down complex problems and propose logical, well-reasoned solutions.