Tech Stack
Cloud, Cyber Security, ETL, Python, Shell Scripting, Spark, SQL, Unix
About the role
- Plan, design, and contribute to database development (Data Warehousing, ETL) and engineering activities
- Support infrastructure-related upgrades involving database and other environments
- Develop and support a Data Ingestion Framework using Spark, Python, Databricks, and Snowpipe
- Implement DevOps techniques and practices (CI/CD, Test Automation, Build Automation, TDD) to enable rapid delivery using tools like Git
- Craft data pipelines, implement data models, and optimize data processes for improved accuracy and accessibility
- Apply machine learning and AI-based techniques to data engineering solutions
- Provide specialist data analysis and expertise to drive decision-making and business insights
- Work as part of the Credit Risk - Data Infrastructure team in Risk Tech, at Vice President (Lead Data & Analytics Engineering) level
Requirements
- 7+ years of experience in Data Warehousing
- Bachelor’s degree in Computer Science or a related field
- Excellent understanding of Data Warehousing concepts (Data modelling, data transformations)
- Strong skills in Relational Databases and in writing complex SQL
- Good programming skills in any language (Python preferred)
- Knowledge of Unix shell scripting
- Knowledge of DevOps practices in the data space (CI/CD, Test Automation, Build Automation, TDD)
- Hands-on experience with Spark, Snowflake, and/or Databricks
- Experience developing/supporting Data Ingestion Frameworks (Spark, Python scripting, Databricks, Snowpipe)
- Familiarity with Git
- Excellent communication and interpersonal skills
- Knowledge of French and English
- Ability to work in Montreal, Quebec, where the positions are located