
Senior Data Engineer
Lightcast
Full-time
Location Type: Hybrid
Location: Quebec City • 🇨🇦 Canada
Job Level
Senior
Tech Stack
Airflow, AWS, Distributed Systems, ETL, Hadoop, Python, Spark, SQL
About the role
- Design, build, and maintain scalable data pipelines (batch & streaming) for analytics, reporting, and ML (see the orchestration sketch after this list).
- Architect applications and automated tools, translating complex requirements into high-performing solutions.
- Define data/software solutions, including hardware needs, to ensure performance and scalability.
- Establish and enforce standards for data integration, modeling, and schema design (dimensional, star, snowflake).
- Optimize SQL queries and schema performance; ensure data quality, consistency, and validation.
- Monitor, troubleshoot, and tune pipelines, databases, and workloads.
- Implement engineering best practices: version control, CI/CD, testing, documentation, and code reviews.
- Collaborate with engineers to integrate data systems into production.
- Mentor junior engineers and provide cross-team technical support.
- Evaluate and recommend new tools, frameworks, and technologies.
- Ensure compliance with data security, governance, access control, and regulations (GDPR, CCPA).
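The pipeline and orchestration work described above is the kind of thing typically expressed as an Airflow DAG. Below is a minimal sketch, assuming Airflow 2.4+ and its TaskFlow API; the DAG name, schedule, and the extract/transform/load steps are hypothetical placeholders, not details taken from the posting.

```python
# Minimal sketch of a daily batch pipeline using Airflow's TaskFlow API.
# All names and logic here are illustrative placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_events_pipeline():
    @task
    def extract() -> list[dict]:
        # Placeholder: pull raw event records from an API or log source.
        return [{"user_id": 1, "event": "login"}]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Placeholder: validate and reshape records for the warehouse.
        return [r for r in records if r.get("user_id") is not None]

    @task
    def load(records: list[dict]) -> None:
        # Placeholder: write the cleaned records to the warehouse or object store.
        print(f"Loaded {len(records)} records")

    # Chaining the calls defines the extract -> transform -> load dependency.
    load(transform(extract()))


example_events_pipeline()
```

In practice each task would call out to the warehouse, object storage, or a Spark job rather than returning in-memory lists, but the dependency structure stays the same.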
Requirements
- Bachelor’s in Computer Science or related field; Master’s preferred
- 5+ years in data engineering, software engineering, or data science
- Expert SQL (optimization, advanced joins, windowing, partitioning, indexing); a windowing sketch follows this list
- Proven Snowflake expertise in data warehousing and analytics
- Strong knowledge of data modeling, engineering best practices, and distributed systems (Spark, Hadoop, Hive, Presto)
- Hands-on experience with ETL/ELT pipelines, API/event/log integration, and workflow orchestration (Airflow/Astronomer required)
- Proficient in DBT for transformation and modeling
- Skilled with AWS data engineering and infrastructure services
- Strong software engineering: clean code, modularization, error handling, CI/CD, automated testing
- Knowledge of object-oriented design, data structures, algorithms, and disaster recovery
- Expertise in scaling and performance optimization of pipelines, databases, and workloads
- Familiar with Python (preferred) and other modern languages
- Ability to analyze complex data sets and deliver actionable insights
- Effective collaborator and mentor with cross-functional teams
- Experience in agile or rapid application development
- Creative, detail-oriented, results-driven, with strong problem-solving and prioritization skills
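As a concrete illustration of the SQL windowing called out above, here is a brief sketch in PySpark, chosen because Python and Spark are both in the stack; the table and column names are invented for illustration and are not from the posting.

```python
# Sketch of SQL-style window functions (ROW_NUMBER, running SUM) in PySpark.
# Dataset and column names are hypothetical examples.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("window-example").getOrCreate()

orders = spark.createDataFrame(
    [("a", "2024-01-01", 10.0), ("a", "2024-01-02", 20.0), ("b", "2024-01-01", 5.0)],
    ["customer_id", "order_date", "amount"],
)

# Partition by customer and order by date, equivalent to
# OVER (PARTITION BY customer_id ORDER BY order_date) in SQL.
w = Window.partitionBy("customer_id").orderBy("order_date")

result = orders.select(
    "customer_id",
    "order_date",
    "amount",
    F.row_number().over(w).alias("order_rank"),
    F.sum("amount").over(w).alias("running_total"),
)

result.show()
```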
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
SQL optimization, Snowflake, data modeling, ETL, ELT, DBT, AWS data engineering, Python, Spark, Hadoop
Soft skills
collaboration, mentoring, problem-solving, prioritization, attention to detail, results-driven, communication, creativity, adaptability, leadership