Software Developer

RBC

Full-time

Location Type: Office

Location: Toronto, Canada

About the role

  • Design and develop enterprise-grade data pipelines, ETL/ELT processes, and APIs
  • Build and optimize data infrastructure components that support real-time analytics, reporting, and the integration of AI/ML into pipelines
  • Implement data quality monitoring, validation, and lineage tracking systems to ensure data integrity across the organization
  • Develop microservices and containerized applications (Docker, Kubernetes) for scalable data processing and integration
  • Write clean, well-documented, and testable code following enterprise standards and best practices
  • Participate in code reviews, performance optimization, and technical design discussions
  • Support CI/CD pipeline development and automation for data platform deployments
  • Troubleshoot production issues and implement solutions with minimal downtime

Requirements

  • 3+ years of experience and strong proficiency in one or more programming languages (Python, Java, Scala, Go) and related frameworks such as Spring
  • Advanced SQL and experience with relational and NoSQL databases
  • Hands-on experience with cloud data platforms (AWS, Azure) or on-premises data warehouse solutions
  • Familiarity with analytical and cloud data lakehouse platforms (Databricks, Snowflake)
  • Understanding of data modeling, ETL patterns, and data governance principles
  • Experience with containerization, orchestration, and deployment technologies (Docker, Kubernetes, Terraform)
  • Knowledge of ML/AI workflows, model deployment, and inference pipelines
  • Proficiency with version control systems (Git) and CI/CD tools
  • Strong problem-solving skills and ability to work independently with minimal supervision
  • Excellent communication skills and ability to collaborate across technical and business teams

Nice to have

  • Experience with data mesh or decentralized data architecture patterns
  • Familiarity with streaming technologies (Kafka, Apache Flink)
  • Knowledge of data security, encryption, and compliance frameworks (GDPR, HIPAA)
  • Exposure to observability and monitoring tools (Grafana, Datadog, Prometheus, ELK stack)
  • Agile/Scrum development experience
  • Background in machine learning operations (MLOps) or data engineering best practices

Benefits

  • A comprehensive Total Rewards Program including bonuses and flexible benefits
  • Competitive compensation
  • Commissions and stock where applicable
  • Leaders who support your development through coaching and managing opportunities
  • Ability to make a difference and lasting impact
  • Work in a dynamic, collaborative, progressive, and high-performing team
  • A world-class training program in financial services
  • Flexible work/life balance options
  • Opportunities to do challenging work

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
Python, Spring, Java, Scala, Go, SQL, ETL, data modeling, data governance, machine learning

Soft Skills
problem-solving, independent work, communication, collaboration