ICBC (Insurance Corporation of British Columbia)

Data Engineer – Intermediate

Employment Type: Full-time

Location Type: Hybrid

Location: North Vancouver, Canada

Salary

CA$92,729 - CA$100,428 per year

About the role

  • Collaborating with customers across the organization (Strategic Analytics, Actuarial, Claims, Insurance, Finance, Driver Licensing, Road Safety, Regulatory Affairs, etc.) to plan, scope, execute and sustain machine learning operations and data science solutions.
  • Operationalizing data science models into machine learning pipelines: optimizing model code, conducting model training and retraining, deploying models, and sustaining them in production (a minimal pipeline sketch follows this list).
  • Responding to data requests and performing data discovery and data profiling to support data science, evaluative and machine learning solutions and projects; reviewing and clarifying data requirements; and ensuring data artifacts comply with policy and privacy protocols.
  • Providing subject matter and data expertise to the Strategic Analytics, Actuarial and Regulatory Affairs departments, as well as ICBC divisional clients, on data sources, reporting workflows, business processes, and the appropriate tools with which to analyze their data.
  • Participating with corporate data user teams: developing data science model validation and test plans, performing user acceptance testing, and supporting data scientists and evaluative & performance metrics analysts in sustaining their end products.
  • Conducting analysis for moderate to complex strategic solutions and proofs of concept: defining data fields, determining data availability, developing information layout, format and interactivity, and presenting findings with clarification.
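
As a rough illustration of the model-operationalization work above, here is a minimal train-persist-score pipeline in Python. It is a hedged sketch, not ICBC's actual stack: scikit-learn and joblib are assumed, the dataset is a bundled toy set, and the model and file name are placeholders.

    # Minimal train -> persist -> score sketch (illustrative only).
    import joblib
    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler

    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    # Bundle preprocessing with the model so the same transform runs at serving time.
    pipeline = Pipeline([
        ("scale", StandardScaler()),
        ("model", LogisticRegression(max_iter=1000)),
    ])
    pipeline.fit(X_train, y_train)
    print(f"holdout accuracy: {pipeline.score(X_test, y_test):.3f}")

    # Persist the fitted pipeline; a serving job would reload and score with it.
    joblib.dump(pipeline, "model.joblib")
    reloaded = joblib.load("model.joblib")
    predictions = reloaded.predict(X_test[:5])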

Requirements

  • Hands-on experience implementing LLM solutions for document extraction, classification and summarization with structured outputs (JSON/function calling), validation, and fallbacks (a structured-output sketch follows this list).
  • Proven experience designing and operating RAG pipelines end-to-end (chunking, embeddings, vector indexing, hybrid retrieval, re-ranking, answers with citations); a toy retrieval sketch follows this list.
  • Experience building LLM agents / tool-using workflows (multi-step orchestration, tool/function integration, guardrails, state/memory, reliability testing); a minimal agent loop is sketched after this list.
  • Hands-on experience building production OCR/document AI pipelines for PDFs and images (pre-/post-processing, layout and table extraction); see the OCR sketch after this list.
  • Strong MLOps/LLMOps experience: model and prompt version control, automated evaluation and regression tests, CI/CD deployment, etc.; a prompt regression-test sketch follows this list.
  • Advanced skill in object-oriented programming languages such as Python or Scala.
  • Working experience with Big Data platforms, with exposure to the Hadoop ecosystem (Spark, HDFS, Hive, Kafka); see the PySpark sketch after this list.
  • Experience with CI/CD and code collaboration tools such as Jenkins, GitLab and Fisheye.
  • Proven experience in building and deploying machine learning models.
  • Advanced working SQL knowledge and experience with relational and NoSQL databases.
  • Strong understanding of data quality management processes, data analysis and data profiling.
  • Ability to apply critical thinking skills to troubleshoot and perform root cause analysis on technical problems and solution design.
  • Ability to provide technical advice and guidance to staff in resolving complex data ingestion and transformation issues.
  • Experience with performance tuning and code optimization.
  • Ability to design, develop and enforce best practices and standards for MLOps and data engineering.
  • Ability to work effectively with a team or independently, as well as lead small teams as needed.
  • Demonstrated leadership in coaching junior staff members and new hires.
  • Understanding of Agile methodologies.
  • Excellent interpersonal, verbal and written communication skills for working with business partners, IS managers, directors and executive-level leaders.
  • Knowledge of or working exposure to cloud technologies, including AWS and Azure; demonstrable working experience with reporting and visualization tools such as Tableau; and familiarity with user interface design, geospatial analysis, and iterative customer-driven design processes.
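
The sketches below give rough, runnable illustrations of several techniques named in the requirements above; all are hedged examples under stated assumptions, not ICBC's stack. First, structured LLM output with validation and a fallback; call_llm is a hypothetical stand-in for whatever LLM client is in use, and the field names are invented:

    import json

    def call_llm(prompt: str) -> str:
        # Placeholder: a real implementation would call an LLM API here.
        return '{"claim_number": "C-123", "incident_date": "2024-01-15", "summary": "rear-end collision"}'

    REQUIRED_FIELDS = {"claim_number", "incident_date", "summary"}

    def extract_fields(document_text: str, max_attempts: int = 3) -> dict:
        """Ask the model for JSON, validate it, and retry before falling back."""
        prompt = (
            "Extract claim_number, incident_date and summary from the document "
            "below. Respond with a single JSON object and nothing else.\n\n"
            + document_text
        )
        for _ in range(max_attempts):
            try:
                parsed = json.loads(call_llm(prompt))
            except json.JSONDecodeError:
                continue  # malformed output: retry
            if isinstance(parsed, dict) and REQUIRED_FIELDS <= parsed.keys():
                return parsed  # validated structured output
        # Fallback: route to manual review rather than returning bad data.
        return {"status": "needs_manual_review", "raw": document_text[:200]}

    print(extract_fields("Claim C-123 filed 2024-01-15 after a rear-end collision."))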
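
Next, a toy RAG retrieval step covering chunking, embedding and cosine-similarity search. A real pipeline would use a proper embedding model and a vector index; the bag-of-words embedding here exists only so the sketch runs standalone (numpy assumed):

    import numpy as np

    def chunk(text: str, size: int = 40) -> list[str]:
        """Split text into fixed-size word windows (naive chunking)."""
        words = text.split()
        return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

    def embed(text: str, dim: int = 256) -> np.ndarray:
        """Toy hashing-trick embedding; a real pipeline calls an embedding model."""
        vec = np.zeros(dim)
        for token in text.lower().split():
            vec[hash(token) % dim] += 1.0
        norm = np.linalg.norm(vec)
        return vec / norm if norm else vec

    def retrieve(query: str, chunks: list[str], k: int = 3) -> list[str]:
        """Rank chunks by cosine similarity to the query and keep the top k."""
        index = np.stack([embed(c) for c in chunks])  # the "vector index"
        scores = index @ embed(query)                 # cosine: vectors are unit-norm
        return [chunks[i] for i in np.argsort(scores)[::-1][:k]]

    corpus = "Collision claims are assessed within 30 days. " * 20
    print(retrieve("how long are claims assessed", chunk(corpus))[0])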
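
A minimal tool-using agent loop. The stubbed call_llm, the tool registry, the bounded step count (a simple guardrail) and the message list (state/memory) are the moving parts the requirement names; the tool, names and data are invented:

    import json

    # Tool registry: name -> callable. Real agents would expose schemas too.
    TOOLS = {
        "lookup_policy": lambda policy_id: {"policy_id": policy_id, "status": "active"},
    }

    def call_llm(messages: list[dict]) -> dict:
        # Placeholder model: asks for one tool call, then gives a final answer.
        if any(m["role"] == "tool" for m in messages):
            return {"type": "final", "content": "Policy P-9 is active."}
        return {"type": "tool_call", "tool": "lookup_policy", "args": {"policy_id": "P-9"}}

    def run_agent(question: str, max_steps: int = 5) -> str:
        messages = [{"role": "user", "content": question}]  # conversation state
        for _ in range(max_steps):  # guardrail: bounded number of steps
            action = call_llm(messages)
            if action["type"] == "final":
                return action["content"]
            result = TOOLS[action["tool"]](**action["args"])
            messages.append({"role": "tool", "content": json.dumps(result)})
        return "Stopped: step limit reached."

    print(run_agent("Is policy P-9 active?"))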
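
An OCR pipeline skeleton for scanned images, assuming Pillow and pytesseract (which needs a local Tesseract install); the pre- and post-processing steps are deliberately simple placeholders, and the file path is hypothetical:

    from PIL import Image, ImageOps
    import pytesseract  # requires the Tesseract binary installed locally

    def ocr_page(path: str) -> str:
        image = Image.open(path)
        # Pre-processing: grayscale and autocontrast often help recognition.
        image = ImageOps.autocontrast(ImageOps.grayscale(image))
        text = pytesseract.image_to_string(image)
        # Post-processing: collapse whitespace; a production pipeline would
        # also do layout/table extraction and field-level validation here.
        return " ".join(text.split())

    # print(ocr_page("scanned_claim.png"))  # path is a placeholder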
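
A sketch of automated prompt regression testing: pin expected behaviour as test cases so a prompt or model change that breaks the contract fails CI. It assumes pytest and that the structured-output sketch above lives in a module named extraction (a hypothetical name):

    import pytest  # run with: pytest test_prompts.py

    from extraction import extract_fields  # hypothetical module holding the sketch above

    CASES = [
        ("Claim C-123 filed 2024-01-15 after a rear-end collision.", "C-123"),
    ]

    @pytest.mark.parametrize("document,expected_claim", CASES)
    def test_claim_number_is_extracted(document, expected_claim):
        result = extract_fields(document)
        # Regression guard: a prompt/model change must not break this contract.
        assert result.get("claim_number") == expected_claim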
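
Finally, a small PySpark aggregation of the kind the Hadoop-ecosystem requirement implies (pyspark assumed; the example data and column names are invented):

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("claims-rollup").getOrCreate()

    # Invented example data; a real job would read from HDFS/Hive/Kafka.
    claims = spark.createDataFrame(
        [("collision", 1200.0), ("collision", 800.0), ("glass", 150.0)],
        ["claim_type", "amount"],
    )

    summary = (
        claims.groupBy("claim_type")
        .agg(F.count("*").alias("n"), F.avg("amount").alias("avg_amount"))
        .orderBy(F.desc("n"))
    )
    summary.show()
    spark.stop()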

Benefits
  • health insurance
  • retirement plans
  • paid time off
  • flexible work arrangements
  • professional development

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
machine learning operations, data science, LLM solutions, RAG pipelines, OCR/document AI, MLOps, object-oriented programming, SQL, data quality management, performance tuning
Soft skills
critical thinking, leadership, interpersonal communication, verbal communication, written communication, teamwork, independent work, coaching, troubleshooting, problem solving