Horizon Software International, LLC

Data Architect, GCP

Contract

Location: 🇨🇦 Canada


Salary

💰 CA$100 - CA$120 per hour

Job Level

Mid-Level · Senior

Tech Stack

Airflow · Apache · BigQuery · Cloud · ERP · ETL · Google Cloud Platform · Informatica · Python · SQL · Terraform · Vault

About the role

  • Define and implement an enterprise-wide data strategy aligned with business goals, including governance, classification, retention, and privacy policies
  • Design conceptual, logical, and physical data models to support analytics and operational workloads; implement star, snowflake, and data vault models
  • Implement SAP S/4HANA CDS views in Google BigQuery
  • Architect data solutions on GCP using BigQuery, Cloud Storage, Dataflow, and Dataproc, applying cost-optimization strategies
  • Design and orchestrate ETL/ELT pipelines using Apache Airflow (Cloud Composer) and Dataflow
  • Integrate data from multiple systems, including SAP BW, SAP HANA, and SAP BusinessObjects, using tools such as SAP SLT and the Google Cortex Framework; leverage Boomi for interoperability
  • Develop complex SQL queries for analytics, transformations, and performance tuning; build automation scripts and utilities in Python
  • Lead on-premises-to-cloud migrations for enterprise data platforms, managing the migration of SAP datasets to GCP while ensuring data integrity and minimal downtime
  • Implement CI/CD pipelines for data workflows and apply infrastructure-as-code principles using GitHub Actions, Cloud Build, and Terraform
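To give a flavor of the modeling work the role describes (this sketch is illustrative and not part of the posting), here is a minimal, stdlib-only Python example of one ELT transform step: splitting flat source rows into a star-schema customer dimension and a fact table. All names (`split_star`, `customer_id`, `amount`) are hypothetical; in practice this kind of step would run under Cloud Composer (Airflow) and load into BigQuery.

```python
# Illustrative sketch only: split flat rows (e.g. a simulated SAP extract)
# into a star-schema dimension plus a fact table keyed by surrogate keys.
# A real pipeline would use the BigQuery client and Airflow operators.

def split_star(rows):
    """Split flat order rows into a customer dimension and a fact table."""
    dim_customer = {}  # natural key -> surrogate key
    facts = []
    for row in rows:
        key = row["customer_id"]
        if key not in dim_customer:
            dim_customer[key] = len(dim_customer) + 1  # assign surrogate key
        facts.append({
            "customer_sk": dim_customer[key],  # fact references dimension
            "amount": row["amount"],
        })
    return dim_customer, facts

if __name__ == "__main__":
    source = [
        {"customer_id": "C1", "amount": 100},
        {"customer_id": "C2", "amount": 50},
        {"customer_id": "C1", "amount": 75},
    ]
    dims, facts = split_star(source)
    print(len(dims), len(facts))  # 2 dimension rows, 3 fact rows
```

The same separation of natural keys from surrogate keys underlies the star, snowflake, and data vault models the role calls for.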

Requirements

  • Strong SAP data integration expertise
  • Proven experience with GCP BigQuery, Cloud Storage, Pub/Sub, Dataflow, Dataproc
  • Hands-on experience with SAP data extraction, modeling, and integration from ERP, BW, and/or HANA systems
  • Experience with SAP SLT and Google Cortex Framework
  • Experience with integration tools such as Boomi, Informatica, or MuleSoft
  • Experience implementing SAP S/4HANA CDS views in Google BigQuery
  • Strong SQL and Python programming skills
  • Good understanding of CDS views and the ABAP language
  • Experience designing and orchestrating ETL/ELT pipelines using Apache Airflow (Cloud Composer) and Dataflow
  • Experience with data modeling techniques: star, snowflake, data vault
  • Knowledge of data governance frameworks, data classification, retention, and privacy policies (GDPR, HIPAA, PIPEDA)
  • Experience leading on-premises-to-cloud migrations for enterprise data platforms (SAP BW/BusinessObjects)
  • Experience implementing CI/CD pipelines for data workflows using GitHub Actions, Cloud Build, and Terraform
  • Ability to implement cost optimization strategies for GCP workloads
  • Experience collaborating with business stakeholders and engineering teams