SereneAid

Enterprise Data Platform Architect, Strategic Technology Consultant

Full-time

Location Type: Remote

Location: Remote • 🇨🇦 Canada


Job Level

Senior / Lead

Tech Stack

Azure • Cloud • ETL • Unity • Vault

About the role

  • Develop the strategic vision and multi-year roadmap for the enterprise data platform.
  • Collaborate closely with service owners and senior stakeholders to prioritize capabilities and influence long-term modernization efforts.
  • Translate business and organizational goals into actionable platform strategies, shaping enterprise architecture, governance, and AI readiness.
  • Promote enterprise adoption through standardized data products, reusable frameworks, and self-service capabilities.
  • Architect and implement cloud-native data solutions using Azure and Databricks, including Delta Lake, Unity Catalog, and Medallion architectures.
  • Lead the design and development of scalable data ingestion, transformation, and orchestration pipelines using Databricks, Azure Data Factory, Event Hub, Functions, and APIs.
  • Define and enforce governance, security, and compliance frameworks including RBAC/ABAC, encryption practices, and secure networking models.
  • Implement DataOps practices and CI/CD automation for Databricks workflows and Azure data pipelines.
  • Optimize performance, reliability, and cost efficiency across Databricks clusters, ETL workloads, and Azure resources.
  • Identify opportunities for AI-driven automation such as automated pipeline generation, intelligent observability, anomaly detection, and generative code scaffolding.
  • Lead proofs of concept and assessments of new Azure, Databricks, and AI capabilities, and recommend adoption paths.
  • Mentor and guide data engineering and platform teams to ensure strong technical standards and continuous improvement.
  • Provide architectural expertise to teams building data products, analytics solutions, and AI workloads using the enterprise data platform.

Requirements

  • Bachelor’s degree in Computer Science, Information Systems, Engineering, or a related field (Master’s preferred).
  • 8–12+ years of experience in data architecture, cloud data platforms, or enterprise data engineering.
  • Proven experience leading large-scale data platform modernization or cloud transformation initiatives.
  • Strong background in the Azure and Databricks ecosystem, including:
      ◦ Delta Lake and Medallion architecture
      ◦ Unity Catalog and schema management
      ◦ Azure Data Factory, Event Hub, Functions, and Key Vault
      ◦ Cluster configuration, performance tuning, and optimization
  • Hands-on experience designing cloud-native data solutions and scalable pipelines.

Benefits

  • Professional development
  • Remote work options

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
data architecture, cloud data platforms, enterprise data engineering, data ingestion, data transformation, data orchestration, DataOps, CI/CD automation, performance tuning, cloud-native data solutions
Soft skills
strategic vision, collaboration, influence, mentoring, guidance, continuous improvement, communication, leadership, organizational goals, stakeholder engagement
Certifications
Bachelor’s degree in Computer Science, Bachelor’s degree in Information Systems, Bachelor’s degree in Engineering, Master’s degree (preferred)