
Data Engineer
Clever Digital Marketing
Full-time
Location Type: Remote
Location: Canada
Salary
💰 CA$120,000 - CA$150,000 per year
About the role
- Implement and manage robust, scalable ETL/ELT pipelines to ingest, transform, and load data from Google Ads, Meta Ads, and Microsoft/Bing Ads APIs — as well as CRM, call tracking, and lead event sources — into our BigQuery data ecosystem
- Own Bronze layer ingestion with full fidelity, audit trails, and zero silent failures
- Build and maintain the permanent Cloud Run connector for the Microsoft Ads Reporting API (SOAP/OAuth2), replacing interim Dataslayer bridges as the platform matures
- Integrate Pub/Sub-based event streaming for real-time lead data flows and speed-to-lead use cases
- Translate designs and specifications from the Data Architect into functional, production-grade data infrastructure and code
- Extend the Medallion Architecture (Bronze → Silver → Gold), owning Silver layer transformation logic and Gold layer Dataform models serving Command Centre, client reporting, and future AI/ML consumers
- Own the CDMID join logic that unifies CRM, ad platform, and call tracking data into a coherent lead record across all client accounts
- Develop and manage data models within BigQuery, ensuring data is organized efficiently for analytics and AI/ML workloads
- Apply dimensional modeling and star schema design principles to build a single source of truth for all reporting and analytics across our home improvement advertiser base
- Deliver the clean, trusted, cost-metric-driven datasets that allow our team to move beyond vanity metrics to cost-per-issued-lead and cost-per-demo insights that drive real client decisions
- Write Python-based automation scripts and leverage GCP services — Cloud Run, Cloud Functions, Pub/Sub, and Dataflow — to orchestrate data workflows and eliminate manual processes
- Graduate CRM push workflows off Zapier and onto a reliable, auditable Cloud Run pipeline
- Proactively monitor and tune data pipelines and BigQuery queries for performance and cost-efficiency
- Implement data quality checks, validation rules, and monitoring across the full pipeline to ensure accuracy, completeness, and timeliness of all data assets
- Build the observability and alerting layer so the team knows about data issues before clients do
- Maintain living architecture documentation as a reliable source of truth across workstreams
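The data quality checks described above (accuracy, completeness, timeliness at the Bronze layer) can be sketched as a small validation routine. This is a minimal illustration only; the field names (`lead_id`, `source_platform`, `event_timestamp`) and the 24-hour lag threshold are hypothetical, not taken from CDM's actual schema.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical required fields for a raw lead event landing in the Bronze layer.
REQUIRED_FIELDS = ("lead_id", "source_platform", "event_timestamp")

def validate_lead_event(record: dict, max_lag_hours: int = 24) -> list[str]:
    """Return a list of data quality violations; an empty list means the record passes."""
    violations = []
    # Completeness: every required field must be present and non-empty.
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            violations.append(f"missing:{field}")
    # Timeliness: flag events older than the allowed ingestion lag.
    ts = record.get("event_timestamp")
    if ts:
        age = datetime.now(timezone.utc) - datetime.fromisoformat(ts)
        if age > timedelta(hours=max_lag_hours):
            violations.append("stale:event_timestamp")
    return violations
```

In a production pipeline, a check like this would run before rows are written to BigQuery, with violations routed to the observability and alerting layer rather than silently dropped.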
Requirements
- 3–5+ years of hands-on data engineering experience building and maintaining production pipelines — not analytical support or BI roles; you have owned infrastructure end-to-end
- Strong, practical GCP expertise: BigQuery, Cloud Run, Dataflow, Cloud Storage, and Pub/Sub are essential
- Deep proficiency in SQL and Python for data transformation, scripting, and pipeline development
- Direct production experience with at least one major ad platform API: Google Ads, Meta Marketing API, or Microsoft Ads Reporting API (SOAP/OAuth2)
- Proven experience designing, building, and operationalizing complex ETL/ELT pipelines from a variety of sources
- Solid understanding of data warehousing concepts — dimensional modeling, star schemas, and analytical schema design
- Experience with real-time data streaming technologies such as Pub/Sub or equivalent
- Familiarity with data transformation frameworks: Dataform, dbt, SQLMesh, or equivalent
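The dimensional-modeling expectation above boils down to joining fact tables on a shared dimension key to produce the cost metrics the posting emphasizes. The sketch below shows a cost-per-issued-lead aggregation over two hypothetical Gold-layer fact sets; the account key, column names, and sample figures are invented for illustration.

```python
# Hypothetical Gold-layer rows: ad spend facts and lead facts, both keyed
# by a client account dimension (a stand-in for the CDMID join key).
spend_facts = [
    {"account_id": "A1", "spend_cad": 500.0},
    {"account_id": "A1", "spend_cad": 250.0},
    {"account_id": "A2", "spend_cad": 300.0},
]
lead_facts = [
    {"account_id": "A1", "issued": True},
    {"account_id": "A1", "issued": True},
    {"account_id": "A1", "issued": False},  # raw lead, never issued
    {"account_id": "A2", "issued": True},
]

def cost_per_issued_lead(spend: list[dict], leads: list[dict]) -> dict:
    """Aggregate spend and issued-lead counts per account, then divide."""
    totals: dict = {}
    issued: dict = {}
    for row in spend:
        totals[row["account_id"]] = totals.get(row["account_id"], 0.0) + row["spend_cad"]
    for row in leads:
        if row["issued"]:
            issued[row["account_id"]] = issued.get(row["account_id"], 0) + 1
    # Only accounts with at least one issued lead get a metric.
    return {acct: totals[acct] / issued[acct] for acct in totals if issued.get(acct)}
```

In practice this logic would live in a Dataform (or dbt) model as SQL over BigQuery fact and dimension tables; the Python here just makes the shape of the calculation concrete.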
Benefits
- An incredible team and culture — high-performance, feedback-oriented, and data-driven, where everyone is empowered to succeed. Our culture thrives on collaboration, extreme ownership, and Kaizen. This is more than a workplace — it's a place where you'll grow, learn, and thrive alongside passionate teammates who are as invested in your success as you are
- Full ownership of a data platform being built from the ground up — your architecture decisions will shape CDM's competitive advantage for years
- Be part of the #1 fastest-growing marketing company in Canada, redefining performance marketing for the home improvement industry
- A modern, GCP-native stack with AI embedded in daily workflows — Claude and Gemini are tools you'll use every day, not novelties
- Exposure to AI/NLQ, MMM modeling, and ML optimization layers as the proprietary platform matures
- Above-industry competitive compensation that reflects your expertise and contributions
- Fully remote work environment with the flexibility to work from anywhere
- 3+ weeks of paid time off annually, with the freedom to use them as you see fit
- Comprehensive group benefits including health, dental, and vision
- Company-provided MacBook Pro and home office budget to help you create your ideal workspace
- A work abroad policy to support you when you need a fresh perspective
- Merch deliveries and opportunities to connect with teammates at meetups and company-wide events
- A clear path to increased scope, leadership, and platform ownership for those who demonstrate exceptional performance and impact
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
ETL, ELT, data engineering, SQL, Python, dimensional modeling, star schema design, data transformation, data quality checks, data modeling
Soft Skills
problem-solving, attention to detail, proactive monitoring, documentation