
Senior Data Integration Engineer, AWS, SQL, ETL Pipelines
Engineered Intelligence Inc
Full-time
Location Type: Remote
Location: Remote • 🇨🇦 Canada
Job Level
Senior
Tech Stack
AWS • Cloud • ETL • Kafka • SQL
About the role
- Own and maintain data ingestion pipelines and workflows from external sources into our analytics platform
- Take over and document existing ETL processes and ensure smooth knowledge transfer and continuity
- Define and establish integration best practices, development standards, and deployment processes
- Build scalable and reliable data pipelines using modern ETL/ELT tools and frameworks
- Lead efforts in data validation, quality assurance, error handling, and monitoring across integration layers
- Support onboarding of new clients and third-party data sources, including file-based, database, and API integrations
- Collaborate closely with product, data engineering, analytics, and customer-facing teams to define data contracts and SLAs
- Provide mentorship, guidance, and support hiring as the integration function grows into a dedicated team
- Evaluate and recommend modern integration tooling and architectural improvements
- Ensure comprehensive documentation, traceability, and support procedures are in place for all integration processes
Requirements
- 5+ years of experience in data integration, ETL/ELT development, or data engineering roles
- Proven ability to design and manage robust integration architectures, including both batch and API-driven pipelines
- Strong proficiency in SQL and data transformation logic
- Experience with a range of ETL tools and cloud data services
- Deep understanding of data warehousing, modeling, data contracts, and data quality principles
- Comfortable owning both hands-on implementation and higher-level architectural planning
- Familiarity with cloud infrastructure and services, preferably AWS
- Strong communication, documentation, and stakeholder collaboration skills
- Ability to operate independently and drive initiatives in a fast-paced work environment
Nice to Have
- Experience establishing integration or data operations functions from the ground up
- Familiarity with DevOps practices for data pipelines (CI/CD, GitOps)
- Exposure to event-driven architectures (e.g., Kafka, Kinesis)
- Experience mentoring or leading other developers
Benefits
- Remote-First Work Environment – Enjoy the flexibility of fully remote work, with office spaces available in Calgary and Toronto for in-person collaboration.
- Flexible Hours – Work around a core schedule (10:00 AM – 3:00 PM) while maintaining a work-life balance that suits you.
- Autonomy & Growth – Take ownership of diverse responsibilities, explore cross-disciplinary opportunities, and advance your career in a dynamic, fast-growing company.
- Impactful Work – Leverage your skills to contribute meaningfully, build an impressive track record, and be part of an exciting business from its early stages.
- Engaging Team Culture – Join a supportive, interactive, and collaborative remote work environment.
- Competitive Compensation – Receive a competitive salary and flexible benefits package.
- Paid Time Off & Wellness Support – Enjoy generous paid time off and Health Spending Accounts to prioritize your well-being.
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
ETL • ELT • SQL • data transformation • data warehousing • data modeling • data quality • API integration • cloud data services • DevOps
Soft skills
communication • documentation • stakeholder collaboration • mentorship • guidance • independence • initiative • leadership • organization • problem-solving