Senior Data Engineer – DV 2.0 Certified

Dijital Team

Full-time

Location Type: Remote

Location: Sri Lanka

About the role

  • Build and maintain ELT pipelines and transformations in Snowflake, with a strong focus on SQL quality and performance
  • Develop and optimise Snowflake stored procedures / SQL-based transformations aligned to engineering standards (a minimal incremental-load procedure sketch follows this list)
  • Work within an Azure-based ecosystem and contribute to secure, reliable pipeline operations
  • Design and implement Data Vault 2.0 structures (Hubs, Links, Satellites), including business vault logic where required (illustrative Hub/Link/Satellite DDL follows this list)
  • Re-model legacy reporting data into scalable, auditable Data Vault patterns
  • Collaborate with architects and domain experts to interpret and codify business rules—especially where rules differ across legacy systems
  • Create and maintain reporting-ready layers (e.g., information marts / dimensional patterns where appropriate) to support Power BI (a mart-view sketch follows this list)
  • Support new delivery methods such as Power Automate outputs for end users where required
  • Work with ingestion patterns where source data lands as Parquet (e.g., delta/Parquet structured files); a COPY INTO sketch follows this list
  • Use and support orchestration via Azure Data Factory (ADF)—especially for onboarding new sources and maintaining reliable workflows
  • Operate in an Agile delivery environment (e.g., Jira / Scrum), contributing to sprint planning, estimation, and continuous improvement
  • Participate in peer reviews and knowledge sharing to uplift capability across the team
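
To ground the Data Vault 2.0 responsibilities above, here is a minimal, illustrative DDL sketch in Snowflake SQL. All schema, table, and column names (raw_vault, hub_customer, customer_hk, and so on) are hypothetical, invented for this example rather than taken from the role.

```sql
-- Illustrative sketch only: every name below is hypothetical.

-- Hub: one row per unique business key, identified by a hash key.
CREATE TABLE IF NOT EXISTS raw_vault.hub_customer (
    customer_hk    BINARY(32)    NOT NULL,   -- e.g. SHA-256 of the business key
    customer_id    VARCHAR       NOT NULL,   -- source business key
    load_dts       TIMESTAMP_NTZ NOT NULL,   -- load timestamp
    record_source  VARCHAR       NOT NULL,   -- originating system
    CONSTRAINT pk_hub_customer PRIMARY KEY (customer_hk)
);

-- Link: a relationship between hubs, keyed by a hash of the combined keys.
CREATE TABLE IF NOT EXISTS raw_vault.link_customer_order (
    customer_order_hk BINARY(32)    NOT NULL,
    customer_hk       BINARY(32)    NOT NULL,
    order_hk          BINARY(32)    NOT NULL,
    load_dts          TIMESTAMP_NTZ NOT NULL,
    record_source     VARCHAR       NOT NULL,
    CONSTRAINT pk_link_customer_order PRIMARY KEY (customer_order_hk)
);

-- Satellite: descriptive attributes tracked over time against a hub.
CREATE TABLE IF NOT EXISTS raw_vault.sat_customer_details (
    customer_hk    BINARY(32)    NOT NULL,
    load_dts       TIMESTAMP_NTZ NOT NULL,
    hash_diff      BINARY(32)    NOT NULL,   -- change-detection hash of the payload
    customer_name  VARCHAR,
    customer_email VARCHAR,
    record_source  VARCHAR       NOT NULL,
    CONSTRAINT pk_sat_customer_details PRIMARY KEY (customer_hk, load_dts)
);
```

Note that Snowflake records but does not enforce primary key constraints; they still document the grain of each table for reviewers and modelling tools.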
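For the Parquet landing pattern, one common Snowflake approach is an external stage over cloud storage plus COPY INTO. The stage URL, file format, and staging table below are assumptions, and the target table etl.stg_orders is assumed to already exist.

```sql
-- Illustrative sketch only: stage, format, and table names are hypothetical.
CREATE FILE FORMAT IF NOT EXISTS etl.parquet_ff TYPE = PARQUET;

CREATE STAGE IF NOT EXISTS etl.landing_stage
  URL = 'azure://myaccount.blob.core.windows.net/landing/'  -- hypothetical container
  FILE_FORMAT = etl.parquet_ff;  -- authentication (storage integration) omitted

-- Load Parquet files into an existing staging table, matching columns by name.
COPY INTO etl.stg_orders
  FROM @etl.landing_stage/orders/
  FILE_FORMAT = (FORMAT_NAME = 'etl.parquet_ff')
  MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
```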
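The stored-procedure work could follow an incremental, insert-only satellite load. Below is a hedged Snowflake Scripting sketch; the staging table and the hash_diff convention carry over from the hypothetical names above.

```sql
-- Illustrative sketch only: names and conventions are hypothetical.
CREATE OR REPLACE PROCEDURE raw_vault.load_sat_customer_details()
RETURNS VARCHAR
LANGUAGE SQL
AS
$$
BEGIN
    -- Insert only rows whose payload hash is not already in the satellite.
    INSERT INTO raw_vault.sat_customer_details
        (customer_hk, load_dts, hash_diff, customer_name, customer_email, record_source)
    SELECT stg.customer_hk,
           CURRENT_TIMESTAMP(),
           stg.hash_diff,
           stg.customer_name,
           stg.customer_email,
           stg.record_source
    FROM etl.stg_customer stg
    LEFT JOIN raw_vault.sat_customer_details sat
      ON  sat.customer_hk = stg.customer_hk
      AND sat.hash_diff   = stg.hash_diff
    WHERE sat.customer_hk IS NULL;

    RETURN 'sat_customer_details: incremental load complete';
END;
$$;
```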
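Finally, the reporting-ready layer for Power BI is often a set of views that flatten the vault into dimensional shapes. Here is a sketch of a "current state" dimension, under the same hypothetical names:

```sql
-- Illustrative sketch only: keeps the latest satellite row per customer.
CREATE OR REPLACE VIEW info_mart.dim_customer AS
SELECT h.customer_hk,
       h.customer_id,
       s.customer_name,
       s.customer_email
FROM raw_vault.hub_customer h
JOIN raw_vault.sat_customer_details s
  ON s.customer_hk = h.customer_hk
QUALIFY ROW_NUMBER() OVER (
    PARTITION BY s.customer_hk ORDER BY s.load_dts DESC
) = 1;
```

Power BI can then consume the info_mart views directly, keeping raw-vault tables out of the reporting surface.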

Requirements

  • Strong experience as a Data Engineer delivering enterprise data platforms and pipelines
  • 4–5 years of experience in a similar capacity or in Data Engineer roles
  • Demonstrated capability with Data Vault 2.0 (modelling + implementation)
  • Strong SQL skills (complex transformations, performance tuning, procedural logic)
  • Hands-on experience with Snowflake, plus the ability to deepen that proficiency by building on strong SQL foundations
  • Experience working with cloud data ecosystems (ideally Azure)
  • Ability to translate business/reporting requirements into scalable data models and transformations
  • Experience with Azure Data Factory (ADF)
  • Familiarity with Parquet / delta-style file-based ingestion patterns
  • Experience building reporting layers/information marts and/or dimensional modelling
  • Exposure to Data Vault tooling such as IRIS (or similar modelling/automation tools)
  • Experience with CI/CD / DevOps practices for data engineering
  • Data Vault 2.0 certification is mandatory
  • Snowflake certification is desirable
  • Azure fundamentals certification is desirable

Benefits

  • Flexible working arrangements

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
SQL, Snowflake, Data Vault 2.0, Azure Data Factory, Power BI, Parquet, CI/CD, DevOps, data modelling, performance tuning
Soft Skills
collaboration, communication, problem-solving, agile methodology, knowledge sharing, continuous improvement, business analysis, teamwork, adaptability, critical thinking
Certifications
Data Vault 2.0 certification, Snowflake certification, Azure fundamentals certification