
Data Engineering Consultant
GFT Technologies
full-time
Location Type: Hybrid
Location: Montréal • Canada
About the role
- Develop new data models and maintain existing ones
- Implement and maintain data quality controls, including security, compliance and validation rules, completeness checks, transformations and business rules, and deduplication (match & merge)
- Build a data governance process based on duplicate elimination (match & merge)
- Support the production environment and handle incident remediation
- Define and maintain current and future data flows
- Keep standing documentation up to date
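The match & merge deduplication named in the responsibilities above can be sketched in plain Java. This is a minimal illustration only: the `Customer` record, the email-based match key, and the "newer record wins, older fills gaps" survivorship rule are all hypothetical choices, not details taken from the posting or from TIBCO EBX.

```java
import java.util.*;

public class MatchMerge {
    // Hypothetical customer record; updatedAt is a logical timestamp.
    record Customer(String email, String name, String phone, long updatedAt) {}

    // Match: a simple deterministic key (normalized email).
    static String matchKey(Customer c) {
        return c.email().trim().toLowerCase();
    }

    // Merge (survivorship): the newer record wins; the older one fills null gaps.
    static Customer merge(Customer a, Customer b) {
        Customer newer = a.updatedAt() >= b.updatedAt() ? a : b;
        Customer older = newer == a ? b : a;
        return new Customer(
            newer.email(),
            newer.name() != null ? newer.name() : older.name(),
            newer.phone() != null ? newer.phone() : older.phone(),
            newer.updatedAt());
    }

    // Collapse matching records into one "golden" record per match key.
    static Collection<Customer> deduplicate(List<Customer> input) {
        Map<String, Customer> golden = new LinkedHashMap<>();
        for (Customer c : input) {
            golden.merge(matchKey(c), c, MatchMerge::merge);
        }
        return golden.values();
    }

    public static void main(String[] args) {
        var records = List.of(
            new Customer("Ada@ex.com", "Ada Lovelace", null, 1L),
            new Customer("ada@ex.com", null, "555-0100", 2L),
            new Customer("bob@ex.com", "Bob", null, 1L));
        // Two golden records remain: Ada (fields merged) and Bob.
        deduplicate(records).forEach(System.out::println);
    }
}
```

In a real MDM tool such as EBX, matching is typically fuzzy (phonetic or distance-based) and survivorship is rule-driven per attribute; the exact-key version here only shows the overall match-then-merge shape.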
Requirements
- Minimum of eight years of relevant experience
- Relevant experience with agile methodologies (Scrum, Kanban)
- Advanced proficiency in TIBCO EBX 6
- Proficiency in developing APIs in EBX
- Proficiency in event-driven architecture (e.g., Kafka) and batch processing
- Expertise in implementing decision trees, data profiling, and search algorithm optimization
- Expertise in high-volume data processing within the financial domain
- Knowledge of Java programming (Java 17+)
- Knowledge of relational databases (Oracle) and SQL
- Familiarity with CI/CD (e.g., GitHub Actions, Concourse, Jenkins, etc.)
- Knowledge of hybrid architecture (on-premises and cloud)
- Familiarity with monitoring/observability tools such as Dynatrace
Benefits
- Six weeks of vacation for all employees
- 35-hour workweek with fully remote or hybrid options, according to your needs
- A variety of social and sports activities
- Generous individual or family insurance coverage from day one
- And many other employee benefits that we will be happy to present!
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard Skills & Tools
data models • data quality • data governance • APIs • event-driven architecture • batch processing • decision trees • data profiling • search algorithm optimization • Java
Soft Skills
incident remediation • documentation