Gaming Innovation Group

Big Data Engineer

Full-time

Location Type: Hybrid

Location: Barcelona, Spain

About the role

  • Data Pipeline Design & Implementation: Design, develop, and maintain robust data pipelines for the extraction, transformation, and loading (ETL) of large volumes of structured and unstructured data, ensuring scalability and efficiency.
  • Migration Execution & Validation: Lead the execution of data migrations, including data extraction, transformation, schema mapping, validation, and cutover, ensuring data integrity and consistency.
  • Collaboration & Stakeholder Engagement: Work closely with product, technology, delivery, and compliance teams to gather migration requirements, align on data definitions, and ensure successful migration outcomes.
  • Automation & Optimization: Implement automation solutions for ETL/migration processes to improve efficiency and reduce manual intervention.
  • Monitoring & Troubleshooting: Monitor and troubleshoot migration pipelines, addressing issues promptly to ensure smooth operations.
  • Documentation & Process Improvement: Document migration processes, contribute to process improvement initiatives, and ensure adherence to best practices.
  • Technical Support & Incident Resolution: Provide technical support during incident resolution related to migration pipelines and data discrepancies.
  • Agile Participation: Actively participate in all Agile Scrum ceremonies, including daily stand-ups, refinement sessions, and retrospectives.
  • Innovation & Continuous Improvement: Propose and implement new ideas to improve existing products and services, fostering a culture of innovation.
  • Code Review & Mentorship: Perform code reviews for other engineers, ensuring code quality and knowledge sharing within the team.
  • Stakeholder Communication: Communicate effectively with stakeholders, ensuring that information reaches the right audience accurately.

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
  • 3+ years of experience in data engineering, ETL processes, or data migration.
  • Strong SQL skills and experience working with structured and unstructured datasets.
  • Knowledge of data governance, compliance, and data quality assurance.
  • Solution-oriented mindset with the ability to troubleshoot and resolve issues.
  • Strong experience with Big Data and streaming technologies, including ClickHouse, Kafka, and NiFi.
  • Strong knowledge of microservices, APIs, and GraphQL.

Preferred Skills

  • Experience in the iGaming industry or regulated markets.
  • Experience with automation of ETL/migration processes.

Benefits

  • Great career development opportunities
  • Hybrid working model
  • International Health Insurance
  • Health and Wellbeing Package (350 EUR per year)
  • Birthday Day Off
  • Me Time - 1 day off per year

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
data pipeline design, ETL, data migration, SQL, Big Data, streaming technologies, ClickHouse, Kafka, NiFi, microservices
Soft Skills
solution-oriented mindset, troubleshooting, collaboration, stakeholder engagement, communication, innovation, mentorship, process improvement, documentation, Agile participation
Education
Bachelor’s in Computer Science, Master’s in Computer Science, Bachelor’s in Engineering, Master’s in Engineering