Marcura

Data Engineer

Full-time

Location Type: Remote

Location: United Kingdom

About the role

You will bring domain expertise in data engineering to the team, including the ETL process using modern tools and methodologies. You will play a key role in building scalable data structures, with a specific focus on implementing Data Vault 2.0 to ensure a flexible and auditable data foundation.
**Roles and Responsibilities:**

1. Data engineering best practices
   - Contribute to the data team's adherence to data engineering best practices across pipeline design, data quality monitoring, storage, versioning, security, testing, documentation, cost, and error handling.
2. Data transformation in DBT
   - Ensure that the daily DBT build succeeds, including full test coverage of existing models (a minimal build sketch follows this item).
   - Create new data models in collaboration with the data analysts, applying Data Vault 2.0 principles where appropriate to handle complex data relationships and historical tracking.
   - Add new tests to enhance data quality and maintain the integrity of the data warehouse.
   - Incorporate new data sources into the warehouse architecture.
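
As referenced in item 2, here is a minimal sketch of a daily build step using dbt-core's programmatic entry point (available from dbt-core 1.5). The project directory is a hypothetical placeholder, and invoking dbt from a Python script is only one option; scheduling `dbt build` from Cloud Composer is a common alternative.

```python
# Minimal sketch: run the daily dbt build (models, tests, seeds, snapshots).
# Assumes dbt-core >= 1.5; the project directory is a hypothetical placeholder.
from dbt.cli.main import dbtRunner, dbtRunnerResult


def run_daily_build() -> bool:
    runner = dbtRunner()
    # "build" executes models and their tests in DAG order, so a green
    # build also means the existing test coverage passed.
    result: dbtRunnerResult = runner.invoke(
        ["build", "--project-dir", "/opt/dbt/warehouse", "--fail-fast"]
    )
    return result.success


if __name__ == "__main__":
    raise SystemExit(0 if run_daily_build() else 1)
```
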
3. Data extraction
   - Develop and maintain our data pipelines in Stitch, Fivetran, Segment, and Apache Airflow (Google Cloud Composer); a Composer DAG sketch follows this item.
   - Evaluate when it is appropriate to use managed tools versus building custom data pipelines in Cloud Composer.
   - Ensure that data extraction jobs run successfully each day.
   - Collaborate with engineers from MarTrust to add new data sets to our data extraction jobs.
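
The DAG sketch referenced in item 3, written against Apache Airflow 2.4+ as it runs on Cloud Composer. The DAG ID, schedule, and extraction task are illustrative assumptions, not the team's actual pipeline.

```python
# Sketch of a daily extraction DAG for Cloud Composer (Apache Airflow 2.4+).
# Names, schedule, and extract logic are illustrative assumptions.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Placeholder: pull the previous day's records from a source system
    # and land them in a staging area for loading into BigQuery.
    ...


with DAG(
    dag_id="daily_orders_extract",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # matches the "run successfully each day" requirement
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
):
    PythonOperator(task_id="extract_orders", python_callable=extract_orders)
```
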
4. Data warehousing in BigQuery
   - Keep the data in our warehouse secure and ensure that daily jobs in BigQuery run successfully.
   - Support the evolution of our BigQuery schema to accommodate Data Vault 2.0 structures (Hubs, Links, and Satellites) for long-term scalability; a Hub table sketch follows this item.
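
The Hub sketch referenced in item 4: in Data Vault 2.0, a Hub stores only a business key, its hash key, and load metadata, while Links relate Hubs and Satellites carry descriptive attributes over time. The sketch below uses the google-cloud-bigquery client; the project, dataset, and column names are invented for illustration.

```python
# Sketch: create a Data Vault 2.0 Hub table in BigQuery.
# Project, dataset, and column names are illustrative, not the real schema.
from google.cloud import bigquery

client = bigquery.Client()

hub_customer = bigquery.Table(
    "my-project.raw_vault.hub_customer",  # hypothetical dataset and table
    schema=[
        bigquery.SchemaField("hub_customer_hk", "STRING", mode="REQUIRED"),  # hash key
        bigquery.SchemaField("customer_bk", "STRING", mode="REQUIRED"),  # business key
        bigquery.SchemaField("load_date", "TIMESTAMP", mode="REQUIRED"),
        bigquery.SchemaField("record_source", "STRING", mode="REQUIRED"),
    ],
)
client.create_table(hub_customer, exists_ok=True)
```

Link and Satellite tables follow the same pattern, adding the related Hubs' hash keys and, for Satellites, a hashdiff column to detect attribute changes.
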
5. Data governance and security
   - Data Quality (DQ): Implement and monitor automated data quality checks and observability to ensure the accuracy and reliability of downstream reporting.
   - Access Control: Manage and enforce granular access control policies (IAM) within BigQuery and GCP so that data is accessible only to authorized users; an access-control sketch follows this list.
   - Governance: Ensure all data processes comply with security standards and data privacy regulations, maintaining clear documentation of lineage and metadata.
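
The access-control sketch referenced in item 5 grants one group read-only access to a single dataset, following the documented google-cloud-bigquery pattern. The dataset ID and group address are hypothetical; in practice such grants are often managed declaratively (for example via Terraform) rather than in ad-hoc scripts.

```python
# Sketch: grant a group read-only access to one dataset (least privilege).
# The dataset ID and group email are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()
dataset = client.get_dataset("my-project.analytics_marts")

entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="READER",
        entity_type="groupByEmail",
        entity_id="data-analysts@example.com",
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])  # persist only this field
```
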

Requirements

  • **Data Modeling:** Solid understanding and hands-on experience with Data Vault 2.0 methodologies.
  • **GCP Infrastructure:** Experience with Google BigQuery and Cloud Composer (Apache Airflow).
  • **Modern Data Stack:** Proficiency in DBT for data transformation and data quality testing.
  • **Governance & Security:** Practical experience managing data access controls, security best practices, and DQ frameworks.
  • **Pipeline Tools:** Experience with managed ELT services like Fivetran, Stitch, or Segment.
  • **Remote Work:** Ability to work effectively in a fully remote, distributed team environment.

Benefits

  • **Competitive Salary and Bonus**: We reward your expertise and contributions.
  • **Inclusive Onboarding Experience**: Our onboarding program is designed to set you up for success right from day one.
  • **Marcura Wellness Zone**: We value your work-life balance and well-being.
  • **Global Opportunities**: Be part of an ambitious, expanding company with a local touch.
  • **Diverse, Supportive Work Culture**: We’re committed to inclusion, diversity, and a sense of belonging for all team members.
Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
data engineering, ETL, Data Vault 2.0, data transformation, DBT, data modeling, data quality monitoring, data warehousing, data pipelines, data governance
Soft Skills
collaboration, remote work, communication