Bridgeway Benefit Technologies

Senior Data Engineer

Full-time

Location: Remote • 🇺🇸 United States

Visit company website
AI Apply
Apply

Job Level

Senior

Tech Stack

Airflow, Azure, Kafka, Python, Spark, SQL, Tableau, Terraform, Unity

About the role

  • Design, develop, and maintain a scalable data warehouse/lakehouse environment.
  • Design and implement ELT pipelines to ingest, transform, and deliver high-quality data for analytics and reporting, following current best practices such as “pipelines as code”.
  • Ensure data security and compliance through role-based access controls, encryption, masking, and governance best practices for the compliant handling of sensitive information.
  • Optimize performance of data workflows and storage for cost efficiency and speed.
  • Partner with engineers, analysts, and stakeholders to meet data needs; balance cost, performance, simplicity, and time-to-value while mentoring teams and documenting standards.
  • Define and implement robust testing frameworks, enforce data contracts, and establish observability practices including lineage tracking, SLAs/SLOs, and incident response runbooks to maintain data integrity and trustworthiness.
  • Monitor, troubleshoot, and resolve data and automation issues.
  • Collaborate within an Agile-Scrum framework and develop comprehensive technical design documentation to ensure efficient and successful delivery.
  • Serve as a trusted expert on organizational data domains, processes, and best practices.

Requirements

  • 5+ years of experience in data engineering and ELT with a focus on large-scale data platforms
  • 3+ years of experience with Databricks
  • Advanced proficiency in analytical SQL, including ANSI SQL, T-SQL, and Spark SQL
  • Strong Python skills for data engineering
  • Expertise in data modeling
  • Hands-on experience with data quality and observability practices (tests, contracts, lineage tracking, alerts)
  • Practical knowledge of orchestration tools and CI/CD concepts for data workflows
  • Excellent communication and a track record of technical leadership and mentoring
  • Strong understanding of integrating data solutions with AI and machine learning models
  • Strong problem-solving skills and attention to detail
  • Experience with version control systems like Git preferred
  • Strong understanding of data governance and best practices in data management, with hands-on experience using Unity Catalog
  • Hands-on experience in designing and managing data pipelines using Delta Live Tables (DLT) on Databricks
  • Experience with streaming and ingestion tools such as Kafka, Kinesis, Event Hubs, Debezium, or Fivetran
  • Knowledge of DAX, LookML, or dbt; Airflow, Dagster, or Prefect; Terraform; Azure DevOps; Power BI, Looker, or Tableau; and GitHub Copilot is a plus
  • Bachelor’s degree in Computer Science, Information Technology, or a related field. Master’s degree preferred

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
data engineering, ELT, analytical SQL, T-SQL, Spark SQL, Python, data modeling, data quality, observability practices, data governance
Soft skills
communication, technical leadership, mentoring, problem-solving, attention to detail
Certifications
Bachelor’s degree in Computer Science, Bachelor’s degree in Information Technology, Master’s degree