Associate Director, Data Platform and Solution Engineering

BeOne Medicines

full-time

Location Type: Remote

Location: Remote • 🇺🇸 United States

Salary

💰 $141,600 - $191,600 per year

Job Level

Senior

Tech Stack

Apache • AWS • Azure • Cloud • ETL • Informatica • Python • Spark • SQL • Tableau • Unity

About the role

  • Design and implement robust data architectures using Databricks, ensuring integration with existing systems and scalability for future growth.
  • Establish data management frameworks, optimizing ETL/ELT processes and data models for performance and accuracy.
  • Evaluate and recommend modern architectural patterns, including Lakehouse, Delta Live Tables, Data Mesh, and real-time streaming.
  • Drive rapid Proof-of-Concepts (POCs) to validate new architectural approaches, tools, and design patterns before enterprise rollout.
  • Partner with data engineers, data scientists, and business stakeholders to develop seamless data pipelines that prioritize data integrity and usability.
  • Implement and uphold data governance practices that enhance data accessibility while ensuring compliance with regulations.
  • Integrate external systems, APIs, and cloud-native services to support new data products and analytics use cases.
  • Prototype and test new connectors, ingestion frameworks, and integration patterns to accelerate innovation.
  • Monitor data pipelines and infrastructure performance, troubleshooting issues as they arise and ensuring high availability.
  • Optimize and enhance existing data systems for performance, reliability, and cost-efficiency.
  • Collaborate with data analysts and data scientists to understand data requirements and implement solutions that support data-driven insights and models.
  • Monitor and enhance system performance, employing tools and methodologies to optimize data processing and storage solutions.
  • Optimize compute costs, job orchestration, workflow efficiency, and data storage strategies.
  • Troubleshoot and resolve data-related issues to maintain optimal system functionality.
  • Experiment with new Databricks features (Unity Catalog updates, AI/ML runtimes, Photon, DBRX, Delta Sharing, serverless SQL/compute, etc.) through quick hands-on evaluations.
  • Develop and enforce data governance standards, including data quality, security, and compliance through automation.
  • Conduct fast-turnaround POCs to explore new technical capabilities, libraries, and features across Databricks, Azure, Informatica, Reltio, and other ecosystem tools.
  • Build lightweight demo pipelines, dashboards, and micro-solutions to demonstrate feasibility, guide architectural choices, and influence roadmap decisions.
  • Stay current with emerging technologies, industry trends, and platform advancements; translate insights into actionable recommendations.
  • Collaborate with vendors and internal teams to evaluate beta features, pilot new capabilities, and provide technical feedback for adoption decisions.

Requirements

  • Proven experience (8+ years) in data architecture or a similar role, including extensive experience with Databricks and cloud-based data solutions.
  • 8+ years of experience in solution engineering, platform architecture, or related roles, working in a cross-functional environment.
  • Strong proficiency in Apache Spark, Unity Catalog, Python, SQL, and data processing frameworks.
  • Experience with APIs and with integrating diverse technology systems.
  • Familiarity with modern development frameworks, DevOps methodologies, and CI/CD processes.
  • Experience with data warehousing solutions, delta lakes, and ETL/ELT processes.
  • Familiarity with cloud environments (AWS, Azure) and their respective data services.
  • Solid understanding of data governance, security, and compliance best practices.
  • Excellent communication and interpersonal skills, with an ability to articulate complex technical concepts to diverse audiences.
  • Databricks certifications or hands-on experience with Delta Lake and its cloud architecture is strongly preferred.
  • Familiarity with machine learning, AI frameworks, and data visualization tools (e.g., Tableau, Power BI, Spotfire).
  • A proactive approach to learning and implementing new technologies and frameworks.
  • Experience working with Life Sciences data, including exposure to R&D, Clinical Operations, TechOps, or Manufacturing domains.
  • Understanding of key systems (CTMS, EDC, eTMF, LIMS, MES, PV systems), data models (CDISC, SDTM, ADaM), and typical data challenges (quality, lineage, integration, governance) is highly desirable.

Benefits

  • Medical
  • Dental
  • Vision
  • 401(k)
  • FSA/HSA
  • Life Insurance
  • Paid Time Off
  • Wellness

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
data architecture, Databricks, ETL, ELT, Apache Spark, Python, SQL, data warehousing, DevOps, CI/CD
Soft skills
communication, interpersonal skills, problem-solving, collaboration, proactive learning
Certifications
Databricks certifications