Data Engineer II

GSK

full-time

Location Type: Hybrid

Location: San Francisco • California, Maryland, Massachusetts • 🇺🇸 United States

Salary

💰 $116,325 - $193,875 per year

Job Level

Mid-Level, Senior

Tech Stack

Airflow, AWS, Azure, Cloud, Google Cloud Platform, Kafka, Python, Spark

About the role

  • Build modular code, libraries, and services using modern data engineering tools (Python/Spark, Kafka, Storm, …) and orchestration tools (e.g. Google Workflow, Airflow Composer); a minimal illustrative sketch follows this list
  • Produce well-engineered software, including appropriate automated test suites and technical documentation
  • Develop, measure, and monitor key metrics for all tools and services, and consistently iterate on and improve them
  • Apply platform abstractions consistently to maintain quality and consistency in logging and lineage
  • Stay fully versed in coding best practices and ways of working, participate in code reviews, and partner with the team to improve its standards
  • Adhere to the QMS framework and CI/CD best practices
  • Provide L3 support for existing tools, pipelines, and services
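
The sketch below is purely illustrative of the kind of modular, orchestrated pipeline these bullets describe; it is not GSK code. It assumes Airflow 2.4+ with the TaskFlow API (the @dag/@task decorators and the schedule argument), and the DAG, task names, and data are hypothetical placeholders; a Spark job, Kafka consumer, or other service would slot in where the placeholder tasks sit.

    # Illustrative only: a minimal, modular Airflow DAG using the TaskFlow API.
    # Task names and data are hypothetical placeholders, not GSK pipelines.
    from datetime import datetime

    from airflow.decorators import dag, task


    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def example_orders_pipeline():
        @task
        def extract() -> list[dict]:
            # In practice this might read from Kafka or cloud storage;
            # hard-coded rows keep the sketch self-contained.
            return [{"order_id": 1, "amount": 120.0}, {"order_id": 2, "amount": 75.5}]

        @task
        def transform(rows: list[dict]) -> float:
            # Stand-in for a Spark-style transformation step.
            return sum(r["amount"] for r in rows)

        @task
        def load(total: float) -> None:
            # Plain logging keeps the run observable, in the spirit of the
            # logging and lineage expectations above.
            print(f"Daily order total: {total}")

        load(transform(extract()))


    example_orders_pipeline()

In a real deployment, shared logic like this would typically live in a versioned library and be covered by the automated test suites and CI/CD checks mentioned above.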

Requirements

  • Bachelor’s degree in Data Engineering, Computer Science, Software Engineering, or a related discipline
  • 4+ years of data engineering experience
  • Software engineering experience
  • Familiarity with orchestration tooling
  • Cloud experience (GCP, Azure, or AWS)
  • Experience in automated testing and design (see the test sketch after this list)
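
As a rough illustration of the automated-testing expectation, the snippet below shows pytest-style unit tests for a small transformation function; the function, data, and test names are assumptions made for this example, not part of any specific stack.

    # Illustrative only: pytest-style tests for a hypothetical transformation step.
    import pytest


    def total_amount(rows: list[dict]) -> float:
        """Hypothetical pipeline step: sum numeric amounts, skipping malformed rows."""
        return sum(r["amount"] for r in rows if isinstance(r.get("amount"), (int, float)))


    def test_total_amount_sums_valid_rows():
        rows = [{"amount": 120.0}, {"amount": 75.5}]
        assert total_amount(rows) == pytest.approx(195.5)


    def test_total_amount_skips_malformed_rows():
        rows = [{"amount": 10.0}, {"amount": "n/a"}, {}]
        assert total_amount(rows) == pytest.approx(10.0)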

Benefits

  • health care and other insurance benefits (for employee and family)
  • retirement benefits
  • paid holidays
  • vacation
  • paid caregiver/parental and medical leave

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard skills
Python, Spark, Kafka, Storm, Google Workflow, Airflow Composer, automated testing, CI/CD, data engineering, software engineering
Soft skills
collaboration, code review, problem-solving, quality assurance, iteration, documentation