Senior Data Engineer

StockX

full-time

Location Type: Remote

Location: United States

Salary

💰 $140,000 - $160,000 per year

About the role

  • Design and build mission-critical data pipelines on a highly scalable distributed architecture, including data ingestion (streaming, events, and batch), data integration, and data curation
  • Help continually improve ongoing reporting and analysis processes, simplifying self-service support for business stakeholders
  • Build and support reusable frameworks to ingest, integrate, and provision data
  • Automate end-to-end data pipelines with metadata, data quality checks, and audits
  • Build and support a big data platform on the cloud
  • Define and implement automation of jobs and testing
  • Optimize the data pipeline to support ML workloads and use cases
  • Support mission-critical applications and near-real-time data needs from the data platform
  • Capture and publish metadata and new data to subscribed users
  • Work collaboratively with business analysts, product managers, and data scientists as well as business partners, and actively participate in design thinking sessions
  • Participate in design and code reviews
  • Motivate, coach, and serve as a role model and mentor for other development team members who leverage the platform

Requirements

  • 3 to 5 years' experience in data warehouse / data lakehouse technical architecture
  • 3+ years of experience with programming languages (Python, Scala, Java, or C#)
  • Minimum of 3 years with Big Data and Big Data tools in one or more of the following: batch processing (e.g., Hadoop distributions, Spark), real-time processing (e.g., Kafka, Flink, Spark Streaming)
  • Minimum of 2 years' experience with AWS or engineering in other cloud environments
  • Strong knowledge of Databricks (SQL/Scala) data engineering pipelines
  • Experience with Database Architecture/Schema design
  • Strong familiarity with batch processing and workflow tools such as dbt, Airflow, and NiFi
  • Ability to work independently with business partners and management to understand their needs and exceed expectations in delivering tools/solutions
  • Strong interpersonal, verbal, and written communication skills, and the ability to present complex technical/analytical concepts to an executive audience
  • Strong business mindset with customer obsession; ability to collaborate with business partners to identify needs and opportunities for improved data management and delivery
  • Experience providing technical leadership and mentoring other engineers on data engineering best practices
  • Bachelor's degree in Computer Science, or a related technical field.

Benefits

  • medical
  • dental
  • equity
  • discretionary bonuses

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools

data pipeline design, data ingestion, data integration, data curation, big data architecture, Python, Scala, Java, C#, Databricks SQL

Soft Skills

interpersonal skills, verbal communication, written communication, collaboration, mentoring, coaching, problem-solving, customer obsession, leadership, design thinking

Certifications

Bachelor's degree in Computer Science