LeoLabs

Data Engineer

full-time

Location Type: Remote

Location: United States

About the role

  • Play a key role in building and operating data pipelines and analytics infrastructure
  • Work closely with software engineers, radar and catalog teams, and data scientists
  • Ensure reliable extraction, transformation, and loading (ETL) of mission-critical datasets
  • Develop scalable batch and streaming data workflows
  • Enable advanced analytics and support machine learning initiatives
  • Help transform large volumes of sensor and orbital data into actionable intelligence
  • Engage in hands-on development with opportunities to grow into increased ownership of data platform design and optimization

Requirements

  • B.S. or M.S. in Computer Science, Data Science, Engineering, Mathematics, Physics, or equivalent experience
  • 0-2 years of experience in data engineering, software engineering, analytics engineering, or related technical roles
  • Experience designing and building data pipelines or ETL/ELT workflows
  • Hands-on experience with Databricks, Apache Spark, or distributed data processing frameworks
  • Proficiency in Python and SQL for data transformation and analysis
  • Familiarity with data modeling concepts and modern data lake or warehouse architectures
  • Experience working in cloud-native environments (AWS preferred)
  • Understanding of software development best practices including version control, testing, and CI/CD
  • Strong analytical mindset and ability to troubleshoot complex data issues
  • Effective communication skills and ability to collaborate across distributed engineering teams
  • Ability to participate in operational support rotations during critical incidents
  • Experience supporting data science or machine learning workflows, including feature engineering pipelines
  • Familiarity with Delta Lake, Lakehouse architectures, or large-scale telemetry data processing
  • Exposure to streaming data systems such as Kafka or Spark Structured Streaming
  • Experience with workflow orchestration tools such as Airflow or Databricks Workflows
  • Background in orbital mechanics, aerospace, physics, or applied mathematics
  • Experience building analytics datasets or semantic models for BI tools
  • Active U.S. security clearance or ability to obtain one

Benefits

  • Global workforce: flexible remote/hybrid opportunities
  • Work on complex, meaningful missions with real-world impact
  • Unlimited paid time off for most roles
  • Competitive salary and equity packages
  • Comprehensive health, dental, and vision coverage
  • Access to the forefront of commercial space operations and defense innovation

Applicant Tracking System Keywords

Tip: use these terms in your resume and cover letter to boost ATS matches.

Hard Skills & Tools
data engineering, ETL, data pipelines, Python, SQL, Databricks, Apache Spark, data modeling, cloud-native environments, feature engineering
Soft Skills
analytical mindset, troubleshooting, effective communication, collaboration, ownership, problem-solving, adaptability, teamwork, operational support, initiative
Certifications
B.S. in Computer Science, M.S. in Data Science, active U.S. security clearance