Own and support Planet’s Pelican and SkySat ground processing pipeline from downlink to customer workflows
Drive continuous improvements in the quality and latency of Planet's high-resolution data products and pipeline processes
Work cross-functionally across Space Systems and with Data Quality Experts to optimize image algorithm implementations in Planet’s Python/Linux-based production environment
Support monitoring of scientific fidelity by implementing and reviewing data quality metrics
Troubleshoot data quality issues and implement software fixes to resolve them
Participate in on-call rotation to ensure operational excellence across pipelines
Build and optimize algorithms in the low-latency Pelican Pipeline imagery processing system
Meet data quality and latency targets that support customers and operate at scale (millions of images, terabytes of data)
Requirements
Bachelor’s degree in Computer Science, Aerospace Engineering, or a similar field
Experience building production-grade services with modern Python
Experience with Ray or another Python-based distributed computing framework such as Dask, Flink, or PySpark
Exposure to containerization and orchestration technologies such as Docker and Kubernetes
Experience building low-latency data processing chains
Ability to mentor team members and conduct code and test reviews
Ability to collaborate cross-functionally with Product, Engineering Management, and engineering teams on system design and roadmapping for scalable and robust solutions
Experience using Jira for task management and progress tracking
Exposure to geospatial raster data processing with tools such as GDAL and NumPy (preferred)
Exposure to parallelization techniques such as threading, multi-processing, and distributed workloads (preferred)
Experience building and deploying Kubernetes based services (preferred)
Experience with CUDA-based GPU programs (preferred)