Tech Stack
Cloud, ElasticSearch, Grafana, Linux, Postgres, Python, Remote Sensing, Unix
About the role
- Organizes the existing image quality, product quality and payload performance metrics and datasets by combining different data sources
- Surfaces gaps and redundancies in the existing data and works with the teams to propose and implement mitigation strategies. This involves assessing statistical independence and general correlations between different measurements (see the sketch after this list)
- Develops, validates and automates large-scale image quality, product quality and payload performance metrics
- Supports the imaging operations of our medium resolution monitoring fleet
- Collaborates with image quality support, collection planning, space operations, data pipeline and product teams
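A minimal sketch of the redundancy assessment mentioned above, assuming a hypothetical data layout (file names and columns such as image_quality_metrics.csv, sharpness and snr are illustrative only, not taken from the posting):

```python
# Sketch: check whether two quality metrics from different sources carry
# overlapping information by joining them on a shared key and measuring
# their correlation. All file and column names below are hypothetical.
import pandas as pd
from scipy import stats

image_quality = pd.read_csv("image_quality_metrics.csv")       # columns: image_id, sharpness
payload_perf = pd.read_csv("payload_performance_metrics.csv")  # columns: image_id, snr

merged = image_quality.merge(payload_perf, on="image_id", how="inner")

# Pearson captures linear correlation; Spearman is rank-based and more
# robust to outliers and nonlinear but monotonic relationships.
pearson_r, pearson_p = stats.pearsonr(merged["sharpness"], merged["snr"])
spearman_r, spearman_p = stats.spearmanr(merged["sharpness"], merged["snr"])

print(f"Pearson r={pearson_r:.3f} (p={pearson_p:.3g})")
print(f"Spearman rho={spearman_r:.3f} (p={spearman_p:.3g})")

# A strong correlation flags a candidate redundancy between the two
# measurements; a weak one suggests they are closer to statistically
# independent and both worth keeping.
```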
Requirements
- 0-1 years of experience in solving problems by exploiting large cloud-based datasets
- Practical experience working with large datasets and databases using tools such as Kibana, Elasticsearch, Grafana, Looker Studio, OpenSearch and PostgreSQL
- Practical experience leveraging scientific Python in a Unix/Linux environment (see the sketch after this list)
- Great interpersonal skills and ability to collaborate effectively
- Bachelor’s degree in Physics, Visual Computing, Computer Science or other relevant scientific/engineering field
- Skilled in Statistics, Linear Algebra and Physics (desired)
- Experience with image processing methods and quality assessment of image data (desired)
- Relevant experience in computational photography and scientific imaging (desired)
- Familiarity with source code management tools like Git (desired)
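As a hedged illustration of the scientific-Python-plus-PostgreSQL workflow named in the requirements above, here is a small sketch; the connection string, table and column names are hypothetical:

```python
# Sketch: pull recent metric records from PostgreSQL into pandas for
# exploration on a Unix/Linux host. Connection details and schema are
# illustrative; in practice credentials come from the environment or a
# secrets manager rather than being hard-coded.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://user:password@db-host:5432/metrics")

query = """
    SELECT image_id, acquired_at, sharpness, snr
    FROM image_quality_metrics
    WHERE acquired_at >= NOW() - INTERVAL '30 days'
"""

df = pd.read_sql(query, engine, parse_dates=["acquired_at"])

# Daily summary statistics as a starting point for dashboards
# (e.g. Grafana or Looker Studio panels fed from the same tables).
daily = df.set_index("acquired_at").resample("1D")[["sharpness", "snr"]].mean()
print(daily.describe())
```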