Salary
💰 $220,000 - $320,000 per year
Tech Stack
Kafka, Python, Spark, SQL, TypeScript
About the role
- Prototype, build, and maintain intelligence systems that detect, triage, and enable efficient human review of possible high-severity harms
- Work hand in hand with operators and investigators, designing and delivering systems that enable them to do their work faster, more accurately, and more safely
- Develop across the stack: UIs, services, pipelines, and anything else required
- Interact with partners across Product Policy, Platform Integrity, Safety Systems, and Research
- Contribute to the team’s technical strategy, especially for child-safety-related tools and systems
- Report on impact in a data-driven fashion
- Adapt quickly in ambiguous, fast-moving environments to deliver reliable tooling for high-severity safety work
Requirements
- Strong software engineering foundation and experience owning systems end-to-end
- Experience building and operating large-scale data pipelines or search/retrieval systems
- Proficiency in Python and/or TypeScript
- Familiarity with tools like Spark, Kafka, Flink, data warehouses, and SQL
- Product-minded approach; design with user workflows in mind and iterate quickly
- Comfortable shipping pragmatic solutions in low-support environments and navigating ambiguity
- Experience working at the frontier of AI capabilities, integrating models and APIs
- Prior experience working on engineering for high-severity harms
- Intuition for operations and investigative team workflows and curiosity about solving complex investigations
- Comfortable with exposure to sensitive and egregious content
- Ability to work from OpenAI's US office three days per week (role based in San Francisco, CA)
- Bonus: prior knowledge of child-safety-specific challenges such as secure handling of quarantined content and reporting to NCMEC