Salary
💰 $80,000 - $85,000 per year
About the role
- Review trust & safety flags raised by players and determine appropriate responses to harmful behavior
- Apply community guidelines and enforcement protocols consistently
- Document findings clearly and consistently
- Collaborate with the product development team to refine algorithms that flag behavior
- Update and maintain outward-facing community guidelines and internal moderation guidelines
- Create and share reports on overall player behavior with the development team
- Demonstrate support for journalistic independence and a strong commitment to the mission
Requirements
- 2+ years working in content moderation, risk assessment or digital trust & safety
- Previous experience working with sensitive and potentially graphic content
- Familiarity with online gaming culture, platforms, and moderation challenges
- Proficiency with SaaS-based collaboration tools such as Amazon Connect and JIRA
- Familiarity with machine learning principles
Benefits
- Health insurance
- 401(k)
- Flexible work arrangements
- Professional development
Applicant Tracking System Keywords
Tip: use these terms in your resume and cover letter to boost ATS matches.
Hard skills
content moderation, risk assessment, digital trust & safety, machine learning principles
Soft skills
collaboration, documentation, support for journalistic independence, commitment to mission