Threat Operations Specialist, Child Safety

Job Description

Discord is seeking a Threat Operations Specialist for Child Safety to mitigate harmful content, including child sexual abuse material, on its platform. This role involves investigating complex threats, enhancing investigative capabilities, and developing innovative prevention strategies, particularly for minors. The specialist will report to the Threat Operations Manager and work remotely in Australia. Responsibilities include reviewing sensitive content reports, investigating abuse patterns, preparing reports for internal and external audiences, and collaborating with teams to share expertise on minor safety and exploitative content. The role requires evaluating risks and user privacy in time-critical situations and responding empathetically to users facing safety issues.

Discord is used by over 200 million people every month for many different reasons, but there’s one thing that nearly everyone does on our platform: play video games. Over 90% of our users play games, spending a combined 1.5 billion hours playing thousands of unique titles on Discord each month. Discord plays a uniquely important role in the future of gaming. We are focused on making it easier and more fun for people to talk and hang out before, during, and after playing games.

This is an international position based in Australia, employed through an international PEO (professional employer organization).

This role is critical in helping the company deeply understand and mitigate how harmful content, including child sexual abuse material, manifests on our platform, as well as in investigating complex threats, advancing our investigative capabilities, and developing innovative approaches to prevent harm to our users, particularly minors. This hire will report to the Threat Operations Manager and will work remotely in Australia.

This role involves exposure to graphic and/or objectionable content including but not limited to graphic images, videos, and writings; offensive and derogatory language; and other potentially objectionable material, e.g., child exploitation, graphic violence, self-injury, animal abuse, and other content which may be considered offensive or disturbing.

What you’ll be doing

  • Review and respond to sensitive content reports in our queues, including but not limited to the review of child exploitation, graphic violence, self-injury and suicide, explicit images, videos, and other objectionable and/or disturbing content.
  • Ability to work 1-2 hours on Saturdays, with flexibility to pick up non-standard shifts such as early morning, evening, weekend, or holiday shifts to support our global operations.
  • Investigate complex cases to develop a detailed understanding of how abuse is occurring and attribute it to the person(s) and/or networks responsible, in order to prepare high-quality written reports for both internal and external audiences, including law enforcement agencies.
  • Demonstrate operational excellence when evaluating risks, threats, and user privacy in time-critical situations, and make decisions while weighing a variety of factors, including imminence of danger, sensitivities, and/or graphic content.
  • Work collaboratively in responding to sensitive issues, providing deep knowledge of different exploitative content types and sharing insights and expertise about minor safety and exploitative content issues.
  • Proactively identify currently undetected abuse by leveraging internal data, open-source intelligence, trusted partner information, and third-party private intelligence.
  • Respond to users experiencing safety-related or high-harm issues and empathetically address their concerns.

What you should have

  • 1-2 years of experience in Trust and Safety moderation, including experience with child safety content review and removal
  • Demonstrated ability to operate in a high-tempo, sensitive environment while meeting specific SLAs.
  • Exceptional communication skills with an ability to communicate complex information, concepts, or ideas in a confident and well-organized manner through verbal, written, and/or visual means.

Bonus points

  • Proficiency with a second language (preferably Korean or Japanese)
  • Tertiary qualifications or equivalent experience in Intelligence Studies, Cybersecurity, Criminal Justice, Criminology or related field.
  • Experience using SQL, Python, or another programming language for data manipulation (a brief, hypothetical sketch follows this list).
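
As a concrete illustration of the data manipulation mentioned in the last bullet, below is a minimal Python sketch using pandas. The file name, column names, and grouping are hypothetical examples for this posting, not Discord's actual tooling or data.

    # Hypothetical sketch: summarizing a queue of content reports with pandas.
    # The CSV file and its columns ("created_at", "resolved_at", "category")
    # are invented for illustration only.
    import pandas as pd

    reports = pd.read_csv("reports.csv", parse_dates=["created_at", "resolved_at"])

    # Time to resolution for each report, in hours.
    reports["hours_to_resolve"] = (
        reports["resolved_at"] - reports["created_at"]
    ).dt.total_seconds() / 3600

    # Report volume and median resolution time per category, e.g. to check
    # whether a high-severity queue is meeting its SLA.
    summary = (
        reports.groupby("category")["hours_to_resolve"]
        .agg(["count", "median"])
        .sort_values("median")
    )
    print(summary)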

Why Discord? 

Discord plays a uniquely important role in the future of gaming. We're a multiplatform, multigenerational, and multiplayer platform that helps people deepen their friendships around games and shared interests. We believe games give us a way to have fun with our favorite people, whether listening to music together or grinding in competitive matches for diamond rank. Join us in our mission!

Please see our Applicant and Candidate Privacy Policy for details regarding Discord's collection and usage of personal information relating to the application and recruitment process.
