Senior Threat Operations Specialist

2 Years + • Operations • $124,000 PA - $139,500 PA

Job Summary

This critical role supports Discord in understanding and mitigating harmful content, including child sexual abuse material, on its platform. It involves investigating complex threats, advancing investigative capabilities, and developing innovative approaches to prevent harm, especially to minors. The specialist will report to the Minor Safety and Exploitative Content Manager and will be exposed to graphic and disturbing content.
Perks:
  • equity
  • benefits

Job Details

Discord is used by over 200 million people every month for many different reasons, but there’s one thing that nearly everyone does on our platform: play video games. Over 90% of our users play games, spending a combined 1.5 billion hours playing thousands of unique titles on Discord each month. Discord plays a uniquely important role in the future of gaming. We are focused on making it easier and more fun for people to talk and hang out before, during, and after playing games.

This role is critical in helping the company deeply understand and mitigate how harmful content, including child sexual abuse material, manifests on our platform. It also involves investigating complex threats, advancing our investigative capabilities, and developing innovative approaches to prevent harm to our users, particularly minors. This hire will report to the Minor Safety and Exploitative Content Manager.

This role involves exposure to graphic and/or objectionable content, including but not limited to graphic images, videos, and writings; offensive and derogatory language; and material such as child exploitation, graphic violence, self-injury, and animal abuse, which may be considered offensive or disturbing.

What You'll Be Doing

  • Respond to time-sensitive escalations from members of the Safety Reporting network, law enforcement, government bodies, and users, including but not limited to the review of child exploitation, graphic violence, self-injury and suicide, explicit images, videos, and other objectionable and/or disturbing content.
  • Investigate accounts and, where required, create reports for the National Center for Missing & Exploited Children (NCMEC).
  • Proactively identify currently undetected abuse by leveraging internal data, open-source intelligence, trusted partner information, and third-party private intelligence.
  • Identify effective strategies to disrupt abuse at scale, build recommendations, and work collaboratively with other internal teams, including Policy, Product, Engineering and Legal teams to effect those strategies, including leading projects in some instances.
  • Demonstrate operational excellence when evaluating risks, threats, and user privacy in time-critical situations and execute decision-making while analyzing a variety of factors that include imminence of danger, sensitivities, and/or graphic content.
  • Take ownership in responding to incidents, providing deep knowledge of different exploitative content types and sharing insights and expertise about minor safety and exploitative content issues with stakeholder teams.
  • Create, maintain, and develop internal resources around contemporary subject matter expertise, workflows and process updates.
  • Mentor and guide junior team members in the execution of their duties, including delivering internal training.
  • Work early morning and occasional weekend/holiday shifts to support our global operations.

What You Should Have

  • Minimum 2 years of specialized experience investigating crimes against children through intelligence agencies, law enforcement, NGOs, or Trust and Safety teams.
  • Minimum 2 years of generalized experience in investigations or content moderation.
  • Current expertise in global online safety landscapes, including familiarity with child sexual exploitation trends such as sextortion, the facilitation of CSAM sale and distribution, and evolving criminal methodologies targeting minors.
  • Proven experience reporting child safety cases to NGOs, law enforcement agencies, or other relevant authorities, with understanding of proper protocols and considerations.
  • Outstanding communication abilities to articulate complex technical concepts, case findings, and risk assessments clearly to diverse stakeholders through written reports, presentations, and briefings.
  • Self-directed work style with proven ability to maintain high performance standards and adapt quickly to changing priorities in remote, deadline-driven environments.

Bonus Points

  • Multilingual capabilities with native or near-native proficiency in a second language, enabling investigation and reporting of international cases and cross-border collaboration.
  • Previous threat intelligence experience involving minor safety and the prevention of child sexual abuse
  • Education or equivalent professional experience in Law, Intelligence Studies, Cybersecurity, Criminal Justice, Criminology, or related disciplines that enhance investigative methodology and legal understanding.
  • Technical proficiency in data analysis tools including SQL, Python, or other programming languages for extracting insights from large datasets, identifying patterns, and supporting evidence-based investigations.


About The Company

Discord is a voice, video and text platform that helps friends come together to hang out, play games and have fun. When Jason Citron and Stanislav Vishnevskiy founded Discord in 2015 they had a hunch that multiplayer gaming would be the future of entertainment and that people would need a communications platform designed for them to talk with their gaming friends. Today, gaming has become the largest form of entertainment in the world, bigger than movies and music combined. It’s the fastest-growing as well. Discord makes it feel like you’re playing in the same room with friends.

San Francisco, California, United States (On-Site)

