QA Specialist - Minor Safety and Exploitative Content

2+ Years Experience • $108,000 - $121,500 PA

Job Summary

As a QA Specialist for Minor Safety and Exploitative Content (MSEC) at Discord, you will ensure the accuracy and quality of moderation decisions related to minor safety and exploitative content. You will review moderation decisions, identify trends, provide feedback, and lead calibration sessions. This role involves collaborating with moderators, analysts, and policy teams to improve content review processes and uphold community standards. You will regularly report on quality trends and metrics, contributing to process improvements and policy updates.
Must have:
  • 2+ years of experience in QA, trust & safety, or content moderation
  • Deep understanding of minor safety and exploitative content issues
  • Excellent analytical skills and ability to translate data into insights
  • Strong communication skills for conveying findings and training teams
  • Familiarity with moderation tools and metrics-driven performance tracking
Good to have:
  • Experience working on global teams or in environments with cultural sensitivity
  • Experience with data analytics tools and languages like SQL
  • Proficiency in multiple languages to support international moderation efforts
  • Demonstrated success in driving cross-functional initiatives or policy changes
  • Experience working with machine learning systems, automation tools, and LLM/AI technologies
Perks:
  • Equity
  • Benefits

Job Details

Discord is used by over 200 million people every month for many different reasons, but there’s one thing that nearly everyone does on our platform: play video games. Over 90% of our users play games, spending a combined 1.5 billion hours playing thousands of unique titles on Discord each month. Discord plays a uniquely important role in the future of gaming. We are focused on making it easier and more fun for people to talk and hang out before, during, and after playing games.

We are looking for a detail-oriented professional with a strong passion for safeguarding vulnerable groups and combating exploitative content online. As a QA Specialist for Minor Safety and Exploitative Content (MSEC) at Discord, you will play a pivotal role in ensuring the accuracy, consistency, and quality of moderation decisions that uphold our community standards. This role reports to the Team Lead for Trust & Safety QA and partners closely with the MSEC team. Your approach to quality assurance is rooted in empathy, precision, and a commitment to continuous improvement.

What You'll Be Doing

  • Review and audit moderation decisions related to minor safety and exploitative content to ensure adherence to Discord’s Trust & Safety policies.
  • Collaborate with moderators, analysts, and policy teams to identify trends, gaps, and inconsistencies in content review processes.
  • Provide constructive feedback and actionable insights to moderators to improve decision-making accuracy and maintain policy alignment.
  • Develop and lead calibration sessions for the moderation team based on audit findings and evolving content standards.
  • Partner with MSEC and other cross-functional teams to influence policy updates and improve internal tools and workflows for greater efficiency and scalability.
  • Regularly report on quality trends and metrics, highlighting risks, successes, and opportunities for process improvements.

What You Should Have

  • 2+ years of experience in quality assurance, trust & safety, or content moderation, preferably in a tech or online platform environment.
  • Deep understanding of issues related to minor safety, exploitative content, and global online safety trends.
  • Excellent analytical skills with the ability to synthesize large datasets and translate them into actionable insights.
  • Strong communication skills, both written and verbal, to effectively convey findings and train teams.
  • Familiarity with moderation tools, audit processes, and metrics-driven performance tracking.
  • A calm, resilient demeanor when handling sensitive or potentially distressing content.
  • Ability to flex your expertise to support other QA initiatives, including automation and machine learning, violent and hateful content, cybercrime, and other exploitative content.

Bonus Points

  • Experience working on global teams or in environments that require cultural sensitivity and awareness.
  • Experience with data analytics tools and languages like SQL.
  • Proficiency in multiple languages to support international moderation efforts.
  • Demonstrated success in driving cross-functional initiatives or policy changes in a Trust & Safety context.
  • Experience working with machine learning systems, automation tools, and LLM/AI technologies.

Requirements

  • This role requires regular interfacing with potentially traumatic material, including CSAM and other forms of exploitative, hateful, violent, or shocking content.
  • This role's hours are Monday-Friday, 9:00 AM to 5:00 PM Pacific Standard Time, with occasional flexibility required to accommodate our global partners.

 

#LI-Remote

The US base salary range for this full-time position is $108,000 to $121,500 + equity + benefits. Our salary ranges are determined by role and level. Within the range, individual pay is determined by additional factors, including job-related skills, experience, and relevant education or training. Please note that the compensation details listed in US role postings reflect the base salary only and do not include equity or benefits.

Why Discord? 

Discord plays a uniquely important role in the future of gaming. We're a multiplatform, multigenerational and multiplayer platform that helps people deepen their friendships around games and shared interests. We believe games give us a way to have fun with our favorite people, whether listening to music together or grinding in competitive matches for diamond rank. Join us in our mission! Your future is just a click away!

Please see our Applicant and Candidate Privacy Policy for details regarding Discord’s collection and usage of personal information relating to the application and recruitment process.


About The Company

Founded in 2015, Discord is a voice, video and text app that helps friends and communities come together to hang out and explore their interests - from artists and activists, to study groups, sneakerheads, plant parents, and more. With 150 million monthly users across 19 million active communities, called servers, Discord has grown to become one of the most popular communications services in the world. Discord was built without selling ads or user data and instead, offers a premium subscription called Nitro that gives users special perks like higher quality streams and fun customizations.


And we're hiring! If this strikes a chord with you, come build belonging with us: see https://discordapp.com/jobs for openings.
