In 2019, NBC News identified four hateful profile accounts on Roblox, one with clear anti-Semitic language, another with a model of a Nazi-era uniform, and two others with Proud Boys-related profiles.
On June 20, 2023, Roblox announced it would enable eligible creators to build experiences featuring mature content for users aged 17 and up. These experiences can include mature themes and storylines that may contain violence, blood, or crude humor.
According to a 2023 New York University survey, more than 50% of gamers in five of the world’s top video game markets have seen some form of extremist language while playing multiplayer games in the past year. The proliferation of white nationalist and white supremacist ideologies is well-known in the gaming industry.
Modulate, the creator of ToxMod, an artificial intelligence (AI) voice technology that uses machine learning to scan voice chat and identify toxic players in online games, has launched a new Violent Radicalization detection category in its voice chat moderation software.
The new Violent Radicalization detection category is the result of extensive research and collaboration with the Anti-Defamation League (ADL).
The company hopes the new category will address critical concerns within the gaming community by identifying and flagging behaviors including the promotion of extremism, recruitment, targeted grooming, and the planning of violence.
According to the ADL study, Hate Is No Game: Hate and Harassment in Online Games 2022, the percentage of adult gamers who encountered someone spouting white supremacist ideology in online games went up from 8% in 2021 to 20% in 2022.
The new detection category makes ToxMod the gaming industry’s only voice moderation solution that identifies individuals promoting white supremacist and white nationalist radicalization and extremism in real time, allowing community moderators to take immediate action.
Mike Pappas, Chief Executive Officer and Co-founder at Modulate, said there’s no evidence that online games radicalize anyone. “In fact, extremists typically target lonely people to radicalize, and there are tons of studies, especially from the pandemic, showing that online games are one of our most powerful tools to help lonely individuals find their communities – and thus decrease the overall prevalence of extremists.”
“There is, however, a growing contingent of already-radicalized individuals taking to the internet to spread their perspective,” said Pappas in an email interview. “And, these individuals frequently use existing platforms like social media and online games to attempt to find their lonely or disaffected ‘targets.’”
Pappas says that until now, exposure to extremist ideologies has been measured only through player reports or surveys.
“Like any self-reporting measure, that provides a useful but incomplete picture,” said Pappas. “In recent years, technical innovations have allowed tools like Modulate’s ToxMod to emerge, which can scan across the whole ecosystem in a privacy-sensitive manner, providing a more accurate picture of the prevalence and impact of these extremists.”
Protective voice moderation
Pappas says that many of the most harmful behaviors – radicalization, grooming, etc. – are perpetrated against vulnerable groups who may not be able or willing to report bad behavior via player reports. “A tool like ToxMod is essential to protect them.”
“The bonus factor is that there’s empirical evidence that strong safety tools pay for themselves – we’ve found ToxMod lifts platforms’ new user retention, for instance, by as much as 10-20%, and other researchers have found that non-toxic games see much more money spent on in-game purchases than toxic titles,” added Pappas.
“Every gaming company – even the ones people are skeptical of – contains a huge number of passionate people who really deeply and personally care about the safety and experience of their users,” said Pappas. “Some studios do a better job empowering these people than others, but that’s why we’ve invested so much in demonstrating the monetary value of this work as well – that gives the platforms the additional ammunition they need to get internal buy-in and move these kinds of initiatives forward.”
Pappas adds that real momentum has been building since ADL’s Hate Is No Game report came out, reinforced by subsequent pressure from regulators.
“But it’s also clear to us that developers are generally interested in going beyond the ‘bare minimum’ to satisfy that regulatory pressure – they are taking advantage of this momentum to go further and implement robust, large-scale systems that really incorporate safety fundamentally into their design,” said Pappas.
“At the end of the day, the measurement that matters most to me is consumer sentiment,” said Pappas. “Talk to virtually anyone, and they’ll have a story about how they got harassed online or heard someone being harassed or experienced some form of egregious toxicity, and many have stories that are even more horrifying than severe harassment in and of itself.”
“When that stops being the first association people have with online games (and online spaces in general); when people of all backgrounds feel comfortable and welcome participating in these online spaces – that’s when we’ll get to consider our work a success,” said Pappas.
Modulate has raised $30 million to date.