Modulate raises $30 million to detox game voice chat with AI




Modulate has raised $30 million to develop ToxMod, an AI product that uses machine learning to analyze voice chat and flag toxic players in online games.

ToxMod uses artificial intelligence to highlight issues that human moderators should pay attention to when players chat with each other in online games. It’s a problem that will only get worse with the metaverse, the universe of interconnected virtual worlds depicted in novels like Snow Crash and Ready Player One. The company raised the round on the strength of large clients such as Rec Room and PokerStars VR, which rely on ToxMod to help their community managers find the biggest toxicity issues.

“It’s a problem that everyone in the industry desperately needs to solve,” Modulate CEO Mike Pappas said in an interview with GamesBeat. “This is a large-scale market need, and we were waiting to prove that we had actually designed the product to meet it.”

Lakestar led the round with participation from existing investors Everblue Management, Hyperplane Ventures and others. Additionally, Mika Salmi, Managing Partner of Lakestar, will join Modulate’s Board of Directors.

Modulate’s ToxMod is a proactive voice moderation system designed to capture not only overt toxicity (hate speech, adult language), but also more insidious harms like child grooming, violent radicalization and self-harm. The system’s AI was trained on over 10 million hours of audio.

Cambridge, Mass.-based Modulate wants to change the way game developers take on the never-ending fight against online toxicity, Pappas said. He said the funding is a validation of the importance of the company’s mission.

“The core business is proactive voice moderation,” Pappas said. “Rather than just relying on player reports, it means you can actually do that duty of care and identify all the bad behavior on your platform and really do something about it in a more comprehensive way.”

ToxMod uses sophisticated machine learning models to go beyond transcription and understand not only what each player is saying, but also how they are saying it, including their emotion, volume, and prosody. This is crucial because speech that is harmful in one context may be friendly or genuinely encouraging in another.
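
Modulate hasn’t published how ToxMod’s models work internally, but a minimal sketch can illustrate the general idea of pairing a transcript with delivery cues. The sketch below uses the open source librosa library to pull coarse pitch and loudness statistics from a clip and then applies a toy triage rule; the lexicon, thresholds, and function names are all hypothetical, not Modulate’s.

```python
# Illustrative only: pairing WHAT was said with HOW it was said.
# The lexicon, thresholds, and labels are made up for this sketch.
import librosa
import numpy as np

def prosody_features(wav_path: str) -> dict:
    """Extract coarse pitch and loudness statistics from a voice clip."""
    y, sr = librosa.load(wav_path, sr=16000)
    # pyin returns NaN for unvoiced frames, so use NaN-aware statistics.
    f0, _, _ = librosa.pyin(y, fmin=65, fmax=400, sr=sr)
    rms = librosa.feature.rms(y=y)[0]  # root-mean-square energy per frame
    return {
        "median_pitch_hz": float(np.nanmedian(f0)),
        "mean_loudness": float(rms.mean()),
        "peak_loudness": float(rms.max()),
    }

def triage(transcript: str, feats: dict) -> str:
    """Toy rule: identical words score differently depending on delivery."""
    hostile_terms = {"trash", "uninstall"}  # placeholder lexicon
    hostile_words = any(t in transcript.lower() for t in hostile_terms)
    shouted = feats["peak_loudness"] > 3 * feats["mean_loudness"]
    if hostile_words and shouted:
        return "escalate to human moderator"
    if hostile_words:
        return "log for context (could be friendly trash talk)"
    return "no action"
```

A production system would replace the word list and threshold with learned classifiers, but the split between lexical content and prosodic delivery is the part the sketch is meant to show.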

Modulate says ToxMod uses its nuanced understanding of voice to differentiate between these types of situations, identifying the worst offenders while leaving players free to enjoy each game in their own way. With this sophistication, ToxMod can detect violations with greater than 98% accuracy (which improves over time) and lets moderation teams respond to incidents more than 25 times faster.

“I first saw the company about a year and a half ago. We saw it as a team with best-in-class technology. And that’s what we’re investing in,” Salmi said in an interview. “When I saw it, I couldn’t believe what I saw.”

The big question was whether Modulate could commercialize the technology. It did, Salmi said. And Pappas said the company has a number of large, unannounced customers using it.

“Obviously no one else has it. We have been looking for this kind of technology for a long time and nothing has come close,” Salmi added.

Many companies are dealing with huge volumes of toxicity reports. Dennis Fong, CEO of GGWP, which uses AI to analyze text chat, has reported that human moderators at these companies can only sift through a tiny percentage of those reports. GGWP focuses on different issues than Modulate, and it also seeks to establish reputation scores for players that can help gauge their behavior over a long period of time.

By using this kind of long-term approach, companies can treat players who are only occasionally toxic differently from those who engage in toxic behavior much more frequently. These so-called reputation scores can travel with players.

“For us, the immediate issue that we’re really trying to focus on is how to shed some light on what’s going on in the first place,” Pappas said. “We start by understanding the landscape: how toxicity emerges, where it occurs, and how players act. From there, we work closely with our clients to design education campaigns.”

If players are being punished, they need to understand why. If toxicity occurs amid allegations of cheating, it’s important to know. Modulate is also thinking about how to preserve the sanity of moderators who have to deal with all the abuse.

When it comes to the metaverse, it makes sense for game companies to fix these issues in the narrower context of their own games before trying to tackle them across everyone else’s interconnected apps.

The Modulate team in Cambridge, Massachusetts.

Where existing voice moderation tools focus only on the 8% of players who submit reports, ToxMod offers proactive moderation that allows platform and game moderators to make informed decisions to protect players from harassment, toxic behavior, and even more insidious harms. Modulate has helped customers deal with thousands of online toxicity cases.

Pappas said the company makes sure it doesn’t misclassify things like foul language, which can be acceptable in a mature-rated game like Call of Duty, as opposed to racial slurs, which are not. The idea is to make moderators more effective across the platform. Pappas said the detection model is trained over time and keeps getting better.

Human moderators can sift through results and identify false positives, and the system can learn from them.

“They can start taking immediate action,” Pappas said. “Sometimes it may misunderstand the conversation because human language is complicated. And each game has different standards.”
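
The article doesn’t detail how that feedback is wired in, but a common pattern, sketched below under that assumption, is to treat each moderator verdict on a flagged clip as a labeled example for the next training run. Every type and field name here is hypothetical.

```python
# Hypothetical human-in-the-loop sketch: moderator verdicts on flagged clips
# become labeled data for retraining, and the false-positive rate is tracked.
from dataclasses import dataclass, field

@dataclass
class Review:
    clip_id: str
    model_score: float    # model's toxicity confidence, 0.0 to 1.0
    actually_toxic: bool  # moderator's verdict; False means false positive

@dataclass
class FeedbackLoop:
    threshold: float = 0.8  # clips scoring above this were escalated
    reviews: list = field(default_factory=list)

    def record(self, review: Review) -> None:
        self.reviews.append(review)

    def false_positive_rate(self) -> float:
        """Share of escalated clips that moderators overturned."""
        flagged = [r for r in self.reviews if r.model_score >= self.threshold]
        if not flagged:
            return 0.0
        return sum(not r.actually_toxic for r in flagged) / len(flagged)

    def training_batch(self) -> list:
        """Moderator-labeled clips to feed into the next fine-tuning run."""
        return [(r.clip_id, r.actually_toxic) for r in self.reviews]
```

A real pipeline would also weight recent verdicts and per-game standards, since, as Pappas notes, each game has different norms.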

Words like “p0wned” must be considered in context to determine whether they are being used aggressively. Pappas said that for voice moderation, you can’t rely on basic transcription, which converts spoken words to text, because it doesn’t capture things like pitch or whether someone is shouting.

“No other company has created this type of dataset specifically designed to focus on real online social voice chats,” Pappas said. “This has allowed us to improve the accuracy of our models, which outperform the transcription models of the big public companies by a pretty healthy margin.”

Because Modulate focused on the fundamentals of running a strong business, Pappas said, it has good relationships with VCs and found it easier to raise money at a time when fundraising is hard, even for game companies. Salmi said it’s true that VCs are becoming more demanding and investments take longer, which is why he’s happy to have found a company like Modulate.

The company reached its milestones with just 27 people, a testament to the power of AI.


