Call of Duty will use AI to moderate voice chats

September 01, 2023 · Activision, known for Call of Duty, teams up with Modulate to introduce AI-driven voice chat moderation using ToxMod. This real-time system targets hate speech, discrimination, and harassment, a significant move for online gaming communities.

Activision, the publisher of the popular first-person shooter game Call of Duty, has partnered with a company called Modulate to bring AI-powered voice chat moderation to its titles. The new moderation system, using an AI technology called ToxMod, will work to identify behaviors like hate speech, discrimination, and harassment in real time.

The initial beta rollout of ToxMod begins today in North America, within Call of Duty: Modern Warfare II and Call of Duty: Warzone. A full worldwide rollout will follow on November 10th, alongside the launch of Call of Duty: Modern Warfare III.

Modulate's website notes that the tool "triages voice chat to flag bad behavior, analyzes the nuances of each conversation to determine toxicity, and enables moderators to quickly respond to each incident by supplying relevant and accurate context." The company's CEO said in a recent interview that the tool aims to go beyond mere transcription, taking factors like a player's emotion and volume into account to differentiate harmful statements from playful ones.
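The triage-then-analyze-then-report flow described above can be sketched roughly as follows. This is a hypothetical illustration, not Modulate's actual API or model: the flag terms, the volume and sentiment heuristics, and every function name here are invented stand-ins for what would in practice be trained classifiers.

```python
from dataclasses import dataclass

# Hypothetical moderation pipeline (illustration only, not ToxMod's real logic):
# stage 1 cheaply flags utterances, stage 2 scores flagged ones using
# conversational context (volume, tone), and stage 3 files a report for
# HUMAN review -- the system never punishes a player on its own.

FLAG_TERMS = {"slur", "threat"}  # stand-in for a real toxicity classifier

@dataclass
class Utterance:
    player: str
    text: str
    volume_db: float   # louder speech may signal aggression
    sentiment: float   # -1.0 hostile .. +1.0 playful

def triage(u: Utterance) -> bool:
    """Cheap first pass: does this utterance need a closer look?"""
    return any(term in u.text.lower() for term in FLAG_TERMS)

def toxicity_score(u: Utterance) -> float:
    """Second pass: weigh the flag against conversational context."""
    score = 0.5
    if u.volume_db > 70:                    # shouting raises suspicion
        score += 0.2
    score += max(0.0, -u.sentiment) * 0.3   # hostile tone raises it further
    return min(score, 1.0)

def build_report(u: Utterance, score: float) -> dict:
    """Package context for a human moderator; no automatic action is taken."""
    return {"player": u.player, "clip": u.text, "score": round(score, 2),
            "action": "escalate_to_human" if score >= 0.7 else "log_only"}

reports = []
for u in [Utterance("p1", "nice shot!", 55, 0.8),
          Utterance("p2", "that's a threat", 78, -0.9)]:
    if triage(u):
        reports.append(build_report(u, toxicity_score(u)))

print(reports)
```

The key design point matches what Activision describes: the pipeline's terminal action is a report for a human, never an automated ban.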

Notably, the tool (for now, at least) will not take action against players itself; it will only submit reports to Activision's moderators. Human involvement will likely remain an important safeguard, since research has shown that speech recognition systems can be biased in how they respond to users of different racial identities and accents.

The partnership between Activision and Modulate is a significant step in the fight against toxicity in gaming. It remains to be seen how effective the new moderation system will be, but it is a promising development that could help to make gaming a more inclusive and enjoyable experience for everyone.

Here are some additional thoughts on the matter:

  • The use of AI-powered voice chat moderation is a novel approach to the problem of toxicity in gaming. It has the potential to be more effective than traditional moderation methods, which can be labor-intensive and difficult to scale.
  • However, AI-powered systems are not perfect and can make mistakes. Human moderators should remain involved to ensure that no one is unfairly punished.
  • The partnership between Activision and Modulate is a positive step, but it is just one part of the solution. It is also important for gamers to be aware of the issue of toxicity and to take steps to report it when they see it.
