Activision, the publisher of the popular first-person shooter franchise Call of Duty, has partnered with a company called Modulate to bring AI-powered voice chat moderation to its titles. The new moderation system, built on Modulate's AI tool ToxMod, identifies behaviors like hate speech, discrimination, and harassment in real time.
The initial beta rollout of ToxMod begins today in North America, inside Call of Duty: Modern Warfare II and Call of Duty: Warzone. A full worldwide rollout will follow on November 10th, alongside the launch of Call of Duty: Modern Warfare III.
Modulate's website notes that the tool "triages voice chat to flag bad behavior, analyzes the nuances of each conversation to determine toxicity, and enables moderators to quickly respond to each incident by supplying relevant and accurate context." The company's CEO said in a recent interview that the tool aims to go beyond mere transcription: it also takes factors like a player's emotions and volume into account in order to differentiate harmful statements from playful ones.
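To make that triage-and-escalate flow concrete, here is a minimal Python sketch of how such a pipeline might be structured. Everything in it is a hypothetical stand-in: the Utterance fields, the keyword list, the scoring weights, and the 0.7 threshold are illustrative inventions, not Modulate's actual models or API. The point is simply how prosodic signals like anger and volume can push the same words above or below a reporting threshold.

```python
# Hypothetical sketch of a triage -> analyze -> report pipeline.
# None of these names come from Modulate; they illustrate how tone
# signals (emotion, volume) could be combined with a transcript to
# separate banter from harassment.
from __future__ import annotations

from dataclasses import dataclass, field


@dataclass
class Utterance:
    speaker_id: str
    transcript: str       # ASR output for the voice clip
    loudness_db: float    # relative volume of the clip
    anger_score: float    # 0..1 emotion estimate from a prosody model
    context: list[str] = field(default_factory=list)  # preceding lines


# Stage 1: cheap keyword triage flags clips worth a closer look.
FLAG_TERMS = {"trash", "idiot"}  # placeholder list, not a real lexicon


def triage(u: Utterance) -> bool:
    return any(term in u.transcript.lower() for term in FLAG_TERMS)


# Stage 2: a toy toxicity score that weighs tone as well as words, so
# a shouted insult scores higher than the same words said flatly.
def toxicity(u: Utterance) -> float:
    lexical = 0.5 if triage(u) else 0.0
    prosodic = 0.3 * u.anger_score + (0.2 if u.loudness_db > 6.0 else 0.0)
    return min(1.0, lexical + prosodic)


# Stage 3: nothing is enforced automatically; clips over the threshold
# become reports packaged with surrounding context for human moderators.
def report_if_toxic(u: Utterance, threshold: float = 0.7) -> dict | None:
    score = toxicity(u)
    if score < threshold:
        return None
    return {
        "speaker": u.speaker_id,
        "clip": u.transcript,
        "score": round(score, 2),
        "context": u.context,  # lets a moderator judge intent quickly
    }


if __name__ == "__main__":
    clip = Utterance(
        speaker_id="player42",
        transcript="you absolute idiot",
        loudness_db=9.0,
        anger_score=0.8,
        context=["nice shot", "you absolute idiot"],
    )
    print(report_if_toxic(clip))
```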
Notably, the tool will not (for now, at least) take action against players based on its data; it will only submit reports to Activision's moderators. Human involvement will likely remain an important safeguard, since research has shown that speech recognition systems can display bias in how they respond to users with different racial identities and accents.
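Since the sketch above only queues reports, the last step might look like the following, again with invented names: enforcement is represented as a moderator's decision attached to a report, never something the pipeline applies on its own.

```python
# Hypothetical continuation of the sketch above: the system records a
# human moderator's decision; it never bans or mutes anyone itself.
from enum import Enum


class Decision(Enum):
    DISMISS = "dismiss"
    WARN = "warn"
    MUTE = "mute"


def resolve(report: dict, moderator_decision: Decision) -> dict:
    # Enforcement happens only after a human reviews the flagged clip,
    # which guards against known biases in speech recognition systems.
    return {**report, "decision": moderator_decision.value}


# Example: a moderator dismisses a clip the model had flagged as toxic.
print(resolve({"speaker": "player42", "score": 0.85}, Decision.DISMISS))
```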
The partnership between Activision and Modulate is a significant step in the fight against toxicity in gaming. It remains to be seen how effective the new moderation system will be, but it is a promising development that could help to make gaming a more inclusive and enjoyable experience for everyone.