
Activision introduces AI-based voice chat moderation tool in Call of Duty games

Call of Duty, one of the most iconic shooter franchises, has taken a step to combat toxicity in its video games by introducing a new AI-based voice moderation tool to identify toxic players, starting with the new Modern Warfare series.

The AI tool is called ToxMod, and it is currently in a beta phase to test its effectiveness against players exhibiting toxic behavior, including hate speech, discriminatory language, harassment, and more.

ToxMod is developed by Modulate, which describes it as “the only proactive voice chat moderation solution purpose-built for games”. While the software is already used in smaller games, Call of Duty will be the first major title to adopt it.

Activision has also released a dedicated FAQ page to clarify how the voice chat moderation tool works and to address users’ privacy concerns. Notably, the tool is said only to observe player behavior and does not have the authority to ban anyone.

The main objective of ToxMod is to flag inappropriate speech and report it back to the game’s human moderators, who will have the final say on who needs to be banned, PC Gamer reported.

Currently, there is no way to opt out of voice chat moderation, but players can turn off voice chat in their game settings to avoid it. Doing so, however, means they will not be able to communicate with other players by voice.

Another important thing to note is that trash talk and friendly banter, which are a core part of games like Call of Duty, are still allowed. Players will only be punished if the chat deviates into hate speech, discrimination, sexism, or other types of intentionally harmful language.

Activision has been focusing more on curbing toxic behavior in its games since last year, and while the new AI tool will help detect it, the ultimate decision to punish a player will still be made by a human moderation team.
