
Activision teams up with Caltech to develop AI system to combat toxic behaviour in Call of Duty

Caltech and Activision have teamed up to develop an AI system for Call of Duty (COD) to combat increasing toxicity in the video game. According to Caltech's official release, Anima Anandkumar and Michael Alvarez will lead the development effort on the Caltech side.

COD is one of the most popular multiplayer games in the world. The franchise's popularity surged around the release of Modern Warfare 2 (2009), and it remains a fan favorite.

One of the game's most popular features is voice chat, including proximity chat, which lets players talk to others who are nearby in the game world.

Most players use proximity chat to have fun and socialize, but over the past year numerous incidents of toxic behavior have been reported, with certain players trolling others and ruining the experience for everyone else.

To combat this, the game's developers already let players report offenders, but reviewing manual reports takes a long time. To provide a faster solution, Anandkumar and Alvarez have started working on an AI system.

The Caltech researchers have already begun developing and training the AI to detect trolling in the game. The system will reportedly detect toxic behavior in both voice and text chats.
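Neither Caltech nor Activision has described the model they are building. Purely as an illustration of how text-chat toxicity detection can work in principle, here is a minimal sketch: a Naive Bayes classifier trained on a handful of hypothetical chat lines (the toy data, tokenizer, and labels below are all invented for this example, not drawn from Activision's system).

```python
import math
from collections import Counter

# Hypothetical toy training data: (chat line, label), 1 = toxic, 0 = benign.
TRAIN = [
    ("nice shot, well played", 0),
    ("good game everyone", 0),
    ("you are trash uninstall now", 1),
    ("shut up you idiot", 1),
]

def tokenize(text):
    """Lowercase whitespace tokenization; real systems use far more."""
    return text.lower().split()

class NaiveBayes:
    def __init__(self, data):
        # Count how many examples fall in each class (for the prior).
        self.class_counts = Counter(label for _, label in data)
        # Count word occurrences per class (for the likelihood).
        self.word_counts = {0: Counter(), 1: Counter()}
        for text, label in data:
            self.word_counts[label].update(tokenize(text))
        self.vocab = {w for c in self.word_counts.values() for w in c}

    def predict(self, text):
        total = sum(self.class_counts.values())
        scores = {}
        for label in (0, 1):
            # Log prior: how common each class is overall.
            score = math.log(self.class_counts[label] / total)
            denom = sum(self.word_counts[label].values()) + len(self.vocab)
            for w in tokenize(text):
                # Laplace (+1) smoothing so unseen words don't zero out.
                score += math.log((self.word_counts[label][w] + 1) / denom)
            scores[label] = score
        return max(scores, key=scores.get)

clf = NaiveBayes(TRAIN)
print(clf.predict("you are an idiot"))  # 1 (flagged toxic)
print(clf.predict("nice shot"))         # 0 (benign)
```

A production system would instead use large neural models trained on real moderation data, handle speech-to-text for voice chat, and account for context and evasion; this sketch only shows the basic classify-each-message idea.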

“Over the past few years, our collaboration with Anima Anandkumar’s group has been very productive. We have learned a great deal about how to use large data and deep learning to identify toxic conversation and behavior. This new direction, with our colleagues at Activision, gives us an opportunity to apply what we have learned to study toxic behavior in a new and important area — gaming,” said Alvarez.

Activision’s chief technology officer, Michael Vance said, “Our teams continue to make great progress in combating disruptive behavior, and we also want to look much further down the road. This collaboration will allow us to build upon our existing work and explore the frontier of research in this area.”

Proximity chat is on by default and can be turned off, but most players leave it on because they consider it one of the game's social draws. Toxic behavior in online gaming has long been a serious problem, and it is encouraging to see developers start cracking down on it.
