Toxic gamers who enjoy harassing others via in-game comms are in for a rude awakening.
As creators and publishers of some of the best co-op games out there, Ubisoft and Riot Games are no strangers to dealing with toxic players. Fortunately, both companies have joined forces for Zero Harm In Comms, a research partnership intended to combat bad behavior online. The project will use AI to train moderation tools to identify cases of online harassment, with the aim to “foster more rewarding social experiences and avoid harmful interactions”.
Badly behaved players won’t be able to hide behind the anonymity of their keyboards anymore. The research partnership intends to create a system where anonymized data about disruptive behavior is stored in a database and shared across the entire industry.
gg ez
You’ve probably heard the term “toxic” used to describe certain kinds of online gamers. Whether it refers to gameplay techniques frowned upon by the wider player base or a case of genuine abuse or bullying, it’s hard to put a face to a screen name in the age of internet anonymity.
Although anti-cheat software has been introduced across many big online PvP games, such tools don’t tend to take note of players who are actively harming others with their words. The narrow space between smack-talk and bullying is frequently occupied by the kind of trolls you’d hope to find only in fairy tales.
I, Chatbot
In the announcement from Ubisoft La Forge, executive director Yves Jacquier expresses his empathy for players who find themselves in these horrible positions. “Disruptive player behaviors is an issue that we take very seriously but also one that is very difficult to solve,” he says, referencing how the systems in place continually fail to single out and punish users for their bad manners. “We believe that, by coming together as an industry, we will be able to tackle this issue more effectively.”
And come together they shall, as Riot Games’ head of technology research Wesley Kerr effusively agrees. “We’re committed to working with industry partners like Ubisoft who believe in creating safe communities,” he says. This partnership with Ubisoft is just one example of the “wider commitment and work that [Riot Games are] doing…to develop systems that create healthy, safe and inclusive interactions.”
The AI software will work by collecting chat logs from across Ubisoft and Riot’s array of games, stripping out any sensitive information, and labeling each log according to the behavior displayed. All of this data will then be used to train AI models to spot players who flout community guidelines.
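Neither company has published technical details, but a rough sketch of that kind of pipeline might look like the following: anonymize each chat message, pair it with a behavior label, and train a simple text classifier on the result. Everything here, from the function names to the labels and regex rules, is an illustrative assumption rather than Ubisoft's or Riot's actual implementation.

```python
# Hypothetical sketch of an anonymize -> label -> train pipeline for chat logs.
# Not the Zero Harm In Comms system; names, labels, and rules are assumptions.
import re
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def anonymize(message: str) -> str:
    """Strip obvious personal identifiers before a log is labeled or shared."""
    message = re.sub(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b", "<EMAIL>", message)  # email addresses
    message = re.sub(r"@\w+", "<PLAYER>", message)                        # @-mentions / gamertags
    return message

# Tiny labeled dataset standing in for a shared, anonymized chat-log database.
chat_logs = [
    ("gg everyone, well played", "neutral"),
    ("nice shot, that was clean", "neutral"),
    ("uninstall the game you are worthless", "harassment"),
    ("go back to the tutorial, trash player", "harassment"),
]
texts = [anonymize(msg) for msg, _ in chat_logs]
labels = [label for _, label in chat_logs]

# Train a basic text classifier on the labeled, anonymized logs.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

# Flag a new message that appears to flout community guidelines.
print(model.predict([anonymize("you are worthless, uninstall")]))  # e.g. ['harassment']
```

In practice the real system would presumably rely on far larger datasets and more sophisticated models than this toy classifier, but the shape of the task, anonymized text in, behavior label out, is the same.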
Sure, some games have taken strides to tackle bad manners on their own, but if having your equipment taken off you in the middle of a Call of Duty: Modern Warfare 2 run doesn’t sound threatening enough, the knowledge that your mean comments will be logged and shared across the industry will hopefully make unsavory players think twice before mouthing off in post-game chat.