Call Of Duty Will Use AI To Moderate Toxic Voice Chat In Real Time
Call of Duty voice chat might become a little less toxic soon, with Activision announcing an in-game AI voice chat moderation tool launching with Call of Duty: Modern Warfare III in November. The new tool comes from a team-up with Modulate, using its ToxMod AI-powered voice chat moderation tech to identify toxic speech in real time – with a US-only beta set to begin today in Modern Warfare II and Warzone ahead of the full launch alongside Modern Warfare III on 10 November.

The new real-time system will both identify and enforce against toxic speech, including hate speech, discriminatory language, and harassment. Currently, Call of Duty moderates player behaviour through features like text-based filtering for in-game text across 14 languages and player-led reporting, overseen by its anti-toxicity team – but voice chat has remained relatively lawless, given the impracticality of listening in on every CoD voice chat for bad-faith behaviour.

When fully launched globally (excluding Asia) on 10 November, the AI-powered moderation tech will only be able to moderate English-speaking users, although plans for further language expansions have been promised “at a later date.”

Depressingly enough, over 1 million Call of Duty player accounts have violated the game’s Code of Conduct and had voice or text chat restricted (using existing anti-toxicity moderation) since the launch of Modern Warfare II. That’s a whole lot of sweaty gamers locked out of harassing fellow players – and with AI in the mix to pick up on voice chat toxicity, that number will likely skyrocket, although there are certainly questions to be asked about exactly how accurate the AI system is, given AI’s poor track record of inbuilt bias.

According to Activision, 20% of players who received a first warning for toxic or antisocial behaviour in Call of Duty did not reoffend. In the blog post announcing the introduction of the AI system, the developer reiterated that it is “dedicated to combatting toxicity” and that “we look forward to working with our community to continue to make Call of Duty fair and fun for all.”

The extensive accompanying FAQ is available via Activision’s website now. In a potentially unintentional tongue-in-cheek response to a question about whether players can opt out of AI voice moderation, the reply notes that players certainly can opt out – by disabling voice chat altogether. In other words, if you’re not keen on an AI tool listening in to you verbally abusing other players, simply don’t speak in-game.

Whether the roll-out of Call of Duty’s newest AI moderation tool will actually have a major positive impact on player behaviour in the community remains to be seen – but at the very least, it might make some players think twice before going ham in comms, which is never a bad thing.
