For years now, gaming companies such as Riot, Blizzard and Twitch have been fighting online abuse and harassment in their own ways, to mixed results. Now they’re trying something different: Working together.
Image: League of Legends
The Fair Play Alliance is a coalition made up of over 30 different companies, including Riot, Blizzard, Twitch, CCP and Epic, that aims to share research and lessons learned around gaming communities in the hopes of cutting down on “disruptive behaviour”. The goal, Riot senior technical designer Kimberly Voll told Kotaku, is to create a consistent set of behaviour standards between companies and a common understanding of the underlying issues that cause poor behaviour in online communities. The hope is that developers won’t have to start from square one any more when creating online games.
“A lot of these challenges today are super intimidating,” Voll told Kotaku at the Game Developers Conference in San Francisco this week. “These are big cultural shifts. As an industry and as a society online, we’re trying to find our way. Having to be a company that steps out and says ‘We’re gonna be the ones to do this’ is kinda scary. This is an opportunity for all of us to say ‘What if we walked together as an industry?’”
Voll acknowledged that Riot has a long and storied history of mistakes in this arena, but that’s kind of the point: Games such as League of Legends exemplify how difficult it is to bring communities back from the brink. Ideally, the Fair Play Alliance will allow other companies to directly learn from each other’s mistakes without stumbling into the same pitfalls.
“There’s a lot to share,” said Voll. “And we’ve done a lot of dumb things. We’ve learned a lot.” She added that Riot’s player behaviour team, once among the more prominent voices in this discussion, stepped back from the public eye a couple of years ago because they realised that their own efforts weren’t quite hitting the mark.
“Players were increasingly telling us they were unhappy,” she said. “That was us going sleeves-up and trying to figure out what was happening.” The team emerged with a new focus not just on disruptive players, but on the ways competitive games can make arseholes of us all. Now they want to share what they have learned with the rest of the class.
Image: Overwatch (by developer Blizzard, another member of the Fair Play Alliance).
The first step in that process is a day-long summit at GDC hosted by the Fair Play Alliance, which begins on Wednesday morning local time with a keynote by Voll. Developers and creators from companies including Activision, Epic and Supercell will openly discuss research, issues they have faced, mistakes they have made, and what they have learned in the process. (One of the many speakers, full disclosure, is my good friend Katherine Lo.)
It’s a solid first step, but it seems like the Fair Play Alliance is still working to secure a foothold beyond that. I asked Voll what lies ahead, and she joked that while the Fair Play Alliance has a website, there are not many other tangible elements to discuss just yet. It’s still working to solidify things such as shared resources and a system that will allow developers to reach out to knowledgeable individuals when they’re struggling to solve abuse- or harassment-related issues.
In other words, the Fair Play Alliance is only just getting off the ground. The organisation’s loftiest goals, such as creating a consistent set of standards and rules across multiple multinational companies, will also require a lot of differently-shaped puzzle pieces to click together.
“There are a lot of challenges when you’re trying to determine what good behaviour looks like – or at least what bad looks like – on a global scale,” Voll said. Basic respect between individuals, she added, seems like an agreeable goal to aim for, but then you’ve got to consider what mutual respect even looks like in different cultures and groups. Close friends, for example, might hurl gobs of vile trash talk at each other, but they don’t mean it. The same verbal interaction between complete strangers would probably constitute a worst-case scenario. Or maybe not. Context is hard.

But there are also benefits to forming a collective that includes companies with so many different origins and priorities. They can cover each other’s blind spots and, hopefully, hold each other accountable. I mentioned notorious League of Legends streamer Tyler1 to Voll as an example of Riot and Twitch’s punitive systems failing in ways neither company saw coming – ultimately turning a problem child into a star instead of teaching him a lasting lesson – and Voll said she thinks communication via the Fair Play Alliance will stop something like that from happening again.
“I think this is a really good example of a spot where we can learn from one another and understand the repercussions,” she said. “On one hand, we’re very conscious of the slippery slope that is to step outside the game and try to impose values more broadly. On the other hand, every frickin’ thing’s online. There’s no IRL and online any more. This is just all reality now. So streamers are ambassadors of culture bigger than just League of Legends or whichever game. They’re ambassadors of online life.”
And if companies such as Riot and Twitch don’t see eye-to-eye on players like Tyler1? Voll thinks that might end up being useful, as well.
“I think it’s also good checks and balances,” she said. “I think it’s great if Twitch is like ‘Hey, that seemed like a bit too much. Let’s talk about that.’ None of us are gonna get it perfect.”
Comments
7 responses to “Riot, Blizzard And Twitch Are Teaming Up To Fight Toxic Gaming Behaviour”
Good, there are games that I specifically avoid due to their community alone. A game looks great and I’d like to play it, but having to deal with a toxic community at the same time is a deal breaker.
League of Legends is one of those but I’ve pretty much thrown all competitive MOBAs into the same basket because it’s essentially the same community across all of the games.
Oh yay. Can’t wait for arbitrary and personal decisions on what is toxic. And then the inevitable rise in people trying to police this and using it as a threat after taunting someone for hours.
What would you suggest they do instead?
This is a concerted effort by a whole bunch of people who work in community control to learn how to better handle rogue elements that hurt the games financially and socially. How is this any worse than what they’ve tried before?
Hold up, lemme put my tin foil hat on.
It’s certainly a great idea on paper, and the concept of teaming up to combat toxicity makes for a good headline, but the financial aspect appears to be the prime motivation based on their own mission statement and strategies, which are focused more on accessibility and inclusivity than on the removal of rogue elements.
They’re kinda vague here about actually tackling toxicity – lots of maybes and shoulder-shrugging on the subject, but some very corporate-heavy material going on at the business end.
The red flag for my inner nutter is that it comes off as very “have your cake and eat it too”, in that a swing either way in your community means you’re cutting somebody out regardless.
I’m far more interested in the kinds of information they will ultimately exchange and gather, and what is actually done with it.
Nobody exchanges or gathers information these days without the intention of making money from it.
I mean, best of luck to them in this endeavour, time will tell I’m sure.
Of course they’re intending on making money out of it. They want more people to have a good time so they keep playing and spend more money. That’s completely transparent. Games are a service now. Services are profitable through continued use. As long as their methods for doing it are ethical, then why the hell not?
Hopefully their methods are going to be a lot less indistinct once this stuff gets off the ground. Then we can start to see how they think they can foster community cohesion in a way that isn’t just “get whale to spend more by making them feel emotionally indebted”.
As far as the swing in the community cutting someone out goes, if they’re cutting out people who are being abusive, then fine. You don’t reward dickheads for being dickheads. A playerbase that feels like they are under siege from people being abusive is not going to build a community, last the test of time, or (ultimately) remain profitable.
I don’t think it’s entirely a cynical move. It’s definitely at least a bit cynical, but that doesn’t bother me personally. Accidentally making the world less shitty is still a net win. If Coca-Cola starts a marketing campaign that ends up reducing child poverty, I’m all for it. It’s obviously going to be a cynical money-making endeavour, but Coke can make their money off it and I’ll be fine with it. Because I don’t care how the world gets better as long as it does.
What I mean is they don’t want to cut anyone out, they seem to have noticed that both available options result in a loss of players.
Allow the toxic element to fester and it keeps other players away; attempt to cut out the toxic element and you chase players away.
Turns out people don’t like dickheads and also don’t like to be censored.
Yeah but should we be caring how the dickheads feel? It’s not censorship to say “don’t be an indefensible fuckwit”