You're playing League of Legends. You make a slight error. You are now dead and, worse, your team — made up of anonymous players online — is dead. Now they're angry. A torrent of abuse streams through voice chat. You are belittled, bullied, all for a simple mistake. In a video game. This is a common scenario, and one that is holding back online communities across borders and across formats.
But what can be done? Are we doomed to suffer these trolls, or can developers and communities work together to find a solution?
People are assholes online. Whether you’re sitting on Reddit, browsing N4G, or listening to torrents of abuse on Xbox Live, it’s been well and truly proven that the internet is full of socially inept trolls. Generally, it’s pretty easy to ignore them, but when you’re stuck dealing with a vocal minority of awful people during a gaming session, it shouldn’t be up to you to quit the game to avoid the abuse.
Traditionally, most controlled services (such as Battle.net or Xbox Live) have offered abuse reporting. Dedicated abuse teams monitor messaging services and public games, flagging members or content. Producing evidence of threats or inappropriate imagery is usually a shortcut to a permanent ban for the perpetrator.
In most cases, these old-school methods of punishment are reasonably successful at culling the worst 5% of morons, but what about those who don't leave evidence of their crime? What if the harassment isn't outright offensive but simply, needlessly negative, impacting your ability to learn the game or play at your own pace?
Many gamers have encountered this first-hand, whether as a bystander witnessing the harassment or as the victim on the receiving end of it. The rapid mainstream advance of all genres of games, even those generally suited to a specific niche, has invited a flood of people with poor social skills and little tolerance for new players.
Dealing with a large player base, one usually unwilling or unable to step in to stop abuse, can quickly become a problem. New players, particularly in games with steep learning curves, are already hesitant about dedicating time to a venture that may not suit them. Publishers and developers have been notorious for ignoring this problem, at their own peril — games lose thousands of potential new sources of revenue every day thanks to a lack of ambassadors willing to guide newbies.
Many MMOs, particularly vulnerable thanks to a tight market, have developed "buddy" and "mentor" systems designed to encourage the community to accept, rather than reject, new players. Older players are rewarded with in-game bonuses for participating, while new players are given help to learn the basics. As a result, there's a new understanding: the newbie learns to play, and the veteran is rewarded with experience and/or gold.
Studies on the viability or success of these schemes are pretty thin on the ground, but the simple practice of integrating, rather than ostracising, players is good for everyone. After all, who will there be to group with you if new players don't feel welcome enough to stick around? The rise of players choosing to play within tight, exclusive guilds has become both a solution to and a catalyst for this problem.
Other games have taken a slightly different approach to dealing with a hostile player base. The notoriously hardcore players of League of Legends are known to take no prisoners. Players are hounded from game one for their mistakes, with countless stories of intense abuse and negative reinforcement. The situation became so toxic that the developers took notice of the growing problem and decided to do something about it.
The Tribunal is a judge-and-jury style system of punishment that takes the onus off the developer and plants it firmly on the back of the community itself. Players of sufficient standing and status are invited to review reported cases of anti-social behaviour, examine the evidence and vote on a ruling. The punishment, whether a suspension or a permanent ban, is final and completely independent of whether money has changed hands.
This method isn’t entirely new, but it’s the first I’ve seen implemented directly as part of a game’s internal systems. Players who participate are rewarded with in-game bonuses and are encouraged to stop situations escalating before a tribunal is necessary. Not all issues are taken to the Tribunal, and a penalty is only doled out after a significant number of votes have been cast.
But for such a significant investment and injection of trust into the community, the system does have its flaws and critics. Players are not vetted for their age, nor is their voting record cross-checked to make sure they aren’t needlessly punishing random players for fun, or singling out a particular player. Many players say the system simply doesn’t work and that plenty will push their luck and continue to abuse others anyway.
At the same time, however, in the seven months since the Tribunal was introduced, perceptions have changed. Forum chatter and my own personal experience point to a much calmer player base, as reports no longer disappear into a mysterious cloud of customer service reps; they go straight to the peers of the players being reported. The system is audited by the developer, Riot Games, to ensure accuracy and fairness.
Bullying and abuse will probably always be a problem, particularly amongst vulnerable targets such as female gamers, gay gamers or those with a mental illness. In the end, it’s down to the community to shape up or ship out if online play is to have a future. The problem will only escalate over the next decade as more community-focused games are released and the ways people can communicate within them multiply.
Developers and publishers have dragged their feet for years on building systems that keep up with the changing face of gaming. Riot Games’ bold approach needs to be replicated and enhanced across all platforms and genres, to give players more than a mute button and the expectation that a tiny team of “abuse monitors” will solve the problem for them. That hasn’t worked to this point, so what expectation is there that it will in the future?
Abuse between gamers is completely unacceptable, in any form, in any game and towards any player. Being a newbie or a woman is not a valid excuse for harassment or threats, regardless of what may or may not have “caused” the altercation. Tragically, there is no silver-bullet solution to what is a very complex issue. Any answer will likely require input from all parties to end a situation that would be abhorrent in the real world. Here's hoping we can find one.
James Pinnell writes for Games.on.net, OzGamers and Pixel Hunt. You can follow him on Twitter at @JamesPinnell.