The video game giant Electronic Arts says it is improving its systems for escalating safety and harassment issues that emerge in its player communities, to ensure the company actually addresses them.
The changes have been a long time coming and are desperately needed, sources tell Kotaku.
“Recently, we’ve updated our escalation procedure for all employees, and we’re in the midst of re-training to ensure everyone is well-versed on how to manage threats, suicide, harassment, bullying, inappropriate behaviour, etc.,” Adam Tanielian, head of global community engagement at EA, told Kotaku.
“Our systems and process aren’t perfect, and people make mistakes. But we work hard every day to get better, and make sure, at the end of the day, that people in our community are having fun. That’s what it should be all about.”
Two sources familiar with the community management structure at EA — one current and one former employee, both of whom requested anonymity, one for fear of reprisal from EA, the other because they still work in the industry — have told Kotaku that the policies are new to them.
To their knowledge, this is the first time the company has had a formal method for its community managers to escalate threats.
The new policy was emailed by the global director of core community and the Game Changers program to the company’s community managers on March 12. “When in doubt—Report,” it reads. What follows is a chart outlining the appropriate steps for escalating a threat.
It is specific about the scenarios that should prompt community managers to escalate concerns. Those situations include issues affecting their own safety or the safety of their players, and situations that might harm the company, involve threats, or involve claims of abuse.
Both sources that spoke to Kotaku said that they were unaware of a formal policy or procedure prior to the email.
Recent incidents in gaming communities, including EA’s, highlight how important it is for a game company’s community managers to be able to identify troubling behaviour in player groups and to pass relevant concerns up the chain.
Last month, for example, a safety issue arose in the player community for the EA game The Sims 4 after it simmered largely unattended for months. Multiple players reported that a member of the EA-backed player group called the Game Changers had been sending inappropriate sexual messages to underage Sims players.
Players who said they had received these messages told Kotaku that they had brought this up to The Sims’ community manager in December of last year, but felt like they had been brushed off.
In March, those players went public and the EA Game Changer in question stepped down. In a March 7 statement, Sims studio head Lyndsay Pearson apologised “for the delay” in investigating or acting on the allegations, saying that “our process to properly investigate and escalate anything of this nature was not followed.”
She also noted: “We are making immediate improvements to our internal investigation and escalation processes, and ensuring our entire team learns from this.”
The two sources that spoke to Kotaku said that this situation with the EA Game Changer shouldn’t have happened and would have been easier to resolve if community managers had more resources and better training.
The source that currently works at EA said that concerns have also been raised about the aggressive tenor of EA’s sports events.
The source who worked in EA community management until recently said that when they worked at EA, they had to build safety protocols for their game’s playerbase from the ground up, including protocols stipulating whether or not the EA Game Changer program should include minors.
However, after this source left EA, they say, their safety protocols were rolled back. The current EA employee said that safety protocols like these have been nonexistent during their time at the company.
Kotaku reached out to EA for comment about the company’s community management training and safety protocols, and received replies from Tanielian.
He said that while the company’s policies on escalating concerns about threats are still in progress, the system fundamentally works.
“A few years ago, we started to see more threats of violence and suicide in our channels, so we put policies and procedures in place for managers and teams to escalate these situations,” he said.
“It operates on a very simple principle: if you see something, say something.... We’ve built policies around EA-related external communications and have internal digital ethics and conduct training. We try to fill a lot of the gaps we see pop up, but every year we see more risks online that require adjustments, which we try to recognise and adapt as fast as we can.
Despite our best efforts, even when we try to cover all the bases, our community managers will find themselves in unexpected situations and we will learn from them, as we did with this Sims situation.”
We’ve been unable to reconcile our sources’ claims to have been unaware of escalation protocols with EA’s claim that such protocols are simply being updated, implying they’ve been around for a while.
The current EA employee we spoke to remains sceptical of the newly articulated ways to report things up the chain, having watched a couple of recent reports of issues go nowhere.
“From what I hear from people who have used those channels, the net result is basically the same… There’s a ton of ‘we couldn’t have known’ or ‘we couldn’t have seen this coming,’ and it’s nonsense.”
Companies like EA are run and staffed by people, and people make mistakes. But with such a large company working with so many young people in its games’ communities, mistakes in community management can lead to young people being hurt or exploited.
Working with numerous large communities does not mean that those kinds of scenarios are inevitable, either. It just means that it’s essential to give community managers the training and resources they need, and to take their concerns seriously.
“Enough is enough with this stuff,” one of the sources said. “None of this is surprising, but our concerns have never really been heard.”