Twitch’s First Transparency Report Is A Start, But Streamers Want More

Photo: Kiyoshi Ota, Getty Images

Today, Twitch released its first-ever transparency report, a lengthy, stat-based look at the platform’s safety initiatives over the past year. It contains some interesting, albeit granular, information about Twitch’s efforts to cut down on hateful conduct, sexual harassment, and even terrorist propaganda. But it also fails to clear the haze from the question that has surrounded many of Twitch’s most perplexing decisions: Why?

Certainly, the report contains many interesting numbers. Encouragingly, the company says it has made “a 4X increase in the number of content moderation professionals” over the past year, meaning that if users file a report, it’s more likely that someone will get around to responding to it in a timely manner. Twitch did not, however, say how many content moderation professionals it currently employs, nor did it say whether they’re in-house or contractors (aka the Facebook method, which has led to all sorts of issues over the years).

Twitch also pats itself on the back for greater moderation coverage, noting that between its AutoMod software and human moderators, 95% of live content on the platform was viewed by a moderator of some sort by the end of 2020. Most sections of the report focused on similar increases: chat messages removed by AutoMod and the blocked term functionality, both of which allow streamers to automatically pre-screen messages for specific words and phrases, rose 61% between the first half of the year and the second. Manual message deletion on the part of creators and moderators was up a whopping 98% relative to the first half of the year, which Twitch attributed to the elephant in the room: a 40% increase in the overall number of channels on Twitch between the two halves of 2020.

Twitch also pointed to increases in the number of rule enforcements against reported users and channels. Total enforcements rose 41% over the course of the year, and the numbers reflect that in categories like hateful conduct and sexual harassment, violence and gore, nudity, and terrorist propaganda (Twitch claims the latter is extremely rare on its platform, though that also depends on what you classify as terrorism). The company also pointed to progress on the part of its Law Enforcement Response team, which made over 2,000 reports to the National Centre for Missing & Exploited Children in 2020. Twitch, however, continues to grapple with underage users creating channels, leaving themselves open to potential predation.

The report contains a handful of other, similar data sets, most of which paint Twitch in a favourable light. Certainly, they’re a useful measure of Twitch’s growth in these areas, and broadly, the report mirrors similar documentation provided by platforms like Discord, Facebook, and Twitter. The problem with these kinds of reports, however, is that they have a way of appearing to say a lot while revealing very little. Twitch has offered numbers and a small amount of context, but streamers and viewers remain in the dark on major issues that came to light last year.

Replies and quote tweets on Twitch’s Twitter post about the transparency report, for example, are filled with questions about the status of Twitch’s investigations into reported sexual harassment (the ongoing nature of which has benefitted accused harassers, some of whom can still stream on the platform), specific high-profile bans like that of Dr Disrespect, the lack of a trans tag and other discoverability tools for underserved communities, lengthy turnaround times on ban appeals, the Twitch employee who Kotaku reported last year was no longer with the company after accusations of sexual assault, data about DMCA takedowns, and the process by which Twitch applies its rules, which frequently leads to inconsistent outcomes.

Twitch concluded its post about the report by saying it will “look closely at the feedback we receive to inform how we can refine these reports moving forward.” If nothing else, it now has plenty of feedback to work with.
