It’s not every day you hear about an enormous platform like Twitch swatting the face of one of its most popular emotes clean off and rendering the president’s account indefinitely inert, as it did in the aftermath of last week’s Capitol Building insurrection. But Twitch does do similar things from time to time, on a smaller scale. You just don’t hear about it. That’s a problem.
We will continue to unpack the ramifications of last week’s insurrection for years to come, but one thing it revealed with immediate crystalline clarity is that, despite years of foot-dragging and excuse-making so elaborate it verged on an art form, it was not actually difficult for major tech companies to ban Donald Trump and prominent figures who tirelessly spread conspiracy theories that paved the way for violence. Twitch, owned by tech giant among giants Amazon, moved in lockstep with other colossi like Facebook, Twitter, YouTube and Reddit, first removing the Pogchamp emote due to its connection to Ryan “Gootecks” Gutierrez — who called for further “civil unrest” after the insurrection — and then indefinitely suspending Trump to “prevent Twitch from being used to incite further violence.”
Both of these actions were long overdue; Gutierrez spent most of 2020 demonstrably going further and further down the far-right conspiracy rabbit hole and publicly endorsing dangerous covid denialist stances, while Trump’s channel aired numerous rallies that violated Twitch’s hateful conduct policies (the company briefly suspended him for this once before, in June 2020, after other platforms like Reddit had already taken similar actions). Twitch did not do either of these things unprompted; a multitude of personalities and pundits had called for Pogchamp’s removal, and Twitch followed other platforms’ lead in suspending Trump’s relatively small channel.
But, as journalist Jules Suzdaltsev put it on Twitter, it’s “probably not a great sign for our democracy that the biggest blow a sitting president can suffer is being banned from a website.” He was referring to Twitter specifically, but his broader point extends to the several other websites that, alongside Twitter, now more or less comprise the entire internet. It is by no measure a good, just, or fair thing that Facebook, Google, Twitter, and Amazon command the power to shape the online and offline worlds in their image. In a better world, they’d have been broken up years ago, or they would at least not be in charge of both making and enforcing the rules of online discourse. While many far-right pundits have spent the past few days clouding the airwaves with disingenuous seal borks about First Amendment rights, it is now clearer than ever that companies will only use their powers for the public good when it directly benefits them, or the ground beneath their feet is mere moments from sloughing into the sea. Tech giants played an enormous role in getting us into this fascistic mess, and they didn’t even glance at a mop until it was far too big to be cleaned up.
Twitch’s footprint is not as big as those of Facebook and YouTube, but it is nonetheless an increasingly vital part of the political ecosystem thanks to personalities like Hasan Piker and the recent trend of politicians like Democratic representative Alexandria Ocasio-Cortez and newly elected senator Raphael Warnock using the platform to connect with elusive young voters. Twitch’s decisions around political matters carry weight that grows by the day, much like the platform itself — it had its biggest day ever yesterday, breaking records that were repeatedly broken last year. Despite this, when it comes to conspiracies and the kinds of communities that enable them, Twitch continues to reactively take aim at symptoms, rather than causes, and does so in a way that’s less transparent even than the likes of Facebook and YouTube. All the while, it picks its battles such that even when it enforces its rules, harmful cultures have time to perpetuate, and small, frequently marginalised streamers get caught in the crossfire.
Not Very Poggers
The Pogchamp emote’s removal is a case study in this. Publicly, Twitch merely announced that it had removed Gutierrez’s face, citing “statements from the face of the emote encouraging further violence after what took place in the Capitol today.” It did not dig any further into the ramifications of this decision. Speaking to Kotaku under the condition of anonymity, two sources with knowledge of Twitch’s business explained that some “global” emotes — that is, emotes that any viewer across Twitch can use in any chat — have been monetized by the company, generally for merch and “cheermotes.” Cheermotes are animated versions of emotes that viewers can activate using bits, a proprietary Twitch currency that is purchased with real money. In some instances, sources said, the personalities whose likenesses are used for global cheermotes — of which Pogchamp was one — are paid on a per usage basis. This adds up, with one source estimating that Gutierrez was making around $US50,000 ($64,720) per year off his global emote alone.
In addition, said both sources, Gutierrez negotiated an upfront payment of somewhere between $US50,000 ($64,720) and $US100,000 ($129,440) in exchange for Twitch’s use of his likeness. Back in 2018, when the Pogchamp cheermote was first announced, Gutierrez — who’d previously voiced reluctance about his status as a Twitch meme — jokingly said in a video that he’d partnered with Twitch to make the cheermote “through the miracle of the internet and a strong legal team, rather than suing each and every one of you individually for copyright infringement.” Around the same time, other personalities like esports veteran Scott “SirScoots” Smith turned down similar deals due to concerns around likeness rights.
“It would have given Twitch, and more importantly Amazon, exclusive, non-revocable, and in perpetuity rights over that picture and the emote name and the right to use my full name and details in relation to exercising those rights on any platform, existing now or in the future,” Smith told Kotaku in a DM, noting that his emote would not have even been a monetized cheermote, meaning he would not have made money off it.
Kotaku reached out to Gutierrez for more details, but he did not reply. Twitch does not disclose the terms of such deals. That said, the new daily faces of the Pogchamp emote are almost certainly not getting paid, because they’re not getting turned into cheermotes. A third source was adamant that the faces of non-monetized global emotes do not get paid despite their prominence on Twitch, while one of the other two, when asked if the new Pogchamps are getting paid, replied, “Oh definitely not.”
All of this demonstrates that when Twitch decides to ban a streamer or even just remove an emote, there are significant financial consequences. Where there is money to be lost or made, companies will always act first and foremost in their own self-interest. From that perspective, removing the Pogchamp emote made perfect sense. It was a highly visible symbol, but it was not attached to an especially powerful or popular figure (despite being a known name in the fighting game community, Gutierrez is not well known outside of it). Removing the emote meant both heading off a PR disaster at the pass and scoring some positive public perception points in the process. In losing a significant income stream, Gutierrez shouldered most of the burden, while Twitch freed up some cash. But Twitch’s action did nothing to address the underlying structures and cultural norms that allowed a personality like Gutierrez to become radicalized in the first place.
This is exemplified by the blowback to Twitch’s replacement for Gutierrez’s image — a new Pogchamp emote every 24 hours — which has predictably fallen on the backs of other streamers who are not particularly powerful. Yesterday’s Pogchamp, queer Black actor and vocalist Critical Bard, faced a deluge of harassment across Twitch, Twitter, and Facebook, almost immediately after Twitch announced that he’d been selected. Many harassers justified their actions by baselessly accusing Critical Bard of being a “racial supremacist,” seemingly because of a recent clip in which he said that white lives “don’t matter” because, “White lives are not a thing; you can be proud of being Italian, you can be proud of being Scottish — you cannot be proud of being white.” In the clip, he added, “Black people have to say ‘Black lives matter’ because we were stolen from a country that we loved, and we’re forced to be here, stripped of our heritage and our identities. All we know is our Blackness.” It is now the most viewed clip of the week across all of Twitch, with over 400,000 views.
The broader context of what Critical Bard said has not stopped harassers from perpetuating a disingenuous, conspiratorial idea that far-right adherents frequently apply to anybody who supports Black Lives Matter: that Black people believe they are superior and seek to replace white people. This is not a random occurrence. Communities on and especially around Twitch regularly perpetuate these kinds of ideas, meaning that even though some of Critical Bard’s harassment is happening on other platforms, it’s Twitch’s problem. It’s not just a “right now” problem, either. It’s as old as the ugly custom of viewers spamming the Trihard emote — based on the likeness of Black speedrunner Trihex — any time there’s a Black person on screen.
Pretty much anybody who follows Twitch could have told you that what’s happening to Critical Bard was inevitable. After all, it has happened numerous times before, in numerous forms, to other Black streamers. Chat spam, DMs, and dummy accounts with racist usernames are some of the oldest tricks in the book, but they keep working.
“I mostly wish there were things in place so racists and bigots can’t have such easy access to a streamer’s platform,” Critical Bard told Kotaku in a DM. “The fact that they could spam racist vitriol at me and then, when I comment about white lives and Black lives as a whole, they can clip a section to further their racist antics to the point of hacking attempts & doxxing — it’s hard to feel safe on Twitch.”
Twitch’s efforts to curb this kind of harassment have thus far proven unfocused and insufficient. Despite that, it continues to put Black streamers on display in marketing stunts without overhauling its process — for example, by designing rules that target this problem and building teams specifically around eradicating it — or taking adequate precautions, like providing Black streamers in the spotlight with support staff dedicated to handling harassment immediately. Like distant, dispassionate clockwork, this is the result, just as it’s always been. Twitch is set to enact new, more specific rules around hate and harassment later this month, and while they appear to be a step in the right direction, they’re an incremental improvement to Twitch’s current approach — not an overhaul. Perhaps they’ll root out some bad actors over time, but even under the new rules, it’s hard to imagine Critical Bard’s situation playing out very differently.
If platforms like Twitch are going to act as globally powerful stewards of discourse, the least users can ask for is rules that actually protect them, and true, comprehensive transparency around those rules. Otherwise, companies will always be able to walk a path of least resistance that allows them to trample smaller, more marginalised creators, while failing to take action when it really counts. For example, in May of last year, Twitch chose not to meaningfully address a far bigger problem than an objectionable face on the Pogchamp emote: then-Twitch superstar Guy “Dr Disrespect” Beahm, one of the most popular streamers on Twitch with nearly 4.5 million followers, spent a chunk of a stream espousing dangerously false covid-19 conspiracy theories. Many viewers were shocked and disappointed, while others cheered him on. For this, he faced no public consequence, nor did Twitch try to supply viewers with more accurate information, as other platforms have taken to doing. Beahm was banned from Twitch later in the year for reasons that remain unclear to the public, but multiple sources have told Kotaku that it had nothing to do with conspiracy theories. Unsurprisingly, this has resulted in a bevy of conspiracy theories.
At the time of Beahm’s conspiracy-addled rant, other, smaller streamers had received suspensions for things as minor as jokes about covid. But over the years, Twitch has earned a reputation for inconsistent application of its rules, leading to repeated accusations of favoritism toward big streamers — the ones who stand to do the most damage where conspiracy theories and hateful conduct are concerned.
Conspiracy To Uphold The Status Quo
Twitch has not done much better with smaller, emerging threats. Multiple times last year, I reported on the Twitch presence of Patriots’ Soapbox (PSB), an organisation credited with having played a key role in popularising the QAnon mega-conspiracy that has directly resulted in, among many, many other things, last week’s Capitol Building insurrection. PSB’s Twitch channel regularly parroted hateful QAnon talking points and covid-related conspiracy theories. It also facilitated the collaborative construction of new conspiracy theories in chat. But even though Twitch eventually took notice and suspended PSB multiple times during the latter portion of 2020, it kept getting further chances.
In October, YouTube permanently banned Patriots’ Soapbox, an event which coincided with a tremendous increase in the organisation’s Twitch viewership — from around 20-50 concurrent viewers at any point during its ‘round-the-clock streams to peaks of 500+ concurrent viewers, adding up to thousands in total. Twitch proceeded to suspend the channel again in November, seemingly once and for all. But then, just a week later, it was back again, to baffled cries from Twitch users. It was not until early December that Twitch finally banned Patriots’ Soapbox for real, two months after YouTube had given the plainly conspiratorial channel the boot. In the time it takes to construct and disseminate conspiracy theories, that’s an eternity.
Twitch never publicly said anything about PSB’s permaban. Kotaku reached out for comment on multiple occasions, including last week, but Twitch did not reply to PSB-specific questions. In regard to covid-related conspiracies, a Twitch spokesperson told Kotaku in May of last year, “We take action on content related to covid-19 that encourages or incites self-destructive behaviour, attempts or threatens to physically harm others, or any hateful conduct,” and replied to questions about a September PSB suspension by saying, “The safety of our community is our top priority, and we reserve the right to suspend any account for conduct that violates our rules, or that we determine to be inappropriate, harmful, or puts our community at risk.” It has never gotten any more specific than that, nor has it explained why a channel that repeatedly broke those rules didn’t get dumped into the abyss any sooner. The closest Twitch has recently gotten to outlining a specific policy around these things is in its new rules, where it says that “hate groups and hate group propaganda” will not be allowed. But will conspiracies be considered hate groups? And again, what will enforcement look like? At this point, it’s unclear.
Twitch’s silent, scattershot approach to conspiracy theories and hateful far-right rhetoric is hardly limited to the aforementioned channels. If you go on Twitch and search just about any popular conspiracy hashtag — #StopTheSteal, variations on QAnon’s “Where we go one, we go all” slogan — you can find hundreds of videos and clips. They don’t have many views, but they’re there. Similarly, it’s not difficult to find small streamers in Twitch’s politics section who will argue up and down that Trump actually won the election, the conspiracy theory at the root of last week’s insurrection.
The keyword in all of this is “small.” Twitch’s largely non-algorithmic structure and focus on livestreaming have helped it dodge the sorts of big conspiracy influencers that overtook YouTube, Facebook, and Twitter, but conspiratorial cultures remain perniciously present in other ways. Twitch is especially susceptible to homegrown conspiracy theories, of which there are already many surrounding specific streamers, due to Twitch’s insistence on meting out career-altering suspensions and bans without explaining why to the public. The recent, seemingly endless torrent of DMCA takedowns — some of which have gotten streamers suspended, others of which have not — has not helped this issue. Twitch’s monolithic culture also intersects with and influences the chats of other streaming platforms, including DLive, which until shortly after the insurrection played host to far-right figures who’d been banned from platforms like YouTube and Twitter. For evidence of this, you need look only as far as the language viewers used in far-right personalities’ chats on DLive and other platforms, which included references to Twitch emotes and slang alongside openly hateful, discriminatory messages.
Doing The Same Thing And Expecting Different Results
There are a multitude of problems with Twitch’s slow-mo whack-a-mole approach to moderation, one of which is that it gives offending channels time to grow audiences, which, even if they do get banned, they can just bring with them to other platforms. On top of that, Twitch’s approach does not isolate a specific problem, more often relying on a vague definition of “hateful conduct” to eventually do the job. It is unclear where the line is, or indeed, if there is a line beyond, “what Twitch does not like at the time.”
Perhaps Twitch’s new anti-harassment rules will force it to give more specific explanations. For now, though, this means conspiracy-brained far-right channels rarely grow too big before getting the boot, but that is due just as much to the leanings of Twitch’s most popular creators, who will not collaborate with or boost far-right channels, as it is to Twitch’s direct intervention. Meanwhile, those who already have big audiences or ingrained Twitch clout — people like Beahm and Gutierrez — can grow into much bigger problems that leave Twitch hesitant to act until after damage has been done. More troublingly, this unfocused approach very evidently allows the sorts of people who perpetuate conspiratorial views and harass streamers based on ideas like “racial supremacy” to remain in Twitch’s orbit. They are still a significant part of the culture, whether Twitch wants to acknowledge and specifically target them or not. As long as Twitch fails to do so, those people will continue to find ways to torment streamers like Critical Bard, leaving countless others feeling unsafe.
Without specific, transparent, comprehensively enforced policies in place, it is entirely possible that the next QAnon, whatever form it ends up taking, could leverage the conspiratorial side of Twitch’s culture to jumpstart its growth. It certainly has ample avenues through which to do so. But even if that kind of worst case scenario never comes to pass, the status quo has already proven itself untenable. Twitch cannot simply change Pogchamp’s face, suspend Trump, and fly the ol’ “Mission Accomplished” banner. We’re long past the point of too little, too late, and Twitch has more than enough information to understand how it and its creators are vulnerable. If it does not adopt a more proactive and precise stance, then it’s complicit in whatever happens next.