Video games will always manipulate us. Each challenge and scenario in a game has been carefully engineered to make us react a certain way. Most of the time, that’s what we sign up for. But the moment real money enters the equation, something changes.
In-game purchases, also known as microtransactions, have been at the heart of several colossal fan freakouts. Take Destiny 2, where developer Bungie changed the game’s experience points system after players discovered it was invisibly throttling their progress, which among other things slowed the pace at which they could earn loot boxes they’d otherwise have to pay for.
Around the same time, Star Wars Battlefront II ignited a furious online backlash after pre-release coverage revealed that the game's purchasable loot boxes contained power-ups that made you more effective on the battlefield. A month earlier, Middle-Earth: Shadow of War courted similar controversy with a convoluted loot box scheme that gave players gear and soldiers for their Orc army. The backlash was strong enough that Shadow of War eventually removed its loot box system entirely.
The NBA 2K series is so riddled with microtransactions that they significantly detract from the experience of playing it. Less controversial games like Ubisoft's Assassin's Creed Origins still gave players the option to pay extra money for better in-game gear. Odyssey, a game that launched in Australia for $79, had optional purchases of its own surpassing $100.
The debate about in-game purchases predates any of those games.
Way back in 2009, my boss, Stephen Totilo, wrote of a microtransaction-laden Final Fantasy Crystal Chronicles spin-off: “Is there a dirty trick being played here on gamers? Who knows. There is the possibility. That stinks enough.” And in 2021 we’re still asking that question, even if the particulars have changed.
Viewed up close, the differences between the microtransactions in each of the fall games I mentioned above are manifold and relevant. Randomised loot boxes are inherently more exploitative than direct purchases.
“Pay-to-win” systems that give tangible gameplay advantages are more of an obvious problem than systems that revolve around cosmetic items. Microtransactions in full-priced games leave a more bitter taste than in free-to-play games. Zoom out a bit, however, and those differences matter less. Whatever form in-game purchases take, their mere existence damages the trust between people who play games and people who make them. Like Stephen wrote in 2009, the possibility of dirty tricks stinks enough that the tricks themselves are almost beside the point.
Every game with a microtransaction system is a player revolt waiting to happen.
That’s more true of full-priced games than free-to-play ones, but making a game free doesn’t necessarily make players feel any less taken advantage of. To see that in action, look no further than Destiny 2, whose Eververse microtransaction hub was first seen as innocuous until the XP revelations. Seemingly overnight, the grumbles of a few players about the Eververse amplified into a roar.
The reason is simple: Whenever some aspect of the game is locked behind a real-money paywall, every decision the developers make will be suspect. All games are designed to make us feel one way or another, and most operate according to calculations and algorithms that are hidden from the player’s eye. But when real money is involved, those hidden systems take on a more sinister quality. And over the years, I’ve seen a common refrain from players: What else aren’t they telling us?
Microtransactions have become so prevalent in mainstream video games that it’s easy to forget what unnatural appendages they are. They’re not exactly “game design,” are they? They’re more of a marketplace experiment, albeit one that requires game design to function. It can be helpful to break the concept down to its basics, as a reminder of how in-game purchases can warp a player’s relationship with a game.
Imagine a video game boss fight. It’s this big monster that you have to beat, and he’s really tough. He defeats you over and over again, because he’s meant to test your abilities. You keep coming back, gradually learning his patterns and eventually overcoming him. After he goes down, you get a sweet new helmet. You equip the helmet and move on to the next challenge.
Now imagine the same boss fight, with one difference: If you want, you can just buy the helmet for $5. Just like that, the fight seems different. On your third or fourth death, you might begin to question the motives of the people who made it. Why is this boss so hard? Was he designed to be difficult in order to test your skill, or because the designers wanted you to eventually just give in and pay for the helmet? Is the game playing fair, or are you being manipulated?
Those two examples illustrate in the simplest terms the underlying problem with video game microtransactions. They will always have a detrimental effect on games, because they call into question a game’s integrity. They may be immensely profitable, and thus a smart business move. They may make it possible for a development studio to stay in business and keep making games. They may even appeal to a subset of gamers who like being able to pay for extra goodies. But in exchange for all that, game developers must be willing to undermine the design of their game on a fundamental level.
Games are built on trust between the designer and the player. We agree to play by the rules and in doing so, agree to trust the people who devised them. That agreement forms the basis for the rest of our relationship with a game, however lengthy or fleeting it may be. In-game purchases call our trust into question.
Some games will add microtransactions the “right” way, and won’t kick up much fuss. Others will get it “wrong,” prompting the next great Internet backlash. Still others might start out OK before making a change for the worse, or vice versa. The particulars will always be different. But the moment a player is given the option to pay extra for something they’d otherwise have to earn, the damage is done.