That game you bought last month isn't the same game today. It's almost certainly been patched, updated, rebalanced, and tweaked. The original version is shrinking from view as it washes downriver. But is it better now? Truer? Should you have played more of it back when you bought it, or waited?
Most modern video games exist in a constant state of flux. That includes competitive online games like Overwatch and Splatoon as well as single-player games like Breath of the Wild and The Witcher 3. Deciding to play a new game means deciding to play this version of the game, now. If you want to play it today, you play today's version.
Adjusting to this new way of playing video games means embracing an era of ambiguity. Nothing is finished; nothing is final. Should I have waited for Nintendo to patch Breath of the Wild to run at a stable frame-rate before diving in? Or should I have waited for the first major DLC update? Should I still be waiting for whatever they will add later this year? There's no way to be certain.
In 2012, my boss Stephen Totilo wrote an article responding to the game developer BioWare's controversial decision to alter the ending to their game Mass Effect 3. The question, as he described it at the time, was whether we as players would recognise game developers as "artists of a malleable medium."
In 2015, my colleague Jason Schreier and I set out to complete "The Mountaintop," an arduous quest involving the competitive portion of the online shooter Destiny. Many hours later, he finished it; I took a break at about 75%. Not too long after that, the developers at Bungie patched the quest to make it much easier to complete.
I still remember how pissed he sounded over chat as he realised that he'd just spent hours beating his head against a problem that no longer existed. I coasted through the last quarter of the quest, glad I had stopped when I did.
In 2017, if you visit the discussion forums for any major video game you'll see multiple threads complaining about the latest changes; eviscerating the newest patch notes; yearning for a long-past era when the game was just fine and didn't need this latest update in the first place thank you very much. Those arguments stand shoulder to shoulder with others imploring the developers to change the game in some other way. We the audience no longer simply consume; we participate.
Most modern single- and multiplayer video games are better thought of as processes than objects. Thanks to a glut of methods for communicating feedback, audience members — and this goes well beyond video games and into film, music, and art — are newly empowered to challenge artists and try to get them to change their art. Modern game developers go to great pains to let players know they're listening. Their financiers speak in aspirational terms of "games as a service," hoping to launch products that retain audiences for months or years.
They release unfinished games into early access, enlisting their audience to help them improve the final product. They send representatives onto forums and subreddits, hosting Q&A sessions and hopping into threads to respond and engage. We gamers know that if we want something changed and make a convincing (or noisy) enough case, there's a chance that we could actually see our wants become reality.
That is by and large a positive — and certainly interesting — development, but each new update requires more time for us to process than we may be given. We express initial scepticism, we give it the benefit of the doubt, we imagine why it's happening, we try it out and see how it feels, we accept or reject it. Usually by then, the next change has walked up, shaken the Etch-a-Sketch, and started us over.
All that change has helped me understand the appeal of retro games, and in particular cloistered retro hardware like the upcoming Super Nintendo Classic. Here's a small box that will play 21 Nintendo classics from the 1990s. It doesn't connect to the internet. It doesn't receive periodic updates. It just is. Likewise the games you can play on it are the same as they were twenty years ago. There is no best time to play them because it's always the best time to play them.
As we draw closer to the September launch of Destiny 2, I've been reflecting on the three years I've spent writing about its predecessor. The Destiny that I play now exists in the same spot on my PS4 hard drive as the one I reviewed back in 2014, but the similarities end there. Even those of us who have been with Destiny from the start could never hope to reach consensus on when was the 'best' time to play.
Was it in year one, when everything was new and exciting and Peter Dinklage was still in the game? Or early in year two, when the game hit its creative peak? Or was it during year three, when a new player would find the richest variety of things to do?
When deciding when to play a new game, it's possible to be too early. You're left dealing with bugs that will later be squashed and tripping over potholes that will later be filled in. It's also possible to be too late, starting an online game just as most players have moved on to a sequel or the developers are about to shut off the servers. There are some best practices, of course. Don't rush to buy a game on day one. Wait to hear what kinds of changes the developers are planning before you commit.
Better still, just wait for a sale. It usually won't take long, and as a bonus, you'll probably get to the game after at least one major update.
Playing video games in 2017 means reconciling yourself to the idea that you will almost never play a game at the optimal time. You will sometimes be too early; you will sometimes be too late. Everything is being updated and re-updated, a cavalcade of eager tweets and blog posts and notifications informing us of the latest LATEST version of the latest thing. The only thing to do is ride the river as best we can.