For years, Steam has struggled with the issue of review bombing, where large numbers of users leave negative reviews on games’ pages, often because of controversial subject matter or news that doesn’t actually impact the game’s quality. Today, Valve finally addressed the problem. Sorta.
In a new blog post, Valve announced that it’s decided to try and disarm review bombs — which it said generally only temporarily crater a game’s review score, but sometimes do lasting damage — not by changing user reviews or the way they function, but by adding some graphs.
“In the end, we decided not to change the ways that players can review games, and instead focused on how potential purchasers can explore the review data,” wrote Valve’s Alden Kroll.
“Starting today, each game page now contains a histogram of the positive to negative ratio of reviews over the entire lifetime of the game, and by clicking on any part of the histogram you’re able to read a sample of the reviews from that time period.”
“As a potential purchaser, it’s easy to spot temporary distortions in the reviews, to investigate why that distortion occurred, and decide for yourself whether it’s something you care about,” he added.
“This approach has the advantage of never preventing anyone from submitting a review, but does require slightly more effort on the part of potential purchasers.”
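Mechanically, the histogram Kroll describes amounts to bucketing reviews by time period and counting positives and negatives per bucket. Here's a minimal sketch of that idea (monthly buckets and the data shape are my assumptions; this is not Valve's code):

```python
from collections import defaultdict

def review_histogram(reviews):
    """Bucket reviews by month into [positive, negative] counts.

    `reviews` is a list of (timestamp: datetime, is_positive: bool) pairs.
    Monthly buckets are an illustrative choice; Valve's actual bucketing
    granularity is unspecified.
    """
    buckets = defaultdict(lambda: [0, 0])
    for ts, is_positive in reviews:
        key = (ts.year, ts.month)  # one bucket per calendar month
        buckets[key][0 if is_positive else 1] += 1
    # return buckets in chronological order
    return dict(sorted(buckets.items()))
```

A sudden bucket with a lopsided negative count is exactly the "temporary distortion" a purchaser would click into for a sample of reviews.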
He also noted that Valve considered other solutions like removing review scores, halting reviews on games during time periods that appeared to be review bombs, and changing the way user review scores are calculated.
Ultimately, though, Kroll and co. felt that all of those options were either too invasive or made review scores less representative, which is why they went with graphs instead.
So basically, Valve saw a situation in which people were manipulating data and decided to add more data. It’s the most Valve solution ever.
I’m not gonna beat around the bush here: I don’t think this is a great way to stop review bombs, which most recently tanked Firewatch’s review score in the wake of the PewDiePie controversy, but have also dinged everything from indie game Titan Souls after a beef with YouTuber TotalBiscuit to Baldur’s Gate: Siege of Dragonspear after people found out the game contained a trans character.
Even with these changes in place, review bombers will still be able to exert undue influence on games’ scores, which will remain a metric that dilutes conversation around games down to overly simplistic factors like “is it long” and “does it do the graphics.”
Not only that, review scores affect the way games are regarded by the Steam store’s increasingly important algorithms, meaning that review bombs can damage their chances at success in that way, too.
And as ever, smaller developers will remain most susceptible to the ravages of review bombs, something that will have an at least subliminal (if not overt) influence on their creative choices and actions, given that their livelihood might be at stake.
In some cases, yes, users see a low score — or a high overall score and low recent score — and decide to do some digging. They generally skim a handful of reviews to get the gist of what’s going on. I’m not sure how much graphs really change that process, though.
If you’re a knowledgeable Steam user, it’s not hard to figure out when a game is being review bombed. If you’re not super aware of how Steam works or the reasons a game might be getting bombed, I’m not sure you’re gonna start looking for graphs in an obscure tab near the bottom of a game’s page.
I imagine people like that would see the score, figure the game’s got some issues, and move on. I don’t really know who this is for, is what I’m saying.
The problem with review bombs isn’t just one of awareness. It’s also the damage they can do. Valve has, at best, only addressed the first half of the problem, and in a way that doesn’t even strike me as particularly useful.
I suppose we’ll see what happens, though.
Comments
16 responses to “Valve’s Solution To Steam Review Bombing Is To Add Charts”
If there ever was a way to make everyone happy this would be it… but knowing the gaming community this will somehow magically not be it.
As a software engineer I think I need to stress that this is an incredibly difficult problem to solve properly, but I agree that I don’t think adding charts will really help.
Problems I see (rhetorical – seriously, don’t try to post responses to these; I know there are strategies, but they won’t be perfect):
– Just looking at the raw data, how would you know if the review bomb is legitimate or not? (developer introduces a huge bug in the last update, vs “political” review bombs)
– When is a review bomb a review bomb? What if the reviews suddenly drop to neutral instead of strictly negative?
– The intensity of a review bomb is probably seasonal; you’d need to make sure the algorithm doesn’t skip “off-peak” review bombs
– Should you use signals to determine if the review bomb is warranted? Did the developer stop updating the game, etc.
Most of those points are already addressed in the article – Hovering over the period in question gives a sample of reviews from that time so you can see what they’re about. Presumably clicking on that section will show all of them.
The charts serve to highlight the problematic period and make it easy for potential purchasers to see what was going on, and to judge whether the negative scores are warranted or due to external factors that they personally don’t care about.
Doesn’t fix the overall score but it at least lets the customer make an informed decision.
Thank you for explaining charts to me.
Well it seemed like you needed it after your comment. Either that or a reading comprehension lesson.
I think the author misses the point. The graph isn’t there to stop review bombs, it’s there to add context to them. It doesn’t tell people what to think; it just gives them the information they need to evaluate whether some reviews are worth disregarding. That evaluation is still down to the individual based on their personal values.
The chart counts reviews themselves, not the aggregate score. Reviews can only be positive or negative; there is no neutral option. It doesn’t matter if the aggregate score goes from positive to negative, positive to neutral or very positive to less positive; all the equation cares about is whether a numerically significant number of negative reviews were submitted in a statistically relevant timeframe.
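A rough sketch of what that kind of detection might look like (the window size, thresholds and spike ratio are illustrative guesses, not Valve’s actual parameters):

```python
from datetime import datetime, timedelta

def detect_review_bombs(reviews, window_days=7, min_negative=50, spike_ratio=3.0):
    """Flag windows where negative reviews spike well above the baseline rate.

    `reviews` is a list of (timestamp: datetime, is_positive: bool) pairs.
    All thresholds here are hypothetical, chosen only to illustrate the idea
    of "a significant number of negatives in a short timeframe".
    """
    if not reviews:
        return []
    reviews = sorted(reviews)
    start, end = reviews[0][0], reviews[-1][0]
    total_days = max((end - start).days, 1)
    total_negative = sum(1 for _, pos in reviews if not pos)
    # expected negatives per window if they were spread evenly over the lifetime
    baseline_per_window = total_negative / total_days * window_days

    flagged = []
    t = start
    while t <= end:
        window_end = t + timedelta(days=window_days)
        neg = sum(1 for ts, pos in reviews if not pos and t <= ts < window_end)
        if neg >= min_negative and neg > spike_ratio * baseline_per_window:
            flagged.append((t, window_end, neg))
        t = window_end
    return flagged
```

Even this toy version shows the rhetorical problems above: it can tell you *that* a spike happened, but not whether the spike was a broken update or a culture-war pile-on.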
I’m also a software engineer, and I think it’s a great addition.
“So basically, Valve saw a situation in which people were manipulating data and decided to add more data. It’s the most Valve solution ever.”
Wrong way to look at it, completely. From what I can ascertain, what they’ve done is include a way to view the data by month (or whatever).
The only reason you’d criticise change like this is if you agree with the sentiment that consumers are stupid and can’t be trusted with their own judgement.
Ultimately this is all going to be bandage work until people just stop being knobs – both the review bombers and the higher-ups that probably get off on seeing high Rotten Tomatoes scores, which are without a doubt the reason developers and publishers give a shit about these ratings in the first place.
But the Steam community is shit. User reviews are garbage – joke titles getting positive reviews just to post a meme, review bombing, accusations of bias or buying reviews… so much of it is pointless noise that the entire thing has become debased as a measure of a game’s worth. I don’t know how this can be fixed though.
The answer is skim reading reviews, and a quick hop into the Discussions page.
Few people want to do that. There’s an arseload of games to look at – if it doesn’t grab me early on, I probably have better things to look at. Valve’s plan to leave everything up to the community is a bad idea and now they have to come and fix it.
There IS no other solution. Reviewing is way too subjective to get meaningful data out of it. Anyone who thinks a review can be objective is kidding themselves.
I detailed the steps below, and it’s basically about filters. A whole lot of shit doesn’t ever pass the first three steps of, “Do I like the title/genre/screenshots?”
If it’s getting any further than that, or a quick skim of the reviews (to get a vague impression of whether people are complaining about culture wars bullshit, memeing, or the developer’s idiotic PR stunt vs anything about the game itself), that’s worth the time spent.
You’ve posted a lot of stuff down below – but that’s probably more time than most people are willing to invest. I’m not going to trawl through the store pages for something I might or might not want. I’ve got a load of games to play, with more coming out all the time. Steam is saturated with titles. I’ll probably read the first few reviews and look at the ratio of positive to negative reviews, and then stop reading the store page – usually because subsequent reviews are all variations on the first few, or jokes.
There’s probably no other way for Valve to fix it – but that doesn’t mean that user reviews aren’t broken or a bad metric as it stands (especially for popular titles). You know how I find out about games these days? Sites like Kotaku or YouTubers bringing them to my attention. If I hear about something that looks interesting, I go watch footage of it on YouTube and decide whether it’s something I’d like to play. Given that devs are now using streamers as advertising for the cost of a Steam key, I’d wager that a lot of other people do this, too.
There is an alternative option – look at the overall score as a loose guide only, then use curators and/or favourite reviewers. You’re right that all reviews are necessarily subjective, so a good way to find relevant ones is to select people whose taste in games has a lot in common with yours, and read their reviews specifically. Personally I find TotalBiscuit and Jim Sterling to give great reviews even when our tastes differ, so I subscribe to both of them as curators and I follow through to their written or video reviews via the curator system.
My method for evaluating a Steam purchase:
1) Is the art/title catchy/sound like a genre I’m into?
YES. Judge all books by their covers. Rule 1. Seriously, you’d be surprised. Also goes for titles you may have already heard about, but hey, here it is on the store page!
2) Are the tags for a genre I’m into?
I personally filter out anything that has MOBA, Sports, Racing, you might filter out visual novel or RPGmaker. RPG is making me more interested than ‘indie roguelike’. Does it have ‘onlinePVP’ tags, but you’ve gotta hunt for Single-player? They probably only added it to tick a box, but you know where their focus is.
3) Are the screenshots interesting/informative?
Oh, that sounded like a cool anime/cyberpunk RPG, but from the screenshots it looks like it’s a bullet-hell shooter. Also a good way to spot visual novels/RPGmaker titles. Does it have text and concept art superimposed over the screenshots? It’s a shitty mobile game port. Fucking avoid and/or report it to Steam as a violation of their T&Cs. Same goes for ‘photo mode’ dramatized screenshots that hide the UI and just show off skill shots or posing in-game. Do not reward those fucks for their deceit.
4) Is the price right? Discounted heavily?
Doesn’t necessarily stop me looking at the page if the first three points caught my interest, but if the first three DIDN’T get my interest, a low price can still capture it.
Beat the first three hurdles from the front page? Let’s jump into the store page for the title.
5) What’s the overall review score?
Mostly positive/negative doesn’t mean shit when there’s only 20 reviews up. You might need to wait, or look closer. 1000+ reviews? Now that’s more useful. Compare recent to lifetime. If Lifetime is good, but Recent is bad, they probably pissed everyone off with an update or PR snafu. Or they were in Early Access/crowdfund-backer-only beta and pissed off the backers by not meeting promises or letting johnny-come-lately scrubs get access to the same rewards the special snowflake backers wanted to keep exclusive. Either way, worth looking at some actual reviews, in a minute. Be aware that almost every F2P title will rate lower than a box-priced game of equivalent quality because there will always be a negatively-rating, “Everything should be free, this is P2W” brigade who don’t think any amount of money is reasonable to spend for a proper game experience. Fuck them.
6) If there’s a video that looks like it features gameplay and not just cinematic concept art, watch it.
You could also go look up a Let’s Play on YouTube, but the video is RIGHT THERE. Just click it. Move the volume slider up.
7) Let’s go looking at the actual reviews. Just the first few overall are probably fine, maybe take a peek at the recents down the side.
Reading comprehension and a working knowledge of human behaviour are your friends here. Tonnes of brief, pithy, irreverent, “Did something stupid, 10/10” reviews? Why, hello idiotic youtube/zeitgeist fad. Dig deeper, switch to negative-only reviews which will be more honest. Is every negative review complaining about SJWs/MRAs? Fucking culture war bullshit is getting in the way. Watch a Let’s Play instead. Are most of the negative reviews talking about bugs? Maybe wishlist/follow it. Or decide if you’re cool with it if the first few points REALLY caught your interest. Are they complaining about price? Wishlist, unless you don’t give a shit. You know your budget. In general, negative reviews will give you more useful information about whether a game contains your personal deal-breakers. If the only complaints everyone has are about some irrelevant bullshit you don’t care about? Well, you might be on to a winning purchase!
8) Still undecided, it looks kinda cool, but price is a bit high, bugs are a bit concerning, complaints about lack of content leaving you thinking you might wait for the season pass?
MAYBE go into the discussions.
You wanna look for the stickied threads first, see if the questions you want answered are there. Maybe the first page or two has your question, too. How’s the single-player, is it viable? What’s the multiplayer population like? Oceanic servers? Have the developers abandoned the game and stopped doing updates, leaving a ‘finished’ Early Access title that essentially failed development? This is where you’ll find out about it.
If still in doubt… Wishlist.
Fun fact: developers can and do look at how many people have wish-listed their games. Your expression of interest might lead to a price drop or posts about content updates/bug fixes.
You totally picked this out of my brain 😛
Aside from what I mentioned above about using curators, this process is basically the same as what I do. I’ll skip over any review that’s a one liner in step 7, but I’ll deliberately read the two top rated properly written positive reviews, and switch to negative only and read the two top rated properly written negative reviews.
On point 5, you might find SteamDB useful for supplementary information. It uses the Wilson confidence interval to generate a rating that gives better context to low numbers of reviews versus high numbers of reviews. I’d love for Steam to adopt the formula, I think straight ‘positive / total’ scoring systems are too bare bones.
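For the curious, the Wilson score lower bound mentioned above is a standard formula; here’s a small sketch of it (the function name is mine, and this is the general statistical formula, not SteamDB’s actual implementation):

```python
import math

def wilson_lower_bound(positive, total, z=1.96):
    """Lower bound of the Wilson score interval for a proportion.

    With z = 1.96 this approximates a 95% confidence lower bound.
    Unlike a raw positive/total ratio, it penalises small samples:
    8 positives out of 10 ranks well below 800 out of 1000, even
    though both are a raw 80%.
    """
    if total == 0:
        return 0.0
    phat = positive / total
    z2 = z * z
    denom = 1 + z2 / total
    centre = phat + z2 / (2 * total)
    margin = z * math.sqrt((phat * (1 - phat) + z2 / (4 * total)) / total)
    return (centre - margin) / denom
```

The design point is exactly what the comment says: a straight ‘positive / total’ score treats 20 reviews and 2,000 reviews as equally trustworthy, while the Wilson bound builds the sample size into the rating itself.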