Big Study Of 10 Million Steam Reviews Is Absolutely Fascinating

Steam reviews have a reputation for being whiny garbage, but they can still be really useful when the trash is filtered out and some serious data extracted. Which is exactly what this in-depth study of the service has found.

"An Empirical Study of Game Reviews on the Steam Platform" is the name of a study published by Dayi Lin, Cor-Paul Bezemer, Ying Zou and Ahmed E. Hassan from Queen's University in Canada. While conducted with mobile app reviews in mind - seeing how the underlying trends between the App Store and Google Play Store compare with Steam - the data they have extracted is still pretty useful for anyone who has ever released a game on Steam, looked at their reviews, and thought "oh no".

The team looked at every review left on 6224 of the games available in early 2016, excluding only those with 25 reviews or fewer, so as to lessen the chances of bias skewing the results.

They then developed custom crawlers, first to collect the updates to each of those games, then to dig into the 10,954,956 reviews (across all languages, with 6,768,768 in English) and cut out "non-informative" ones that featured no useful text, or just something like an emoji. Even the amount of time played before leaving a review was taken into account:

However, the number of playing hours (i.e., the number of hours that the reviewer played the game) that is shown with each review is not the number of playing hours at the time of posting the review, but the number of playing hours until now. Hence, in order to study the timing of gamers posting reviews, we developed another real-time crawler which only crawls reviews that are received within the last 6 minutes of the time of crawling, to collect reviews that have an accurate number of playing hours. Therefore, we were able to collect the dataset with an error margin of 6 minutes. We ran the real-time crawler for a month and collected 28,159 reviews with an accurate number of playing hours.
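The method in that quote boils down to a sliding time window: only reviews posted within the last six minutes of a crawl are kept, so the playtime attached to each one is accurate to within that margin. A minimal sketch of the idea in Python (the field names and data shape here are my own invention, not the paper's actual schema):

```python
import time

WINDOW_SECONDS = 6 * 60  # the 6-minute error margin described in the study


def fresh_reviews(reviews, now=None):
    """Keep only reviews posted within the last 6 minutes, so the playtime
    recorded with each review is accurate to within that window.

    `reviews` is a list of dicts with hypothetical 'timestamp' (Unix seconds)
    and 'playtime_hours' keys -- assumed field names, not the paper's.
    """
    now = time.time() if now is None else now
    return [r for r in reviews if now - r["timestamp"] <= WINDOW_SECONDS]


# Example: one review posted 3 minutes ago, one posted an hour ago.
sample = [
    {"timestamp": 1_000_000 - 180, "playtime_hours": 2.5},
    {"timestamp": 1_000_000 - 3600, "playtime_hours": 40.0},
]
recent = fresh_reviews(sample, now=1_000_000)
```

Run continuously (the team ran theirs for a month), a crawler like this only ever records playtime figures that are at most six minutes stale.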

Reviews were then sorted by the scope of the game (indie or big studio), whether it was in early access, whether it was free-to-play or paid upfront, and whether the review was positive or negative.
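The "non-informative" filter the team describes could be as simple as requiring a minimum amount of actual text before a review counts. A rough sketch of that idea, assuming a plain list of review strings (the threshold and heuristic are mine, not the paper's):

```python
def is_informative(text: str, min_alpha: int = 10) -> bool:
    """Crude stand-in for the study's filter (my heuristic, not theirs):
    a review counts as informative only if it contains at least
    `min_alpha` alphabetic characters, which drops empty or emoji-only
    reviews, since emoji are not alphabetic characters."""
    alpha_count = sum(ch.isalpha() for ch in text)
    return alpha_count >= min_alpha


reviews = [
    "😂😂😂",                                            # emoji only: dropped
    "gg",                                                # too short: dropped
    "Great level design but the boss fights drag on.",   # kept
]
kept = [r for r in reviews if is_informative(r)]
```

A real filter would be more careful (language detection, deduplication), but even this crude cut removes the reviews that carry no feedback at all.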

Digging into the specifics, we get to the really interesting stuff. Some of the team's findings include:

  • "Negative reviews are slightly longer than positive reviews, but the difference is negligible."
  • "Early access reviews are slightly longer than non-early access reviews." That's because users playing early access games are often leaving feedback alongside overall sentiments. It was also found that early access reviews tended to be more positive than reviews for completed titles.
  • "Players write longer reviews for games for which they paid". Free-to-play games averaged 105 characters per review, while paid games averaged 215 characters per review. "One possible explanation is that paying for a game makes players feel more strongly about that game."
  • The language skills employed in reviews had a "median readability level" equivalent to that of an American Year 8 student. Which, I'll be honest, is a lot higher than I was expecting.

Once overall statistics and number-crunching had been performed, 472 reviews were chosen for a deep dive based on their tone and substance, and broken down into one of six types.

The highlights of this part of the study include:

  • Only 42 per cent of reviews provide information deemed valuable to developers looking for feedback on how to improve their game, or games in the future. That sounds like a small percentage, but it's actually hugely useful, because it shows that for games with hundreds of reviews, there can be dozens (and even hundreds) of reviews that contain useful information, even if they're negative.
  • "Players complain more about game design than bugs." Thirty-four per cent of reviews talk about negatives related to the design of a game, while only eight per cent mention bugs or technical woes, "suggesting that players value a well-designed gameplay over software quality".
  • "Negative reviews contain more valuable information about the negative aspects of a game for developers", suggesting that for all the pain it might cause, digging through negative reviews of a game will be more useful to a developer than going through the positive ones.
  • "Gamers play a game for a median of 13.5 hours before posting a review". And surprising absolutely nobody, negative reviews of games are submitted "after significantly less playing hours than positive reviews".

The conclusion the team came to was that, once you cut through the noise, Steam reviews can be really important to developers. They just have to look at the trends in the data instead of focusing simply on how many negative vs positive reviews their game has received.

You can read the full study here.


Comments

    Yeah, I'd like to give my review of this article that is a review of other people's reviews:
    Interesting. Some of the grammar is a bit buggy ;)

    "Negative reviews are slightly longer than positive reviews, but the difference is negligible."

    Mind = Blown.

    This review of the study of Steam reviews made me bang my head on my desk at around the third paragraph. I would leave it until it is patched.

    Oh hey, look, it's basically what I've been saying about the usefulness of reviews (especially negative over positive) when evaluating a purchase! :D

    I'd have selected slightly different categories, though, or at least included additional ones for political bandwagoning. Protest reviews from crowdfunding backers/beta-testers/culture war activists etc will tend to skew the results and are almost always worth ignoring.
