Review Scores Are Broken — Here Are Some Sensible Solutions


Review scores, eh? First there was the outcry over Eurogamer’s 8/10 Uncharted 3 review (a review that I personally really enjoyed reading), and then Gamespot’s 7.5/10 for Skyward Sword. Bloody review scores — everyone knows they’re broken. Kotaku reader and GameTaco stalwart Smoolander has some top-notch ideas about how we can change them!

We Need A New Review Score

Review scores are broken.

I know this because the internet told me, and the internet has never led me astray before. There are so many examples of broken reviews amongst the plethora of games recently released that I am currently going deaf due to the shrieking of “SHILL” on every second web page.

Many people have come out in defence of review scores, blaming the public for misunderstanding them, and others have pointed the finger at the clearly biased people who assign them. Undeniably, to any impartial observer, this whole situation is now a complete catastrophe that, if left unchecked, will tear a hole in the very fabric of space and suck the entire internet into it.

Some people have tried to offer helpful suggestions as to how to fix the problem – Matt at Australian Gamer provided some advice to assist dodgy reviewers – yet everything that has been put forward still relates to the current method of scoring games.

This is a vicious scenario, and for the gaming community to fix it we need to take drastic action. We need to throw out the current way we rate a game and devise a new foolproof rating scheme that will be above criticism. While everyone has been throwing pointless arguments into the void, I have thankfully been formulating new methods that will be above reproach.

Now we only need to agree on which of the three perfect systems should be adopted, and all our issues on the internet will be solved.


The Orwellian Score

Named after the brilliant man himself, George Orwell, this scoring system ensures that everybody will be happy. Using a patented 10-to-15-point rating scale, it acknowledges that every game is perfect, but some games are more perfect than others.

Even if another game scores higher than your new favourite game, your new favourite game is still perfect. Nobody loses. Publishers are happy, fans are happy, and reviewers can’t possibly get it wrong.


The Sir David Attenborough Score

Sir David Attenborough is a highly respected man throughout the world, and the love he has for the animals and vegetation of this planet is obvious. So what better way to score games than by implementing a system that would be equally as respected and loving as Sir Attenborough himself?

No more howls of derision when a game “only” receives an 8/10; instead, everyone will be required to look at games in the same loving way that Sir Attenborough looks at nature and assign each game the score of an animal. In scientific terms.

“I award this game an Arachnida Scorpionida.” “This game is nothing short of a Camelus Camelidae.”

This scoring system includes the bonus of Sir David Attenborough narrating video reviews of games himself. How can this not be the perfect method that we should adopt?


The Healthy Score

Similar to the Sir David Attenborough scoring system, this review method will adopt a much healthier approach to game scores and remove those negative stereotypes about gamers being unfit and overweight.

Nutritionists encourage us all to consume five serves of vegetables and two serves of fruit each day. What better way to achieve this than to rate every game with a vegetable or fruit? Your game playing each day contributes to your health.

There would be nothing more exciting than reading a two-thousand-word review only to see a conclusion of “This game thoroughly deserves a Cumquat” or “The game has some faults but it really is a Passion Fruit”.

This scoring method also has the added benefit of receiving the National Heart Foundation’s “Tick of Approval”. Any time your Mum or partner complains about the time spent playing games, just hold up that game box, point out the big red tick on the front and proclaim “I’m just being healthy”.

There they are. Three perfect methods for scoring games that could not possibly contain any flaws. Now we just need to act in the civilised manner that is so common on the internet and decide which of these should be adopted by the gaming community.

Comments

  • If they used the whole scale instead of just the top end, we wouldn’t be in this mess.

    Too many games are given cumquats when they’re really just bananas. If we called a banana game a banana every time, people wouldn’t be outraged; instead, they’ve come to expect a banana to be called a cumquat.

    Worst of all, grapes are just completely neglected for anything that even barely functions when there are plenty of games that deserve to be left at that end of the scale.

  • Gamespot’s review of Skyward Sword was interesting, although I found it odd that they only started penalizing Zelda games for being too samey with the release of Skyward Sword. Why not start 10 years ago?

      • Or rather, people have snapped out of the mindset that “everything that Nintendo makes is A++++ god-tier”… during the reign of the Wii, they could’ve farted in a can and people would’ve lined up around the corner to sniff it.

        • Which is what CoD is also doing now. It’s samey, but no one cares. Plus they continue to refine it and experiment with it in small ways. There’s no denying that they’re good games, though.
          BTW, just want to add, I am not a fanboy; I also have a PS3 and a 360.

          I still prefer Kotaku’s Liked and Disliked approach. I think it’s best X3

    • Quote from the same author: “Mega Man 10 – The 10th installment in this long-running franchise proves that some formulas don’t need to change”. Nothing like consistency. But his review has people talking and presumably getting the site traffic, so it seems like it worked.

  • Why not just do something like YouTube did with the Like/Dislike system and go with a ‘Yes, this game is worth playing’ or a ‘No, it’s not worth playing’?

    Simple. 🙂

  • My brother just entered the room while I was watching Gamespot’s ‘Skyward Sword’ review.
    He asked, “What did it get?”.
    I replied, “A 7.5.”
    He stormed out of the room repeating the word “bullshit” several times.
    He did not watch any of the review.

    THIS is what is wrong with review scores.

    • I have a point to make on the assumption your brother has not played Skyward Sword (if he has, ignore it).

      That is not what is wrong with reviewers and scores; that is what is wrong with gamers/fanboys. He has never played the game, so why is it bullshit that it could be a 7.5 game?
      It is not at all inconceivable that a game is not as good as previous versions, or not as good as it is hyped to be.
      Not everyone likes Uncharted, or Halo, or Mario, therefore it is not out of the realm of possibility that this reviewer sees it as a 7.5 game.

      The real problem with reviews is that they are inconsistent and some are clearly bought (GTAIV). I’ve read reviews where points are deducted for being ‘too innovative’ or for ‘using the same tired formula’, whereas others are praised for the exact same thing.

      • I think you’re misunderstanding me.

        The problem is a 7.5 is considered bad by many people.

        It would be somewhere between a three-and-a-half and a four-star rating, which is considered “good” in film and music.

  • The fanboys will never change unfortunately. Sure they grow up but two more take their place. Frankly I’m content to leave them in their holes arguing over which identical fps is marginally better than another identical fps. I’ll be playing games that interest me.

  • This is the reason I prefer scoreless reviews sometimes. Then again, when reading a review, I focus more on the pros and cons stated rather than the score at the end.

  • I think there should be a scale that goes into the negatives. -8 to you, damnation. You have sullied my Gamertag, as I cannot erase you from it.

  • Reviews are and always have been “opinion pieces”.

    I seldom base game purchases on raw scores, but rather the argument presented by the reviewer.

    There are people who simply won’t like what others agree is a “perfect” game.

    That’s perfectly fine.

    What isn’t fine is people simply blindly following reviewers without digging down to figure out what exactly they’re expecting.

    Read several reviews, try to get an accurate impression of the pros and cons, and make an informed decision.

    • Don’t get me wrong: if Skyward Sword plays as they say it does and works as similarly to other Zelda titles as they claim, I would agree with the score. I definitely enjoyed reading the review though.

      Also +1 for the Fonzie system.

  • This might not be perfect but I like the 4-star review system.
    1 = Very crappy; nobody should touch this or you will perish
    2 = A decent game; might be worth a spin if you are a fan of the franchise or the genre
    3 = A well-made game in many aspects, albeit with some notable flaws; still a great game overall
    4 = Not necessarily perfection, but games don’t get much better than this; a truly magnificent title

  • “Sir David Attenborough narrating video reviews of games”

    Fund it!

    What we really need is a scoring system where the median isn’t 7/10, where people realise that 5/10 is ‘average’ and not a terrible stinker, and where people *actually read the text of the reviews* and also stop deriving all the meaning in their life from whether or not some guy on the internet agrees with their predefined opinion of the game they haven’t played yet.

  • I think review scores are worthless. I say do away with them entirely. I feel that there has been an inflation problem in scoring, with 7 becoming the new 3. If a AAA title gets anything below an 8 it’s an outrage, when in reality an 8 should represent an almost flawless experience. So I would recommend doing away with scores and focus on a well thought out and constructed review analyzing and highlighting pros and cons in an objective manner. This approach would do away with shitstorms generated by a single digit and hopefully spark intelligent analysis and response from consumers.

  • I liked the system used in Ars Technica’s game column: Buy, Rent, Skip.

    It used to be that a mid-range score, 50 or 5/10, was an average, middle-of-the-road affair, with a 6 or 7 being above average.

    Scores are pretty BS these days. When I read a review, I want to know what is right AND what is wrong with it. I’ve been burnt in the past, especially on PC ports (some PC reviews aren’t actually reviews of the PC version). Sometimes there are issues that don’t get a mention, and you have to go to the forums to find out.

  • I base reviews on a percentage of how much money you should pay for the game new. If it scores an 8/10, then the game is worth about 80% of the asking price.
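    (A minimal sketch of that score-to-price mapping, in Python; the function name and the $99.95 asking price are assumptions for illustration only, not anything from the comment itself.)

    ```python
    # Hypothetical sketch of the "score as a percentage of the asking price" idea above.
    # suggested_price and the default 99.95 RRP are invented for illustration.
    def suggested_price(score_out_of_10, asking_price=99.95):
        """Treat an x/10 score as the fraction of the new-game price worth paying."""
        return round(asking_price * (score_out_of_10 / 10), 2)

    print(suggested_price(8))  # 8/10 -> roughly 80% of the asking price: 79.96
    ```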

  • You know the reviews I seem to like the most…
    Kotaku “gut check” articles.
    They pointed me towards Rayman Origins = AWESOME GAME.
    Thanks for that Kotaku, keep it up!

  • I would prefer to see reviews be ended with pros and cons rather than an overall score.

    Alternatively, reviewers could have their own unbiased opinions, but that’ll never happen.

  • I love the Orwellian approach best. @Bluemaxx, the gut check articles are great, but read them closely, as some of the writers may not have played the game and are basing their opinions on preview footage.

  • I kind of like the idea of ratings as a price you should pay for the game. E.g. you could argue that Skyrim is worth $120, it’s that good, but Need for Speed: The Run is worth $50. NFS:TR might be a good game, but it’s not worth the same as Skyrim.

  • We need to get rid of the scores altogether. When you put a score on a review, you’re attempting to assign a quantitative measure (i.e. a number) to a qualitative piece of information (the review). At this point, a lot of people just disregard (or, I suspect, don’t even read) the review itself (which is actually the important part) and instead attempt to use the numeric scores as a basis for some kind of absolute comparison between two (potentially very different) games.

    Looking at the score instead of the actual review content is a pretty stupid way to compare two games – you can play 10,000 games and write 10,000 individual, nuanced reviews, but you’ll only ever have a finite number (10 or 100 or whatever scoring system you’re using) of possible scores you can assign to all of those thousands of games.

  • In all seriousness, I think that the ‘5 stars’ approach, most commonly used with films, is the way to go. Although you can translate it into a 10 point scale (or 100 point for that matter – which metacritic would inevitably do), the star system seems to have different connotations.

    For example, 3 stars doesn’t seem anywhere near as rubbish as a ‘6’. 5 stars also doesn’t seem anywhere near as outrageous as awarding a perfect ’10’ either.

    Using the star system would encourage reviewers to use the rest of the scale, i.e. 1–6, or half a star to 3 stars. More focus would then be paid to the actual content of the review, the qualitative criticisms and compliments.

    • This is another good system. But I think it should be limited to a single star. Let’s rate a game a ‘Harrison Ford’ or ‘Julia Roberts’. If you thought the game was really terrible you could give it a half star – ‘Gary Coleman’.

    • Only problem: don’t let David from the ABC show “At the Movies” review anything, otherwise nothing would get over 3 stars! Haha, how I’d love to see that dude review Austin Powers in Goldmember :D

  • Perhaps the problem is that there are too many reviews.

    If there was one independent agency responsible for reviews… maybe.

    Perhaps we need to standardise a scale…
    Let’s say reviews are out of 50.
    That’s five areas, 10 points per area:
    Graphics, Gameplay, Ideas, Implementation (ensemble), and 10 points for the reviewer’s own discretion.

    Let’s include a sixth area, the points of which are deducted from the totals.
    Failings or flaws or something.

    ———————————————–

    Or, you could do it like pipe band judging.
    Everyone starts off with a score of 100 (it’s less in lower grades, but we shan’t go into that here). Then points get deducted for each error (if a mistake is continually made, then you lose marks each time it happens).
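    (A rough sketch of that deduction idea, assuming hypothetical error names and a flat 5-point penalty; none of these values come from the comment.)

    ```python
    # Rough sketch of the pipe-band-style deduction scoring described above.
    # Start from 100 and subtract a penalty for every occurrence of every error;
    # repeated mistakes cost marks each time. All names and values are invented.
    def deduction_score(errors, start=100, penalty=5):
        """errors maps an error description to how many times it occurred."""
        total_occurrences = sum(errors.values())
        return max(0, start - penalty * total_occurrences)

    # e.g. screen tearing twice, one broken quest, clunky menus three times -> 70
    print(deduction_score({"screen tearing": 2, "broken quest": 1, "clunky menus": 3}))
    ```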

    • I should also point out that pipe band judges are audited every year, and they have two judges at comps… so there can’t be a claim of bias.

    • The problem with systems that assign points for meeting specific requirements is that there are often games where that stuff simply isn’t important.

      Just look at Space Marine. The plot isn’t brilliant, the music is ho-hum, the controls (on PS3 at least) were imperfect, but it was crazy good awesome fun.

      Yes, “crazy good awesome” was how I would describe the fun that was had in Space Marine.

      Assigning values really doesn’t mean anything. It isn’t a competition where they’re trying to objectively determine what is best; they’re trying to give people feedback on whether or not they would enjoy a game.

      • But they’d be bumped up by the aggregate.
        The reviewer’s own points and the ensemble score, in addition to whatever they’d receive in the other areas, would bump it up to something competitive.

        Maybe you award a set number of points for fun too.
        If we limit the scores to specific fields then we can control and illuminate the process.

        • We also end up with developers neglecting other areas because they’re more concerned about high review scores.

          Look at all of the technical wizardry that goes into film making. We don’t judge films on each aspect, we look at the whole package and see how well it goes together.

          The same should work with games. Review scores aren’t scores; they aren’t objective ratings of how good the game is, they’re numbers that arbitrarily represent the opinions of the reviewers.

          Don’t make it technical, don’t make it complicated. Really, anything more than a 5 star scale (don’t buy, look into it if you’re into this sort of thing, pretty okay, better than most, go buy this now!) is unnecessary.

  • I only read Tim Rogers’ reviews now. It’s a whole different world when the author has no constraints, no rules to abide by. It’s great. I wrote a love letter to Tim about it here.

    http://wp.me/p1NBQj-8O

    Scores are broken and reviews are unexciting, and this never-ending cycle is sucking the life out of any integrity video games journalism has.

    End rant.

  • I suggest a new rating system: The Firefly Quote Rating System.

    “Final verdict: I give Skyrim an ‘I’ll be in my bunk.'”
    “Despite these flaws, Batman: Arkham City scores ‘I can kill you with my mind.'”

  • These are all great systems for ranking games, but I suggest one based on the male perspective (sorry for being sexist here) of talking with a woman, such as at a party or shindig. It would either be a yes, ‘1’, you would definitely approach and say hello to that female (game), or a no, ‘0’, she’s just not my type (of game). Obviously there would be pros and cons to each argument, but wouldn’t it make things simpler?

  • Screw the number rating system; why don’t we move towards 1UP’s “grading” process (from D- all the way to A+)?

  • Here’s how a review score should be:

    Story- x/10
    Graphics- x/10
    Sound- x/10
    Singleplayer- x/10
    Multiplayer- x/10

    Gameplay- If game has online play, average of both Singleplayer and Multiplayer scores. If game doesn’t have online play, use score from SP and disregard MP

    Overall Score: Average of all the scores (a rough sketch of this calculation follows below)
    —————————————–

    Games should be judged on those aspects, and not on hype. A game isn’t a masterpiece unless everything about it is absolutely flawless. You can’t give a game a 10/10 just because “oh, I like the graphics” or “there’s so much hype”.
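    (A minimal sketch of the averaging described above, under one plausible reading of “average of all the scores”: Gameplay folds the SP/MP figures together, and the overall score averages Story, Graphics, Sound and Gameplay. The function name and example values are invented for illustration.)

    ```python
    # Rough sketch of the category-averaging scheme from the comment above.
    # One plausible reading only; the comment does not spell out exactly which
    # categories feed the final average. All names and example values are invented.
    def overall_score(story, graphics, sound, singleplayer, multiplayer=None):
        # Gameplay = average of SP and MP if the game has online play, else just SP.
        if multiplayer is not None:
            gameplay = (singleplayer + multiplayer) / 2
        else:
            gameplay = singleplayer
        return (story + graphics + sound + gameplay) / 4

    # e.g. Story 7, Graphics 9, Sound 8, SP 8, MP 6 -> Gameplay 7 -> overall 7.75
    print(overall_score(7, 9, 8, 8, 6))
    ```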

  • I like Giant Bomb’s five-point recommendation scale fine. It’s simple but effective. It’s just annoying when people try to translate a score given on that scale to a score on another scale. That’s just not how it works, and it’s kind of mind-boggling that there are people who can’t seem to grasp it.

  • I thought the old system on ScrewAttack was a good way of doing things. At the end of their reviews there was no score, but every game got a “buy it”, “rent it” or “f*** it” rating. With their new redesign they have gone to an x/10 scoring system, which is a shame. This system does have some problems in Australia, considering that many places don’t have a decent rental system, but it gets the point across.

  • How does a number mean anything? If I rate you a 5, what does that even mean? I’ve always liked this quote from Yahtzee:

    “And I don’t believe in scores because I don’t believe a complex opinion can be represented numerically. You like numbers? How about four, as in fourk you! Do you really need someone in authority giving you a simple “yea” or “nay” before you buy anything? Why don’t you roll over so they can stamp on the other side of your face?”

  • Having read through people’s comments, I think the review system isn’t the problem; it’s the wankers reading them.

    I read that 1 star should be absolute trash and if you play it you will perish. The same poster said 2 stars was a decent game, to be played by fans of the genre.

    I also read that 8/10 is near flawless. If I do a test worth 1000 marks, and get 800 marks, that is still 200 marks away from flawless.

    It’s people’s interpretation of the scores that is the issue.

    I know how to fix it.

    1. Only compare games of the same genre – a 9/10 FPS is a very different game to a 9/10 platformer.

    2. Use a better than/worse than system. This goes along with the score, and gives people perspective on the reviewer. Here is an example: I review GTA IV, Mafia 2 and GTA: San Andreas as all 9/10 (hypothetically). In my review of GTA IV I would state that it was better than Mafia 2 and worse than GTA:SA (even though they were all 9/10 games).

    By doing this, people won’t mind that their favorite upcoming game is less than 90 on Metacritic, because it’s not being compared to Batman: Arkham City or Skyrim. As long as it did better than MW3, a Battlefield fan doesn’t care about Skyward Sword.

    Hopefully you guys know where I’m coming from…

  • Get rid of scores, give opinions. If your readers can’t take the time to digest what you found good and bad without a number to help them understand, you’re better off without them.

  • Why get rid of the scores? Films have scores and it works. Game reviews just need to be more consistent in their criticisms. If they’re going to deduct a point from a game for whatever reason, then apply the same standards to all games.
    Another problem is that so many people look at a game that gets anything below a 9 as ‘crap’ or not good enough. That has to stop.

    I consider 1–6 to be crap to average, with 7–10 being good to amazing.

    I like the 10 point system. I don’t think 5 points gives enough room.

    • Because it is impossible to boil down a highly complex and subjective opinion into a numerical score. What you like, I may not like. What you find fun, I might find juvenile and putrid. The list goes on.

      You cannot give a numerical score to a medium that depends on whether YOU, specifically YOU, are having fun. It’s inherently subjective, and it’s silly that the whole thing hasn’t been scrapped. Rock, Paper, Shotgun is a perfect example of a popular site which has figured this out.
      I can tell you right now their opinion pieces on games have helped me decide whether I’ll like a game more than someone trying to give an objective, numeric score to a game.

  • Yes you can. Films have them and it works, for the most part.

    Yes, game reviews are mostly just opinions. But they do also address whether a game is well made or not. A well-made game gets a higher score and a poorly made one gets a low score, with the reviewer’s enjoyment of the game increasing or decreasing the score by a certain amount.

    For instance, I’m not that big a fan of Gears of War. But I can see all the great qualities those games have, and they’re well made. So I understand the high reviews the series gets, especially if the reviewer enjoys it.
    I would give it an 8–8.5 because of those qualities, but deduct a point or two because I don’t enjoy it that much.

    1 = poorly made game. 10 = superbly made.

    I don’t buy games based on the scores they get. And I almost never read the reviews. But I still think there’s nothing wrong with having scores. I like looking at scores and seeing what games get.

    • It doesn’t work in the film industry either.
      You just said it yourself: you took a mark down because YOU didn’t find it fun. How on earth can you use an objective scoring system if you take away marks based on subjective views? It simply does not work, and it makes no sense at all.

      As for not needing reviews if there are no numbers? Are you crazy?
      You can still review a game, detailing your experience and whether you liked the game, based on valid points you find, like, for instance, that you found the game unintuitive to play.

      You can make a legitimate case for not buying a game by presenting convincing arguments, and you don’t need numerical scores to back that up.
      You cannot write numerically objective reviews based on subjective experiences. It can’t work and doesn’t work. Guys like Yahtzee and the writers of RPS show how it can be done, so the real question is: why doesn’t the rest of the journalistic industry fall in line?

      I imagine it’s because of the page views generated by people who just look for scores, and because publishers need those scores to get a high Metascore and appease stockholders. It probably won’t go away entirely, but minimising this system would be in the best interest of the industry.

  • If reviews are just opinions, and that’s it, then we don’t need them at all.
    I think you can translate a reviewer’s personal opinions and enjoyment of a game, and whether it’s a well-made game, into a score, as long as the reviewer is consistent.

    Gamer review scores are usually very close to the reviewer’s score. So I think that shows it works pretty well.

  • Here’s what we need for review “scores”: IS THE GAME GOOD OR IS IT SHIT. That’s all we need to know, and reviews must always be conducted by people who prefer the genre of the game they are reviewing. That means you don’t get a fucking CoD fan to review a game like ARMA or Skyrim or Mario Bros. Don’t go giving an action RPG fan the task of reviewing a turn-based RPG unless they actually like those types of RPGs as well. You don’t get a guy who hates CoD to review CoD either.

  • There’s too much in between ‘good’ and ‘shit’ though. A game might not be good, but might not be shit either. Or a game could be more than good. It could be amazing.

    The review system’s fine as long as it’s consistent and people stop demanding every game get a 9 or perfect 10.

  • My take:

    Games journalism sites should share their contributors around more, or the contributors themselves should always be on the move between employers. Depending on the quality of their services, more people will wait and not buy on release day, but hold off until Mr or Ms Game Reviewer XYZ has had sufficient time to engage with what they are meant to be writing about. This would also mean the same for the website their review is going up on :p

    When reading a movie review you rarely treat the opinion of one person as the whole publication’s viewpoint. ‘This film had violence towards animals in it; The Age gave it 2 stars as a result.’ Actually no, it was one person with one opinion. Historically, I’d say look at Rotten Tomatoes as an example, but Web 2.0 changed things.

    This could be the same for game reviews, rather than Eurogamer/IGN/Gamespot/whoever being held accountable as one entity, the reviewer’s name should be up front and obvious throughout the review.

    Some sites try this approach already. IGN mentioned during the PAX live podcasts that they try to let their reviewers’ personalities shine through more. Destructoid love to trot out Jim Sterling every time there’s a chance to get an article more clicks than average. It’s trendy to hang crap on Yahtzee – and The Escapist website as a whole – these days, but they both hardly do any critiquing anymore; that’s more to do with the site re-inventing itself bit by bit.

    It can fail miserably though, and already has had the chance to. Consider:

    Stratton gave a movie 2 stars, says he’s lost respect for the director (and just look at his career as a film reviewer). Fat Swearing ManChild #632 talks about Yoshi’s sexuality, pretty much because he can do so in a flash clip. Ebert says a supporting actress’ singular performance made the whole movie for him (‘cos he’s Ebert). Hot Gamer Gurl w/ b00bs swoons over Ezio and his strange exotic accent (‘cos b00bs).

  • An overrated MW3 is a seven at best. Now, before I get ragged on, this game would have been better in 2007. Most reviews are biased towards personal opinion. Who knows, maybe the journalists are bought out and succumb to a bit of extra coin. I don’t rate reviews that highly, and I’m sure most people don’t either.

  • Game developers (and their marketing departments) are part of the problem with the current game rating system.

    If reviewers don’t give scores that the devs are happy with, they run the risk of not getting early access to future games.

    Marketing departments are saturating online “user” reviews with extremely positive reviews for their own games, whilst also writing negative reviews for competing games.

    If review websites don’t suck up to the devs and write positive reviews, there is also the risk they may lose future advertising revenue.

