
Why Are Game Developer Bonuses Based On Review Scores?

Last night, Obsidian’s Chris Avellone tweeted an interesting detail about his roleplaying game Fallout: New Vegas.

“[Fallout: New Vegas] was a straight payment, no royalties,” he said in response to a fan question about the game’s financial success. “Only a bonus if we got an 85+ on Metacritic, which we didn’t.”

Metacritic, an aggregation website that collects scores from selected review sites and compiles them as a weighted average, currently lists the Xbox 360 version of Fallout: New Vegas at 84 (out of 100). The PC version is also listed at 84. The PlayStation 3 version of the game is listed at 82.

In other words, Obsidian may have missed its bonus and lost out on a significant amount of money because of a single point.

We’ve reached out to New Vegas publisher Bethesda, the company that financed the game, to try to confirm Avellone’s statement, but the company declined to comment. If the New Vegas designer’s tweet is accurate, then Bethesda put a portion of Obsidian’s financial fate in the hands of a select group of game reviewers.

Finances have been an issue for Obsidian — earlier this week, the independent studio had to let go of 30 staff because a game it had been developing for the next Xbox was cancelled. So a potential Metacritic bonus may have been no small matter.

I understand the logic used by publishers like Bethesda when they dole out bonuses based on Metacritic numbers. As an aggregation of critic review scores, a Metacritic average can be an important benchmark for the perceived quality of a game. And it certainly makes sense that an employer would want to reward employees based on the quality of their work.

Except Metacritic scores are not objective measures of quality. The Xbox 360 Metacritic page for Fallout: New Vegas consists of 81 reviews. If Obsidian’s bonuses were determined by this aggregator, they were not based on the game’s quality — they were based on 81 people’s opinions of the game’s quality.

Look through Metacritic’s list of critic reviews. The list of selected websites comprises both professional and volunteer reviewers. Some write for the web. Others write for print. Some scores are weighted more heavily than others (Metacritic does not publicly discuss the formula it uses to create its averages). Some scores are even treated differently from others — a 7 at Game Informer does not mean the same thing as a 7 at Edge, for example.

Many of the reviews attacked the game for bugs and glitches, much of which were fixed in subsequent patches and downloadable content packs. While reviewers may have been justified in marking down scores for the buggy product, those scores may no longer have been relevant a month, or even a week, later. Most review outlets don’t change their scores once patches have been released. Is that something Bethesda took into consideration?

There is no such thing as an objectively good game. Nor is there any such thing as an objectively bad game. We all secretly hate some games that are beloved by the rest of the world, and everyone has their favourite black sheep. I’ve strongly disliked some highly rated games, like Dragon Age 2, and fallen deeply in love with some poorly rated games, like Suikoden V. Should my personal opinion really be condensed into a mathematical formula and used to decide somebody else’s bonus?

At Kotaku, we don’t use review scores. Metacritic doesn’t count our reviews. What if that made the difference? What if an outlet’s choice of reviewer changed everything? What if a developer’s bonus was determined by a single person’s arbitrary distinction between a 7.8 and a 7.9? What if a game studio faced financial trouble after it missed its bonus by a single point?

This isn’t healthy for anybody involved. It’s not healthy for a reviewer to have to worry whether his criticism will directly affect people’s jobs. It’s not healthy for developers to focus on pleasing reviewers, rather than pleasing consumers. It’s not healthy for individual opinions to impact bonuses and salaries.

Publishers need a better tool for measuring a game’s quality. I don’t know what that tool is. I don’t know that it exists. But using Metacritic to hand out bonuses is dangerous — for developers, reviewers, and, quite frankly, you.

(Disclosure: While working at Wired.com, I gave Fallout: New Vegas a 9/10. My review appears on the game’s Metacritic page.)