Wednesday 26 September 2012

Metacritic-al Industry Issues


Let’s run through a little story…. A new video game blockbuster was released a few weeks ago with well-crafted design choices, substantial upgrades over all its previous franchise outings, large well-designed maps, and incredible graphics, all of which have helped it become the best-selling game of all time. You have decided to see whether this new action epic is the game for you and run off to search the web to hear what everyone has to say. IGN give it 9.5 out of 10, Eurogamer come in slightly lower at 8/10 as they found a few bugs along the way, and GameSpot land on 8/10 too for the same reasons.

The problem is that even though this great new game has been a suitable success in terms of sales, its Metacritic score fell below the 85% threshold written into the developer’s contract, the line the publisher needed crossed before paying out any additional bonus. Such is the world!

Imagine for a moment how completely absurd this would be – it’s the biggest-selling game ever – and then reflect that this is exactly the situation some game developers have experienced first hand. Metacritic, for all its usefulness, is the new platform that games are graded against; yet the review aggregator has in some cases become a detriment to the very developers whose games are directly driving its traffic.

The site accumulates scores from most of the major sites, newspapers and magazines, then swirls them together with blogs and smaller outlets to create an average score, giving you an instant glance at the market’s overall opinion.

None of these problems are necessarily the fault of Metacritic itself, or of the people who created it, but the score it spits out can nonetheless be the downfall of royalties and bonus payments. The idea of the site is great and can genuinely help consumers see the overall views of the community. What is wrong is the way Metacritic averages are being used by the games industry to determine how games are made and sold, and the negative effect they are having on criticism.

Metacritic rates movies, TV shows and music as well as video games, but none of those industries hold its arbitrary average in such high regard. It’s difficult to gauge how much the large publishers actually care about their Metacritic averages. Those publishers that do care, though, will feel the impact across every area of a game’s campaign, from marketing and PR right through to the development process itself.

Basing success on a Metacritic score is difficult, and many games have fallen into this pattern. Yearly releases like FIFA and Tiger Woods sometimes hit a divot (get it?) in their Metacritic score if the new title doesn’t improve enough on the foundations of the previous release. It’s a slippery slope to measure against: a lower score doesn’t necessarily mean a game won’t sell millions of copies. On the other side of the coin, games that get brilliant aggregate scores can still flop in sales – it really is a double-edged problem.

It’s also common for some sites to hold their own aggregated scores for member reviews. GameSpot, for example, averages all of its subscriber reviews to give an aggregated overall total. This offers another perspective on a game, but it gets abused by those disappointed by a single gameplay section, a part of the story, or in some cases just a control problem. Mass Effect 3’s ending was a prime example of this, with many people lowering their scores by a point or two purely because of the finale, regardless of how well the rest of the game stood up.

For obvious reasons it is very difficult for people to go on record about Metacritic scores, but various pieces of information are scattered around the net from ex-staff (and some current developer staff) outlining these issues.

One famous example came earlier this year from Obsidian’s Chris Avellone, who tweeted – a tweet since deleted – that they missed out on a bonus from Bethesda because the well-received Fallout: New Vegas missed a Metacritic score of 85 by a single point.

Of course these issues don’t apply to all publishers and developers, but they shouldn’t really apply to any of them. The scores are taken from various sites with different reviewing ethics. Some reviewers give high scores knowing that top-end listings will bring their sites more traffic, while others base their scores on the marketing investment the publishers themselves have given them. Neither provides a true perspective on the game itself, yet these scores can become the difference between bonuses being paid and whether a sequel gets approved.

A Metacritic score can undermine the very principle of a review: reviews should be honest assessments of the games themselves, and they are inevitably coloured by each reviewer’s individual views. The practice has become far too commonplace in the games industry and is more of a detriment than ever. We all want to know what the overall consensus is on a product, but why should these sometimes biased views count towards a game’s success?

We all know a good or bad review can be the difference between improved sales or not, but the knock-on effect on developers is where the issue really bites. Ultimately it’s bad for the industry, bad for the consumer, and even worse for the developers who put such hard work into their games.
