Tuesday, March 27, 2007

Brutal!

Here it is: the lowest score I've ever given.

(N.B. Online they only grade it from .5 to 4 stars, but in the paper it'll be a 1.0/10.)

Thursday, March 22, 2007

You don't say

I was looking for a specific bit of information for my Def Jam: Icon review, and came across this tidbit from the press release announcing the game's shipment:
Written by renowned Hollywood writers Virgil Williams and Cle "Bone" Sloan, DEF JAM: ICON delivers an intense storyline unrivaled in any other hip hop lifestyle/fighting game.

I could not agree more. This has the best storyline of any hip hop lifestyle/fighting game I've ever played.

Tuesday, March 20, 2007

God of War II

God of War II isn't exactly perfect -- occasionally it could use fewer enemies and a little bit less neck-snapping -- but you'd be hard-pressed to find a better platformer released within the past couple of years. It's a genuine must-play. And if it turns out to be the last great game for the PlayStation 2, then it's hard to imagine a more impressive way to go out.

Friday, March 16, 2007

Controversy!

The incendiary blog post that's been getting all the play this week comes courtesy of Destructoid. Written by someone called Reverend Anthony, it's called "Why Video Game Reviews Suck." (The article is split into two parts; feel free to read part one and part two.) This is pretty well-worn terrain at this point, but it's still a subject worth exploring because video game reviews do, in fact, suck. Unfortunately, the author never actually gets around to discussing the problems with video game reviews. The entirety of his screed is dedicated to tearing down game review scores, which are a totally different beast.

Let's start with where the author and I are in agreement. Game scores are essentially useless. That's about where our agreement ends. I'll explain my take on this a little further down, but first let's look at what Reverend Anthony is actually saying.

He thinks scores are useless because they are overwhelmingly positive. The data don't actually back that up. Blogger Bill Harris ran the numbers, and found that less than 2% of the games released in 2006 had a Metacritic score of 90 or above. We can take this even further. Overall, Metacritic lists the aggregate scores of 1,417 PlayStation 2 games. Of these, 62 have a score of 90 or higher. That's about 4.3%. (I'll be sticking to the list of PS2 games for this post, but the GameCube and Xbox are about the same, each coming in around 5%). If you assume that scores should be evenly distributed along the 1-100 range, then you'd expect 10% of the games to be ranked in the top 10% of scores. That doesn't happen. Clearly, reviewers are more reluctant to hand out such accolades than the good Reverend gives them credit for.

But what about the other end of the scale? Why are crappy games given anything but the lowest possible score? Here he has a point. Of those 1,417 PS2 games, only 6 have a score in the 20s. The lowest score is 24, given to Gravity Games Bike: Street. Vert. Dirt. In fact, only 112 PS2 games, or about 8%, have a Metacritic score below 50. If you consider the mid-point of this scale to be the pinnacle of mediocrity, then something seems off. The majority of scores are clumped between 5 and 9. In other words, 40% of the possible scores are assigned to 88% of the games. That actually doesn't bother me. It makes sense to me that you'd see some kind of a bell curve if you were to graph game scores. I think it would make sense to the Reverend Anthony, as well. The difference is that he really wants that peak to come in at 5. I'd expect it to be around 7.
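For what it's worth, the percentages above check out. Here's a quick back-of-the-envelope sketch using the counts cited in this post (1,417 PS2 games on Metacritic, 62 scoring 90 or higher, 112 scoring below 50 -- 2007-era figures, not live data):

```python
# Sanity check of the Metacritic arithmetic, using the counts
# quoted in this post (not pulled from any live source).
total_ps2_games = 1417
scored_90_plus = 62
scored_below_50 = 112

pct_90_plus = 100 * scored_90_plus / total_ps2_games
pct_below_50 = 100 * scored_below_50 / total_ps2_games
# Everything else falls in the 50-90 band -- the "clump" in the middle.
pct_middle = 100 - pct_90_plus - pct_below_50

print(f"90 or higher: {pct_90_plus:.1f}%")   # ~4.4%
print(f"below 50:     {pct_below_50:.1f}%")  # ~7.9%
print(f"50-90 band:   {pct_middle:.1f}%")    # ~87.7%, which rounds to 88%
```

So roughly 88% of the games really do land in a band covering 40% of the scale, which is the bell curve described above.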

Partly, that's because of the analogy between game scores and other types of scoring systems. If you compare them to the four-star rating system favored by most movie critics, you'd get a comparison like this:

**** = 10
*** = 7.5
** = 5.0
* = 2.5

Most critics don't give zero stars, or do so only in special circumstances. That's why on Metacritic, which has ranked "virtually every film since the beginning of 1999, and selected films from prior years," only 105 movies have a score below 20. That's 105 more movies than games with that score, but the point stands: it's pretty rare. The two are not so far apart in this way.

The other argument for a clumping of scores above 5.0 is the academic argument, which the author addresses. It goes something like this:

A = 95 (9.5)
B = 85 (8.5)
C = 75 (7.5)
D = 65 (6.5)
F = 64 (6.4) and lower

The logic here is that there's not much difference between giving a game a 6, a 4, or a 2, because they're all an F. I agree with the Reverend Anthony that this doesn't work for video games, and I think he nails why:
But video games do not simply "pass" or "fail." With video games, you most definitely CAN enjoy a sub-average game for some of its aspects. ... Just because a game is sub-average doesn't mean certain people won't enjoy it, and therefore it matters that sub-average games are differentiated from other sub-average games through use of the 1-5 section of the scale.

I'm not sure if anybody would seriously argue that there's little difference between a game with a Metacritic score of 60 and one with a score of 20. Which would you rather play? But I disagree with the implication that there should be an even distribution of scores, instead of the grouping around the C range, or the 70s. Here's Reverend Anthony's take:

If someone was to walk into an EB Games, close their eyes, and randomly choose a game from the shelf, they would most likely not get something good. You might think there’s a 50-50 chance you’d come up with God of War or at least something kind of cool like Red Dead Revolver, but all the more likely is that you’d end up holding a crappy bowling sim or a licensed platformer starring The Olsen Twins.

He's probably right about that. But there's one aspect that he's not mentioning: Nobody reviews those games. I'm exaggerating slightly; Mary-Kate and Ashley Sweet 16: Licensed to Drive was reviewed by four outlets, for a Metacritic score of 49. I agree that that does seem high, but to be fair I have not played the game. The point I'm making, however, is that even publications whose job it is to review games do not touch this crap. GameSpot didn't review it. EGM didn't review it. Nintendo Power did review it, but they have about as much journalistic integrity as Pravda. On the flip side, why would a publication that is not focused exclusively on games ever even consider reviewing a game like this? You'll occasionally see a Metacritic entry from the New York Times for blockbuster titles. Of course that will raise the aggregate. Those of us who get to pick and choose what we cover will tend to focus on giving exposure to quality games. That's not a bad thing.

I've already gone on a bit longer than I intended to about the game scores themselves, so let's move on. The biggest problem I have with the Reverend Anthony's piece is that he never acknowledges that these scores are usually accompanied by text. In focusing solely on the problems with review scores, he inadvertently proves why I think they should be eliminated. They keep people from actually reading reviews. When readers are more concerned with the score than with the review, there's less incentive for critics to say anything interesting. That, frankly, is what I thought "Why Video Game Reviews Suck" was going to be about.

Why do video game reviews really suck? Because they're attempting to justify an arbitrary score! As soon as you quantify the experience of playing a game, you have to start running down the checklist: graphics, sound, control, "fun factor," and so on. Suddenly you're not applying a critical eye to what the game is about, putting it into a context the reader can understand. You're judging a show dog. You can't simply isolate each part of a game in order to render judgment on the whole. I mean, you can, but it results in the same formulaic, workmanlike reviews we've been reading for decades. It's like the difference between listing the ingredients and actually tasting the soup. What does a 7.5 game have that a 7.0 game doesn't? How, pray tell, do you know when a game deserves a 9.7 and not a 9.6?

Video games are pretty amazing these days. Just within the past month, I've played games with some really interesting things to say about subjects like free will (God of War II) and freedom versus security (Crackdown). Would you ever know that from reading reviews of these games? Of course not. What you're likely finding out about them is that they have great graphics and many hilarious ways to kill people (I'll be totally honest here and admit to being guilty of this myself). But there's more to say about games, if someone would just say it. Scores be damned. As the medium grows up, it's going to be necessary for the critics to grow along with it -- and the readers, too. Until they do, game reviews will keep right on sucking.

Thursday, March 15, 2007

Retrospecticus

To coincide with the release of God of War II, I put together a list of essential PS2 games for the Phoenix. It's not a ranking of the "best" games or anything like that, just a bunch of games that I think a lot of people have good memories of. I await the inevitable "list fails" comments.

Also on the comments tip, it's been brought to my attention that someone calling himself "Elebit" just recently commented on my Elebits review, employing the phrase "graphic noob" several times. Onward and upward!

Tuesday, March 13, 2007

MotorStorm

The MotorStorm review is up now. All in all, it's a good game with some problems that are representative of the first wave of software for a new system. I am encouraged about the PS3's future prospects.

I also thought it was charitable not to mention the many people in the MotorStorm online community who tend to quit races whenever they're not winning. More than once, I've had every other racer drop out when I've had the lead. I don't get it. Even when I'm puttering along in last place, which is usually the case, I take my medicine like a man.

Thursday, March 08, 2007

Free online play is great and all

I've been going online with MotorStorm, and I have to say it's pretty fun (review coming next week). But in the vein of my last post on the subject, I just don't know what the architects were thinking in some cases. For instance, you have to accept the online agreement every single time you sign in. Not just the first time you go online with your PS3, or the first time you play MotorStorm, but literally every time you sign in. That's retarded.

The stupid thing about it is that you can't just pick an option to get into a game as quickly as possible. You have to pick a server at random, and then pick a game at random, and then sit in the lobby until the current race finishes. You can't watch the race in progress or pass the time except by looking at the list of players. Or by writing a blog post.

The last thing that's irritating me is that, despite my five-day headstart on most of these people, many of them are already much, much better than I am. At least I won my first race.

Tuesday, March 06, 2007

In which I clear the slate

This review of Battlestations: Midway is among the most negative I've ever written. I don't doubt that WWII nuts may derive some pleasure from it, but by video game standards it hardly does anything right.

It always feels a little satisfying to slag off a game, though, if only to remind myself that my critical faculties aren't fading. I don't want to fall into the trap of scoring on a scale of 7-9.

Blog note: I've enabled anonymous commenting, but you still need to do the word recognition thing.

Friday, March 02, 2007

Sony! Soni! Soné!

I'm not going to attempt an in-depth analysis of the Sony-Kotaku kerfuffle, as the implications seem fairly self-evident. A part of me thinks that the scoop was not nearly interesting enough to merit the blow-up on either side, but as a matter of principle I will always side with the free press.

The larger thing to take away from this is Sony's continued, baffling incompetence in almost every phase of the PS3 launch. And I'm not just referring to their continuing PR arrogance and pronouncements from Bizarro World ("We can't keep them on the shelves!" being the most notorious of these). I finally acquired a PlayStation 3 a couple of days ago, and while I have yet to stretch its legs, I've already raised an eyebrow or two at some of their more illogical decisions.

First, it's no secret that the PS3 has been sold as the most powerful game system around -- one tech site even called it a supercomputer. That may or may not be true (the only thing I can say for sure is that the thing runs freaking hot), but I'm willing to accept the claim of raw hardware power. So if this thing is so powerful, why did Sony decide to include only a composite video cable in the box? No component cables, no VGA or DVI, no HDMI. Surely it was a cost-cutting consideration, but a boneheaded one. In the meantime, it does come with an ethernet cable despite built-in Wi-Fi support. I think they made the wrong compromise here.

But even worse is the front-end interface. Xbox Live may have spoiled us all, but my mind is already blown by the inefficiency on display here. The only thing I've done so far is download the Resistance demo. It took about five steps more than I would have thought necessary. I selected the option to log into the PlayStation store and got a little message that I was logging in -- at which point I was returned to the home menu and had to select the store option again to access the storefront. Why isn't that one step? What am I missing? As for the store itself, well, Penny Arcade sums it up much better than I can. Suffice it to say that you need to perform about three actions in order to accomplish any one thing. It's bizarre.

I did want to help out, so when the PS3 asked me to take a survey, I accepted. It was standard stuff: how many video game systems do you own, what kind of games do you like, that sort of thing. When they asked what was the primary reason I bought a PS3, I said for the Cell processor. But I lied. I bought it because I'm bad with money.