A rotten rating system

Image via rottentomatoes.com

Why numerical ratings need to die

By Benjamin Howard, Columnist

Ah, the golden glimmer of a 10/10 rating on IGN, or the 100 per cent on Rotten Tomatoes. Those concretely defined numbers can make a rating seem like a cold, hard fact. Unfortunately, numerical ratings are anything but factual or scientific. No matter the scale, whether it’s out of 5, 10, or 100, there are no common rating criteria. A standard is missing not only between different review websites, such as IGN or Destructoid, but within them. A 10/10 from John Doe on IGN is not the same as a 10/10 from Joe Blow on the same website. And with no common ground for numerical ratings to stand on, things only get worse when one reaches the core problem: numbers do not convey feelings well.

This is easily demonstrated by how often critics break their own scoring conventions, such as when a critic gives a work 4.5/5 stars or 7.8/10. In that situation, the critic, feeling unable to convey a nuanced opinion within the rating system, has decided to cheat by handing out a decimal or a half. One might ask, then, why not simply increase the total? Instead of rating out of 10, why not rate out of 100? Well, there’s still a problem: feelings cannot be quantified. The difference between a 78/100 and a 79/100 is completely arbitrary, especially when talking about something as subjective as art.

Of course, if ratings from professional critics fail to function, they’re only worse in the hands of the public. Websites such as Rotten Tomatoes and Metacritic combine the scores of many critics, professional and amateur alike, to calculate the average rating of a particular movie or video game. It sounds like a good idea; the voice of the people can average out the “snobby” voice of critics, and vice versa. Unfortunately, the voice of the people lacks nuance. People vote to extremes: either they liked it, in which case it gets five stars, or they disliked it, so it gets one star. I might come off as elitist here, but as someone who enjoys a thoughtful critique, I find it disappointing to see so many things grossly overrated (and underrated) by the public.

Worse than the public’s tendency to overrate a product is its tendency to overrate the rating itself. This is partly due to the emphasis reviewers themselves put on the numerical rating. For example, at the end of an IGN review video, the screen is covered by a giant out-of-ten score. Sure enough, in the comment section of that video, most people are arguing about the number the product received. In the end, the customer often gives more consideration to the score of the review than to the review itself. All the reviewer’s nuance and detail flies right over the viewer’s head while they wait for that big shiny number at the end.

Numbers and opinions don’t belong together; they serve only to deceive and confuse. That’s why I’m giving my review of reviews a 4.2/7.