Clash Likes To Score: The Psychology Of Scoring

Clash has had a rethink regarding how it scores its album reviews – and our new guide to what the whatever-out-of-10 mark that accompanies each article actually means can be read here. 2014 is a big year for Clash – it sees us turn 10, and reach issue 100 of our magazine. It sees us further our reach into the digital market with our award-winning app, and develop this website that you’re looking at right now.

In many ways, it’s time to grow up – or, rather, to show the industry just how much we’ve grown already. Clash has perhaps been seen as small fry in the past, compared to longer-standing magazines sharing shelf space, naming no names. But the progression that we’ve made of late has been incredible – what other magazine has Elton John on board as guest editor, for one thing? The magazine looks and feels incredible – nobody can deny that. 2014 is going to be big, mark these words. This website is going from strength to strength, too, with big-name exclusives published regularly, and readership figures only ever heading upwards.

It’s because of this confidence in Clash’s marketplace position, in the value it offers readers as a multi-platform brand covering everything from breaking news to our own promoted events, that we’re happy to look at our own reviews section and make changes. Too many 8/10 scores in a single issue can be confusing – how are you, the reader, supposed to really know what is a brilliant album and what’s just a bit better than the one the band in question put out before? It’s time to shake things up and make the higher-scoring LPs stand out – which is what we’re doing from issue 93 (April 2014).

As part of this rethink, we’ve asked a handful of respected critics for their thoughts on scoring albums – and in some cases, not scoring them at all. After all, the numerical mark at the end of a review’s copy can often serve, to the reader, as reductive shorthand, rendering what comes above it needless. Or does it?

- - -

“I sometimes wonder if anyone bothers to read a three- or four-star review of a band they’ve never heard of,” says the Guardian’s head rock and pop critic, Alexis Petridis. And his point is one that Clash’s repositioning of its scoring structure hopes to address: that too many high scores in too short a space of time can stunt the chances of exploration. Personally, I can remember checking out a good few LPs based on the combination of high ratings and the reviewer recommending them. Nowadays, I don’t think that regular readers of NME, Q or Kerrang! have that same connection with a writer. Opinions expressed are those of the publication, of course, not the individual, whatever an editor might argue otherwise. Metacritic doesn’t divide by scribe – it breaks down by brand.

And it’s review aggregator sites that are driving scores in attention-seekingly disparate directions – rate an album highly and your review will appear at the top of Metacritic’s pages. Score it appallingly, and it’ll still likely feature in a prominent position, while also representing a red rag to the bullish fanbase of whoever the act in question is. Land on Metacritic’s main page for Beady Eye’s ‘BE’ LP of 2013 (link), and you’ll see a bunch of average marks for an average album. Clash’s is at the top as the most positive – look, we’re changing the system, okay? – but down below is a shockingly red 0/10, slapped on the Dave Sitek-produced set by Drowned In Sound.

“I think the review argued the case for the 0/10 score,” says DiS editor Sean Adams. “It was by a young writer perhaps trying to make a name for himself, who grew up as a music fan in a time when Oasis were the biggest band in the world, who writes for a site that loves TV On The Radio, being offered the chance to review a record that on almost every level was abysmal. The review wasn’t meant to be serious – but it stirred up a lot of debate, and is one of those moments that has made me rethink our editorial process.”

So trashing an album by a popular band isn’t just clickbait tactics, then? Not for Adams: “I woke up to an utter shitstorm (when that review had run). Some reactions were really aggressive – so much so you’d have thought I’d just come out as a Tory. I was taken aback by the reaction from people who believed this was something desperate that we had considered doing as a way to drive up traffic. But in fact we got more traffic from a thread full of fish puns (on the social board) than that review – and getting a few cheap clicks isn’t a priority for us when we get millions of page views a month without needing to lower ourselves.”

Low scores can be attractive reads, though, as Luke Turner, associate editor at The Quietus, explains: “I’m always going to be more interested in a 1/10 review than a wheedling 5/10.” Says the site’s editor John Doran: “Looking at the situation from the outside, if I were buying albums based on scores I’d need to see a clutch of 9s and 10s across the board. Face it, when any of us see a 5/10 we automatically think ‘dogshit’, not ‘average’. This trend is only following in the wake of what happened in computer games criticism in the ‘80s, where scores essentially started at 7 and peaked at 10.”

But scoring in the videogames market is hugely scrutinised today. Look at the perfect 10 IGN awarded the game Uncharted 3: Drake’s Deception (link) – and then look at the comments below. “This review proofs that there’s no such thing as independed game journalists, magazines or sites… the revieu is even more ridiculous than the game…” The typos are the original poster’s own, so furious were they that they couldn’t use a QWERTY keyboard properly.

Ben Maxwell reviews games for Edge, and says: “An 8/10 in 2014 is very different to an 8/10 of 10 years ago, due to the exponential improvement in technology and coding trickery. In music, while there is technological progression in the quality of production, the margin between an album released in 2014 and one released in 1964 is much slimmer – a good tune is a good tune, but a game with controls acceptable 30 years ago might not be as enjoyable today. Personally, I tend to ignore the number on music reviews and focus on the writers who I feel do a good job of conveying a sense of the music, and who make meaningful comparisons.”

Turner echoes the sentiment: “I’ve never bought a record based on a score – I think that’s quite weird. I’d always go with the writer, and if it’s something like Q or NME bigging up a Muse album, I don’t believe it because that’s what they do. I follow writer by-lines and personal recommendations, and that’s about it.”

All of which will be music to Doran’s ears, as he says: “Really, it would be better all round if everyone ditched scoring. But scoring is too useful to advertising and marketing departments. It’s really the money people and lazy readers who are being serviced by scores.” It’s worth noting here that The Quietus does not score its album reviews.

Stevie Chick, a freelance critic of no little repute (for MOJO, the Guardian, the BBC, NME), author and all-round good egg, agrees with Doran. “I think number grades on album reviews are a really bad idea. They encourage laziness on the part of the reader – they encourage the reader to see (critics) solely as Consumer Guides. They reduce the review to a simple, binary ‘should I buy this or not?’ I’ve come to believe that your typical reader is unlikely to be convinced to buy something we rate as 6/10 – even though many records that I think are worth a punt end up reading like 6/10 reviews. Why are we discouraging people from going to the trouble of reading the actual reviews?”

Doran’s point regarding scored reviews being of most interest to marketing departments is certainly one evidenced by posters proudly bearing four-star ratings – often without any qualifying quotes to back them up. “You see a lot of film and music posters with star ratings on them rather than quotes, don’t you?” says Petridis. “I hate them, I think they’re totally reductive and false. Nobody other than a rock critic sits there listening to an album and thinks, ‘Oh, I’d rate this as four out of five.’ But I suppose that is useful to advertisers.”

Adams doesn’t quite see scores in such a negative light. “I think scores can help focus the mind of the writer, as well as making things easier to consume for a reader. I’ve always flicked through magazines, and now I find myself skimming through other websites in search of something intriguing to read, and scores can be something that helps me stop to read (that piece). Of course, the scores are useful for Wikipedia, Metacritic and aggregators like that, which generate traffic, so I can’t really see much harm in them.”

So if scores are so important to traffic – clickbait-style zeroes aside – why doesn’t The Quietus grade its reviews itself? (Scores are assigned to its reviews on Metacritic and Any Decent Music.) Explains Doran: “It was born out of sheer bloody-mindedness. From the start we insisted on doing the opposite of what we were told we ‘had’ to do by web guru consensus. So just as we went long-form, before that was a ‘thing’, we discarded scoring.

“I think we only regretted it for a couple of weeks, early on, when we thought we’d be ignored by Metacritic. But we’ve never thought about reintroducing them. And luckily it helped us slough off the irritating tl;dr crew early doors. We wanted to make lazy readers with short attention spans as unwelcome as possible. We’re not into pandering to people who aren’t like us. I’m not saying we’re right and they’re wrong, but we wanted to make a website that we’d want to read.”

The core concern that has driven Clash to rethink its scoring strategy is that too many reviews “indicating universal acclaim”, to borrow Metacritic’s term for its 81/100-plus category, were reaching the magazine. Doesn’t that make for a minefield for readers, who struggle to identify which new band is the best of that month’s bunch?

“I think a slew of 4/5 or 8/10 reviews in a magazine reads white,” says Petridis. “I think it would be interesting to see what the reaction would be if a magazine adopted a tighter scoring system across the board. I’ve no idea how PRs would take it, though.” I guess we’ll find out when the April issue of Clash is published. And it could be much worse – Pitchfork’s rating system, like IGN’s before it, is based on 100 possible points. Says Doran: “These marks have pretensions to scientific method, which is utterly preposterous. There isn’t a single person who has ever worked there who can explain what the difference between 8.3 and 8.4 for an indie rock album is, even if their lives depended on it.”

Personal opinions on the pros and cons of scoring aside, though, it’s generally recognised that rating records is a necessary part of the process – even if out-of-10s are assigned by an independent aggregator rather than hosted on the original site. So what actually qualifies as a 10/10 album? Is it even possible to award maximum marks to a new release, without seeing how one’s connection to it changes over time? Most reviewers only spend a few days with an album before covering it – and sometimes, as private playbacks and limited-repeat streams become annoyingly commonplace, not even that much time is available.

“If it was up to me, I’d be more generous with my 10/10s,” says Chick. “But then I don’t see my job as being a critic so much as a filter – someone questing to find good stuff, stuff to celebrate, stuff to revel in. I’m less interested in telling you that the new Oasis album is predictable garbage. Hindsight helps, but I’d argue it’s a moveable feast – my opinions on records change and evolve over time, but the only aim for a review is to be right for that moment in time. We can’t manufacture hindsight, we can only be honest in the moment.

“I’d say the East India Youth album, ‘Total Strife Forever’ (Clash review) was deserving of 10/10, although I only gave it 4/5 in MOJO. I didn’t really push it at the time, but it’s a record that has really touched me, which I’ve played incessantly since receiving – and, indeed, since reviewing, which isn’t always the case. It’s an album I want to listen to again, just moments after it’s finished playing.”

“I thought 2013 was a really good year for music, the best in ages,” says Petridis, “and I gave quite a few 5/5 reviews. Most recently I gave 5/5 to the reissue of Run The Jewels (Clash review). But it’s a good point about how you feel about an album changing after you’ve lived with it a while. Ultimately, all reviews are more or less based on snap judgements. There are albums I really love, that took me six months to get into. You’re never going to be able to replicate that as a critic.

“That said, I think listeners are more inclined to make snap judgements about music these days, just because of the sheer amount of it that’s available to them. We live in a Spotify and YouTube and illegal downloads world – the way I grew up with music, as a teenager, was that you could probably afford one album a month, and that was it. You had to try to get on with it, because you weren’t getting another album for weeks. That doesn’t apply anymore.”

Doran says he’s not heard what he deems to be a new-release 10/10 since becoming a music writer. “It’s probably ‘White Blood Cells’ by The White Stripes, or ‘Turn On The Bright Lights’ by Interpol – both good records which I no longer listen to, but which will always have a place on my shelves. Actually, the last album I gave 10/10 to was ‘Monoliths & Dimensions’ by Sunn O))), in 2009, for Drowned In Sound (link). This is a decision I’m perfectly happy with. In fact, it now ranks in my top 10 albums ever.”

Adams names The Antlers’ ‘Burst Apart’ and Sharon Van Etten’s ‘Tramp’ as personal recent perfection: “Both sucked me into their somewhat gloomy caverns, filled me with emotions and shot me out feeling slightly different every time… They both glisten and ache, in the best possible way.” But does a score ever inform the review that precedes it? Can a writer, a reviewer, really know the grade at the article’s end ahead of reading back their own words?

“I usually find the process of committing my thoughts to the page and setting up an argument for or against a record informs the final score,” says Adams. “But I usually have a vague idea if it’s a 2/10 or an 8/10 following so many listens to an album.” Petridis works to a comparable pattern: “I have an idea of what I’m going to give an album when I listen to it, but it tends to be formalised at the end. What does the review read like? Does it seem like a three-star review, or a four-star review? I kind of hope that the review leads the score, rather than the other way round, if that makes sense.” Says Chick: “The number isn’t set until I’ve finished reviewing. I’ll have an inkling, but anything can change until the review hits the editor’s inbox.”

And so we’ve come back around to where we began: does the review matter, ultimately, when there’s a score on the piece? According to these prominent critics: yes, as without the writing process the score can’t be fully realised. Newer reviewers can seem overly eager to award high marks, but that’s part of evolving as a voice worth the reader’s attention – the more you write, the more context you gain, and the easier it becomes to really distinguish a 6/10 from a 7/10. Proper editorial guidelines help, but ultimately it’s the writer’s own ever-changing moods that most inform the mark handed to any release – the mark subsequently passed to marketing for public-facing promotion.

“I like to say, ‘Look over here, there’s something amazing, you might like it!’” says Chick. “But that’s just how I like to roll.” Perhaps we should all go tumbling after, more often than current scoring etiquette allows us to.

- - -

Clash’s new guidelines for album review ratings come into effect for the April 2014 issue (issue 93). Find them here.

In the coming days, Clash will publish Ten 21st Century 10/10s – albums that comprise critical foundations for everything we choose to write about.

Buy Clash magazine
Clash on the App Store
