Survey Says: You Stink

Feb 19th, 2009 // 10 Comments

Remember in Dead Poets Society where they try to analyze poetry via graph and then Robin “O Captain My Captain” Williams tells them to rip up their books and then Eric Forman’s dad, Clarence Boddicker, makes House’s friend kill himself? Well, people keep trying to quantify musical quality a la J. Evans Pritchard, Ph.D., with varied results. On one hand, you have Pandora, which uses its Music Genome Project to match up artists with similar artists based on a few hundred fairly objective measures—does it have acoustic guitar, does it have female vocals, does it feature distorted guitar, that sorta thing. I think we can all say that it’s reasonably successful, though aberrations do occur, and it often presents a flawed, monolithic view of an artist’s work, which is OK for the Ramones or Jesus and Mary Chain, but less successful for, say, Prince or the Beatles. Do you mean the “Sign O’ The Times” Prince or the “I Could Never Take The Place Of Your Man” Prince (same album!)? Do you mean the “Rocky Raccoon” Beatles or the “Helter Skelter” Beatles (same album!)?

Music Xray is a “social-network-like music submission and review utility” that will try its hand at the American Bandstand method of judging music:

1. It enables music industry professionals to earn a bit of revenue through the music submission and review process.
2. It enables song owners to get legitimate feedback quickly and inexpensively via an environment where they can review the reviewers.
3. It exposes songs (in some cases) to the people that can connect songs to exposure opportunities.
4. It gives Music Xray the best way to get a ton of songs professionally tagged, which will help to make all the songs in our system easier to find.

I think it works like this: you submit your song and then a bunch of “experts” or “peers” give feedback on your song, both open-ended and based on pre-selected parameters. They call these “canned reviews.” The relative quality of various aspects of the song is placed on a Likert scale of sorts, and people grade them accordingly. Sample critiques:

Critique Mix and Production
Production/mix is too thin
Production/mix is too full
Sounds muffled
Sounds tinny
Lead vocals too loud
Lead vocals too quiet
Backing vocals too loud
Backing vocals too quiet
Lead instrument(s) is (are) too loud
Lead instrument(s) is (are) too quiet
Dislike lead instrument effects
Drums too loud
Drums too quiet
Bass too loud
Bass too quiet
Keyboards/piano too loud
Keyboards/piano too quiet
Other instrumentation is too loud
Other instrumentation is too quiet

Performance Critique
Overall performance needs improvement
Lead vocal needs improvement
Try a more appropriate key
Backing vocals need improvement
Lead guitar needs improvement
Instrument solo needs improvement
Percussion needs improvement
Beat drags
Beat is top heavy
Bass needs improvement
Keyboards/piano needs work
Horns need work
Strings need work

Composition Critique
Chorus melody not compelling
Chorus melody too repetitive / basic
Chorus doesn’t resolve / resolves poorly
Verse melody not compelling
Verse melody too repetitive / basic
Verse doesn’t resolve / resolves poorly
Bridge melody not compelling
Bridge melody too repetitive / basic
Bridge doesn’t resolve / resolves poorly
Middle eight needs work
Overall tune does not stay with me
Lyrics need work
Lyrics aren’t compelling

As a semi-believer in Platonic ideals and such, I admire these attempts to quantify the seemingly unquantifiable, and I also like the idea of empiricism in art, mostly when it makes me right and you wrong. However, like any research design or survey, this list comes with its own set of built-in biases. First off, it seems to be based on songwriting and arrangement assumptions that favor the traditional and the rock and roll—not every song has a bridge or even a chorus (see: lots of the New Order catalog). Do we really want to encourage songwriting cliches like the middle eight or the bridge? And while I’m a bit of a pitch and performance stickler, I know bands that would fail these categories and still be great (see: lots of the New Order catalog). What makes lyrics compelling? I love plenty of songs with atrocious lyrics (see: lots of the New Order catalog), and I often bristle at poetic ones. Great music can be poorly played by amateurs with ridiculous lyrics, and even seasoned pros can make mistakes. I’ve heard both Bill Bruford and Bill Ward bobble some beats before, but it didn’t make them any less rad drummers.

The site hasn’t launched yet, and it’s a work in progress, something that Bruce Warila of Music Xray acknowledges. In fact, he is taking comments and suggestions here. The concept of a social networking site based on criticism appeals to me, but the measurement could use a lot of work. In good research design, it’s always better to ask more questions than fewer.

My song sucks? You must be an idiot! What’s the right way to review music? [Music Think Tank]


  1. Marth

    Put a song through this process enough and it will come out the other side as a sine wave.

  2. Lucas Jensen

    @Marth: That might sound interesting.

  3. Anonymous

    There’s good music and bad music.

  4. Halfwit

    @juiceandgin: You’ve clearly never been involved in a writer’s workshop.

    And, yeah… it’s amazing how amazing New Order are (were?) considering how much the components would appear to suck on paper.

  5. doublewhiskycokenoice

    wait, that was repping ‘rocky raccoon’, right?

  6. Anonymous

The Most Wanted Song and The Most Unwanted Song:


  7. Lucas Jensen

    @MhS: I have always loved the Unwanted one WAY more.

  8. Lucas Jensen

    @Halfwit: Oh, New Order are in my top 5 ever, and yet they fail in so many categories that I think are important: pitch, lyrics, tempo, etc.

  9. Poubelle

    @juiceandgin: And there’s a foolproof way to tell the difference: good music is anything I like and bad music is anything I don’t like.

    On a serious note, those are some great tags.
