God Save The Queen.

God Save The Queen. "They just look so.... American."

That was my mother's response when I excitedly showed her a picture of one of my favourite bands: Avenged Sevenfold. Don't get me wrong, my mum is just as much of a metal-head as my brother and I. Still a little confused, I asked her what she meant. Her reply was that they looked 'clean, buff and well-turned-out'. Since when did it become 'American' to take pride in your appearance? Since when did having tattoos and piercings become 'American'?

It seems to me that Britain has started to adopt a 'love to hate' attitude towards America, and nothing slips under the radar. Everything from the clothes we wear to the music we listen to to the magazines we read is 'just so American'. There is a strange hostility towards American musicians, particularly the newer rock bands that are popping up everywhere. The originals, like Guns N' Roses, are left totally untouched, despite the fact that they are part of the sex, drugs and rock'n'roll generation.

Was it their image that started this anti-American trend?

Maybe people here are resentful of the fact that Britain is under-represented, even in our own magazines; take Kerrang!, for example. Pick up Kerrang! most weeks and there will be an American band on the cover. Flicking through its pages, you'll find glossy shots of American artists. Don't get me wrong, I don't have anything against Americans at all; a lot of my favourite bands are from across the pond. However, I do have a problem with people constantly telling me that my tastes are 'too American'. I mean, what else am I supposed to do? There is absolutely no way in the world I'm going to start shopping at Topshop, and I'm constantly surrounded by American influences.

So what if I want to get snakebite piercings? Why does that suddenly make people tell me I'm so American? Why is it such an American thing? And why is it so bad?

It's like everything alternative has been cast under the shadow of Americanism, whilst everything 'fashionable', despite its roots, remains perfectly acceptable.

Why has it become such a bad thing to be American? It's not that I'm unhappy being English; I'm totally unashamed of my heritage. It just irritates me that people are so disrespectful towards our neighbours.

We can't escape it, either. Turn on the TV and at least half of the channels come from America. Some of the most popular channels, MTV and E! for example, are American, and yet we still look at them with disdain. Perhaps we have tried to turn it into a negative thing because we have nothing better to do. (We English are famed for complaining!) Or maybe our parents are just bored.

I have no idea why we dislike 'Americanism' so much, but I do know that I'm sick of being told how bad it all is!