January 4th, 2013 at 11:20pm
Of course, America is mostly no worse than any other country. Everyone notices its flaws because it is one of the most important countries in the world - seriously, a lot of other 'foreigners' and I appear to know more about the country than some Americans do (that's not a stab at Americans, just an emphasis on its importance). If so many people do not even know which continent my country is on, how are they supposed to know that country's negatives? Being in the spotlight has to have downsides too.
I think another problem is that America is often presented as a very developed country. Whilst this may be true in an economic sense, people don't realise that racism/sexism/xenophobia etc. still exist there, so they are shocked to find out. It honestly shocked me. The whole 'American dream' still exists in a way, and it's pretty disconcerting to know that even in one of the most economically and socially developed countries in the world, you can still face the same prejudice and discrimination.
The only problem I've had with individual Americans is ignorance about the rest of the world. I think Americans don't really 'need' to know much about the outside world unless they plan on living abroad or having a career that involves travelling. It's not a huge issue with most people, but some of the things I hear leave me stunned.
also, just bless these comments! i thought i was going to get a lot of hate for this, honestly.