You have an odd sense of the word respect. I don't think many people respect the USA at all. Respect is earned. What has the US done in the past little while to gain respect? Countries are following along for their own political reasons. The US has a lot of power and the government thinks it's above world law. You can't fuck around with them. Since the USSR split up, the US has been the only true world power.
Seriously. No respect there. Fear is a better word.
Let's see, in reverse chronological order for the last century:
Aid to basically every poor and starving nation on the face of planet Earth. (ongoing)
Innumerable medical and scientific advances. (ongoing)
Putting up with a bunch of ungrateful Euroweenies, dictator shills, and anti-Semites in New York City, otherwise known as the UN. (ongoing)
Massive aid to the tsunami victims in Indonesia, from the pockets of individual Americans no less.
Integral in ending the Cold War
Integral in ending World War II
Integral in ending World War I
Yeah, America has never done anything for the world. Except save Europe's ass every time those ungrateful Euroweenies are getting it handed to them by the Germans or the Russians. When Europe needs something done to further its interests in foreign lands, who do they send for? American troops.
America is the only hyperpower in world history that takes this kind of crap. We do it because we're better people than the ungrateful bastards we begrudgingly (sometimes) call allies. Do you really think Stalinist or Fascist heirs, had they won the aforementioned conflicts, would sit back and let a bunch of irrelevant, jealous stooges attack their society and culture on a daily basis if they had the kind of power the US currently enjoys? We had to threaten to blow the whole damn world up with the Soviets until, thankfully, their economy collapsed from within.
I should have added a fourth category to my list of people who hate America: The historically ignorant.
I admit that America's education system is a joke (and I place the blame for that squarely on a system run by unions and political correctness), but only those completely ignorant of world history over the past century would have the gall to ask, "So what has America done for us recently?"
What the hell has Europe or anyone else done for us recently, other than send us off to fight their wars and their battles and then attack us for being the "World Police"?
If it weren't for America, we'd be typing online right now in German or Russian, if at all (Americans invented the internet).