In sociology class last week we had a discussion about why so many countries seem to dislike America. During this discussion I heard two good theories. One was that America is too involved in other countries' business and is viewed as arrogant and overly self-righteous, and in some cases as trying to change the way other countries are run. Another person in my class suggested that countries with a strong religious foundation may see America as a land of evil, because we do a lot of things that go against their religion and they don't want our 'corruption' to spread to them. These are just theories that came up in class that I found interesting, but I want to know: why do you think so many other people or countries seem to dislike America?