What are your opinions on the United States of America in the world today? Do you think it's a just society in which liberty and freedom are its primary virtues? Or do you think it's a bastion of evil filled with all of humanity's vices, intent on destroying and conquering the world a little at a time? Or are you one of those "green" people who think it should be leveled and returned to its proper environment (i.e. non-human)?

Where do you think the United States belongs in the political landscape today?