I'm a fairly left-wing kind of guy, further left than a lot of other people, but not extreme.

Why is this seen as a bad thing by a lot of people, most of whom are American (just something I've noticed)? I mean, right-wing views these days seem to be treated as normal and are never questioned, but as soon as you start talking about left-wing views, people look down on you and mutter "commie".

Why are left-wing views frowned upon now more than they used to be, considering the world (except America) is becoming more liberal? ("Liberal" is another word you can't use without being almost spat on.)

Why are "left wing" and "liberal" so hated these days?