American Politics 101
I'm hoping that some of you can help me out here. I hear a lot of hatred in the media between Republicans and Democrats, with terms like "liberal" and "conservative" spat at people as if they were insults.
Yet compared to most other countries in the developed world, both your parties are right wing: you don't have a social democratic tradition on any scale like the one we have across Europe and Australasia.
So if the two sides aren't that different, where does the hatred come from, and what is apparently making it so much more bitter as time goes on? We in Britain have just (not) elected the most right-wing government in 20 years, and yet Obama does some things so far to the right that our government would wince in pain rather than adopt similar policies.
Is there any political movement of the left in America? I mean left of "liberal", which, where I come from (and from what I see of the Dems), is centre-right?