This one comes up from time to time, so let’s consider: “America is a center-right nation.” In some sense, this is probably right. Yglesias, a year ago: “I would go stronger than that, actually, and posit that American politics in the future will mostly be dominated by a center-right political coalition just as it always has. This is just how things work. A political coalition grounded in the social mores of the ethno-sectarian majority and the ideas of the business class has overwhelming intrinsic advantages against contrary movements grounded in the complaints of minority groups and the economic claims of the lower orders.” (But is that too strong? Was the U.S. a center-right nation at the height of the New Deal?)
But there are clear senses in which it is not right that the U.S. is a center-right nation. For example, it’s at least odd to have a center-right nation that lacks a center-right. There aren’t that many Olympia Snowes around – not even Olympia Snowe herself, during this whole health care business. It’s not as though America is a country where, when you elect a guy like Obama, you have to beat the center-right off with a stick, compromise-wise – a country where the center-left is plainly crying out to meet somewhere in the middle.
I have my own thoughts about this, but I’ll just throw the question out. How is it possible, and what does it mean, to have a center-right nation, ideologically and electorally, that lacks a center-right, ideologically and electorally?