The first big article on Vox, Ezra Klein’s new journalism venture, was Klein elaborating on a study about how people project their biases onto data, even when the data itself refutes those biases. Giving people information actually makes them even more likely to cling to their views. I believe he wrote about this same subject on Wonkblog years ago; I can’t find it, though this article by Plumer mentions the result.
Krugman then commented that although the study finds liberals and conservatives equally bad at projecting their beliefs onto data, they are in fact different. He cites climate change, Obamacare, and the 2012 presidential election as evidence that conservatives believe whatever they want to, with no equivalent rejection of reality by liberals. Of course, Krugman is just projecting his liberal bias onto history, right? And of course I agree with him, because I am also liberal! How can we possibly address this question objectively if we are all suspect?
I can’t answer that. What I can ask is: how did people arrive at the hardened ideological views that seem imperturbable by evidence? And here is where I think the difference lies. I think we should do something about climate change because all the evidence suggests it is real and that it is disastrous. I believe in gun control because I dislike people getting shot, and gun control seems to prevent firearm deaths in the other countries that use it. And universal healthcare is cheaper and covers more people than the American system of private insurers, as demonstrated by the multitude of countries that run such systems.
Now, I am educated and have a lot of free time to read about these things, but a lot of people rely on affiliations with political parties, community organizations, and the like to form their views, rather than on hard data. That’s why liberals have a small subset of people with irrational fears of genetically modified foods and vaccines (though I think conservatives contribute members to these groups too), despite no evidence that either is harmful. But by and large, the liberal establishment, or elite, or whatever you want to call it, seems to arrive at empirically backed policies. This follows because liberals want government to do things and, more importantly, to do them well, which requires some form of self-analysis.
Conservatives, on the other hand, believe (and I frame this in the most generous way possible, ascribing sincerity to what they say) that we should just leave people alone. This view does not encourage an empirical mindset but rather a destructive paradigm interested only in tearing down any attempt by society to impose a burden on an individual. Facts are only useful if they agree with you. You can see this in the fact that Republicans would disagree with Obama if he said the sky was blue, while you never see Democrats engaging in the same degree of opposition. Democrats are a positive party engaged in doing things, whereas modern Republicans are mostly defined by their opposition to Democrats.
Now, once you hold these views, however you acquired them, be it through data, affiliation, or some other route, Kahan’s study comes into play. We all use heuristics, smarter people probably even more than average. So if you come to me and ask whether these results support gun control, I “know” that they do because previous studies or my friends have said so. Furthermore, we also suffer from a confirmation bias that is particularly strong when it prevents cognitive dissonance about ourselves. Essentially, we discard or rationalize information that conflicts with who we think we are. If climate change is wrong, then I made a bad decision, or was fooled by scientists, or picked the wrong political party. But I am smart and clever and make good decisions, so there must be something wrong with any data that disagrees with me.
Taken this way, Kahan is really just repeating an already well-studied effect (confirmation bias and belief perseverance in the face of inimical evidence) and perhaps showing just how far we will go, even ignoring fairly easy math, to rationalize our beliefs. It is how we arrive at those beliefs that differs. Then again, we run into the primacy effect: why should earlier data hold more weight? I wonder how the human race manages to accomplish anything with all these irrational behaviors.