Sunday, August 31, 2014

Fight With Your Friends: The More You Agree, The More Likely You're Factually Wrong

(Photo: Turkish lawmakers brawl in Ankara on August 4)


The Death of Epistemology



The more you agree with people on your side of political debates, the more likely you are to be wrong about the facts.
One annoying effect of extreme political polarization is that people on different sides can't agree about facts. Americans' opinions tend to coincide with those of others on their own end of the political spectrum about things such as whether there was a cover-up about Benghazi, whether the IRS has illegally targeted right-wing groups, whether Obamacare is hurting the economy, the extent to which fracking presents an environmental threat, and about many other factual matters. Certainly, Democratic politicians, activists, and spokespersons very often agree about things like that, and so do Republicans.

But though we have come to expect this, we ought to regard it as discrediting to people on both sides, as strongly suggesting that they are not committed to saying what's true. Relentless political partisans—Reince Priebus or Debbie Wasserman Schultz, Charles Krauthammer or Michael Tomasky, Rachel Maddow or Sean Hannity, Hillary Clinton or Marco Rubio—should at this point be regarded as having next to no credibility on factual questions.

Commitments with regard to values—to justice, or equality, or liberty—define your position on the political spectrum, or even what we might call your political identity. Such things may be among your most intense beliefs, and motivate many actions. But with regard to most factual questions, such things are just irrelevant.

Philosophers traditionally draw a distinction between normative and factual claims. We might distinguish them by noting that different sorts of reasons count for or against assertions about facts and those about values. Whether it would be a good thing for it to rain on Saturday in Fresno, or whether it would help us make progress if it does, is irrelevant to the question of whether it actually will rain Saturday in Fresno: That depends on whether drops of water actually fall out of the sky.

In sincerely trying to find the truth about factual claims, it is often important to decide whom to believe. To do that, it’s important to figure out who is advocating something on the basis of evidence. The people most likely to be sensitive to evidence—and therefore most worth listening to—often disagree with the consensus of the people around them on factual matters. We ought provisionally to regard people who frequently act as dissidents, heretics, and pariahs in their own political group as being more committed to speaking the truth than people who usually or always agree with the consensus. A person who diverges from the consensus of the people with whom she agrees politically may have other problems of credibility. But she does not start off with this one.

In order to build an argument for this, let's conjure some imaginary situations. First, imagine that we are psychologists, and for whatever reason we are conducting a study of the belief systems of people with green eyes. It turns out that they all believe that cutting taxes increases revenue. We'd be stunned by the arbitrary unanimity. But now imagine further that it turns out that green-eyed folks unanimously share many other beliefs: that there is a highest prime number, for example, that there is extraterrestrial intelligent life, that it will rain next Saturday, and that there was no cover-up about Benghazi.

One thing we could certainly conclude is that, whatever they might say about themselves, many green-eyed people are not basing those beliefs on the evidence. In each case the evidence is either split or, as with the highest prime (there provably is none), conclusively against them, so we'd expect green-eyed people to disagree among themselves, or at least to split roughly the way the population as a whole does. Having green eyes is completely irrelevant to rationally assessing the effects of tax cuts on revenue.
Something is making the green-eyed people believe these things, we must suppose, but it is not having reasons. Maybe it's genetic, or maybe having green eyes is associated with some sort of neurological glitch.
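A back-of-the-envelope calculation shows just how stunning the unanimity would be. The function below is my own illustration, not the essay's, and it assumes each green-eyed person independently weighs the evenly split evidence; under that assumption, the chance that a group of any real size ends up unanimous is vanishingly small.

```python
def prob_unanimous(n, p=0.5):
    """Probability that n people, each independently concluding 'yes' with
    probability p from the split evidence, end up unanimous: either all
    believe the claim (p**n) or all reject it ((1 - p)**n)."""
    return p**n + (1 - p)**n

# Even a small group of evidence-weighers almost never agrees completely.
print(prob_unanimous(10))   # about 0.002
print(prob_unanimous(100))  # about 1.6e-30
```

So when every green-eyed person in the study agrees on dozens of split questions at once, chance agreement among independent evidence-weighers is not a live explanation.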
Now imagine that all the leftists in Fresno think it will rain on Saturday and all the rightists think it will not. We'd find that as arbitrary as in the case of the green-eyed people and tax cuts. Whether you are on the left or right has no connection to having reliable information on whether it will rain next Saturday in Fresno.

And whether you are on the left or on the right is no more relevant to the question of whether there was a cover-up about Benghazi than it is about whether it will rain. Really, it’s not. Actual evidence here would concern such things as who knew what when or who communicated what to whom. Whether you lean forward or back, whether you think we need more equality or more liberty, more welfare programs or freer markets: These have nothing to do with what happened after Benghazi, or what the actual effects of fracking are, or whether restrictions on gun ownership reduce violence. If people on the same side with regard to political values agree about such controversial factual questions, it is very likely that they are generating these beliefs in a rationally arbitrary way.

I'll try to make the point clear with a mathematical proof, or at any rate some back-of-the-envelope calculations.

Suppose that a good assessment of the evidence that it will rain on Saturday in Fresno makes the probability 50 percent either way. If everyone's belief were responsive to the evidence, and since right-wingers and left-wingers have roughly the same access to it, we'd expect a roughly 50/50 split within each group among people forming an opinion. We can then provisionally estimate how likely it is that a group's beliefs are based on evidence by how far its actual split departs from that expected 50/50 split. In a case where all the right-wingers think it will rain and all the left-wingers think it will not, or vice versa, the departure is 50 percentage points: at most half of either group could have been led to the unanimous view by evidence that divides opinion evenly, so the rest must believe for other reasons. We should therefore infer that there's at least a 0.5 probability (on a 0-to-1 scale), with regard to any person in either group, that he believes what he believes because of factors other than the evidence, i.e., that his belief is rationally arbitrary.
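That estimate can be sketched in a few lines. The function name and framing here are my own, not the essay's; the idea is simply that the lower bound on rationally arbitrary belief is the distance between a group's observed split and the split the shared evidence would predict.

```python
def arbitrariness_lower_bound(observed_share, evidence_share=0.5):
    """Lower bound on the probability that a randomly chosen group member
    holds their belief for reasons other than the evidence.

    observed_share: fraction of the group that believes the claim.
    evidence_share: fraction the shared evidence would predict (0.5 here,
    since the essay stipulates the evidence makes rain a coin flip).
    """
    return abs(observed_share - evidence_share)

# All the right-wingers in Fresno think it will rain: a 100/0 split.
print(arbitrariness_lower_bound(1.0))  # 0.5

# A group that splits exactly as the evidence predicts raises no flag.
print(arbitrariness_lower_bound(0.5))  # 0.0
```

On this rough measure, a group that is unanimous on an evenly contested question earns the essay's conclusion: at least half of its members believe what they believe for reasons other than the evidence.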
