
Democracy and false information: some bad news

A recent study by Nyhan and Reifler has received quite a bit of attention. The study
aimed to assess how people’s beliefs change in response to evidence. The researchers gave participants mock news
stories which contained mistakes (for example, they claimed that WMDs had been
found in Iraq). They also included in some versions of the story a correction.
They found that subjects who received false information followed by a
correction actually believed the false information more than those who received
no correction. Given that we want people to be able to make informed decisions
when they vote, this study is bad news. It suggests that people tend to believe
what they want to believe, without much regard to the facts. The effect was
greatest on those most partisan: those who wanted to believe that WMDs were
found were left with a stronger belief than ever.

What I want to highlight is that this study is just one of
many. There is a well-established body of research on the sleeper effect.
Suppose I give you a claim together with a “discounting cue” – some reason to
regard the claim as highly unreliable (e.g., I tell you that I read that
according to BP, oil is good for the marine environment). I then ask you whether you
believe that oil is good for the marine environment. Presumably you will say
“no”. BP would say that, wouldn’t they? The bad news is that in a few weeks’
time the claim and the discounting cue become dissociated. You now only recall
that you heard somewhere that oil is actually good for the marine environment
and you come to believe it. There is also the familiarity effect: repeat a
claim often enough and the likelihood that subjects assign to it goes up.
And there is the fact that taking a claim to be true seems to be the cognitive
default, so that giving subjects claims and then distracting them is likely to
leave them believing the claims.

It is tempting to see these facts as bad news for democratic decision-making. Against this, some people have pointed to the fact that groups of deliberators are, under certain conditions, better than individuals. The catch is that those conditions are hard to realize. Consider the much cited Condorcet jury theorem. The
theorem proves that the larger the group of deliberators tackling a problem,
the more likely the group decision is to be right. However, the theorem
holds only under two assumptions: the probability that each deliberator is
right must be above 50%, and the deliberators must be independent
of one another. Clearly, political deliberation is not independent: instead, we
are all influenced by the same pool of information and misinformation. In these circumstances, the propensity of groups
to produce good decisions is likely to be swamped. Whether the effects are
large enough to ensure that democracies do not, or often do not, elect the best
governments is hard to measure (though whatever your political views, you will
probably agree that it is hard to think that the electorate always gets it right).
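The mechanics of the theorem are easy to check numerically. Here is a minimal sketch: the group sizes and the 55% individual accuracy are illustrative assumptions, not figures from the studies discussed.

```python
from math import comb

def majority_correct(n, p):
    """Probability that a majority of n independent voters,
    each right with probability p, reaches the correct answer
    (n odd, so there are no ties)."""
    # Sum the binomial probabilities over all winning majorities.
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# With p only slightly above 0.5, group accuracy climbs toward
# certainty as the group grows:
for n in (11, 101, 1001):
    print(n, round(majority_correct(n, 0.55), 3))
```

Note that the same formula shows the flip side: if p falls below 0.5, larger groups become more likely to get it *wrong*, which is why the theorem's assumptions matter so much.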

This work leaves us facing a difficult problem: what to do
about misinformation? Recently, the asylum seeker debate has resumed in
Australia, and our favourite far-right ex-politician, Pauline Hanson, has
popped up to tell us that refugees receive preferential medical treatment and
benefits much bigger than those available to native-born Australians. These claims are, of course,
false. But correcting the misinformation may do more harm than good: it will likely increase belief in the claim
among partisans sympathetic to Hanson, and even among those who are not partisan the sleeper
effect may ensure that the discounting cue becomes dissociated from the claim.
Correcting misinformation may disseminate it more widely, without decreasing
its credibility. If you don’t agree, maybe you’d better just keep silent. 

Share on

2 Comments on this post

  1. There is a study into Genetically Modified Food or Nano Technology -don’t remember which- that said the same thing. People like to validate the views they already have rather than change them with evidence. Throw in confirmation bias, where people would rather believe an attractive, confident individual than someone with the facts, and the Dunning–Kruger effect, where they cannot even tell that they are incompetent, and all sorts of odd views, from our asylum seeker debate to global warming, become comprehensible.

    What was that about democracy being the best of a bad lot?

  2. Neil: there is a weaker theorem than Condorcet’s that can defend the accuracy of groups. Provided that the errors of the ill-informed are random, it is not necessary for the deliberators to have a better-than-50% chance of being right (nor need they be independent, although dropping independence introduces some complications). In such circumstances even a tiny minority who are well informed can result in a vote getting it right. For example, suppose 98% are ill informed. On average 49% of them will vote in each direction, while the 2% who are well informed vote in the right direction, leading to 51% of the vote going the right way.
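The arithmetic in this comment can be checked with a quick simulation. This is a sketch under the commenter’s assumptions: a 2% well-informed minority that always votes right, with everyone else flipping a coin.

```python
import random

def simulate_election(n_voters=100_000, informed_share=0.02,
                      trials=20, seed=1):
    """Fraction of trials in which the correct option wins a simple
    majority: informed voters always vote right, the rest choose
    at random (errors cancel out on average)."""
    rng = random.Random(seed)
    n_informed = int(n_voters * informed_share)
    n_uninformed = n_voters - n_informed
    wins = 0
    for _ in range(trials):
        # Each ill-informed voter votes right with probability 1/2.
        uninformed_right = sum(rng.random() < 0.5
                               for _ in range(n_uninformed))
        if n_informed + uninformed_right > n_voters / 2:
            wins += 1
    return wins / trials

# With a large electorate, the 2% informed bloc swamps the random
# noise from the other 98%, and the right side wins essentially
# every time.
print(simulate_election())
```

The catch, of course, is the randomness assumption: if the errors of the ill-informed are correlated (as the post argues they are, via shared misinformation), they no longer cancel, and the informed minority can easily be outvoted.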

Comments are closed.