
What is the chance of an MP being wrong?

When MPs took a maths exam it showed that members of parliament are pretty bad at elementary probability. When asked “if you spin a coin twice, what is the probability of getting two heads?”, 47% of Conservative and 77% of Labour MPs gave the wrong answer. About 75% of the MPs felt confident when dealing with numbers, although they generally thought politicians did not use official statistics and figures correctly when talking policy.
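
For the record, the exam question has a simple answer: two fair spins are independent, so the probability of two heads is 1/2 × 1/2 = 1/4. A minimal sketch of the calculation, with a quick simulation as a sanity check (the trial count is an arbitrary choice):

```python
import random

# Exact answer: two independent fair spins, each heads with probability 1/2.
p_two_heads = 0.5 * 0.5
print(p_two_heads)  # 0.25

# Monte Carlo sanity check: simulate many pairs of spins and count double heads.
trials = 100_000
double_heads = sum(
    random.random() < 0.5 and random.random() < 0.5
    for _ in range(trials)
)
print(double_heads / trials)  # roughly 0.25
```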

How should a rational person react to this news?

If your ability to make rational inferences about probabilities and likelihoods is poor, then your decisions will be irrational. Relevant information will not affect your decisions in the way it should, and they will be suboptimal and biased. So insofar as politicians are seen as decision-makers, this is very, very bad news. Many, if not all, government decisions involve reasoning under uncertainty and dealing with numerical data. Thinking about risk, in particular, requires a nontrivial understanding of probability.

Even worse, being a rational decision-maker doesn’t seem to influence your chances of winning a political position. The question about how other politicians use numbers shows that the MPs are likely aware of the sloppiness, but their own confidence suggests that they are strongly overconfident about their limited abilities. It is pretty likely that voters select for confident people rather than people who make the right decisions – or care about their rationality.

However, politicians also have a role as representatives and exponents of the preferences of the electorate. In this case the problem might be smaller: we can hope they set the right agendas, and then civil servants who know probability do the proper implementation. Unfortunately there is no reason to assume civil servants are really rational with probability either, and if the policy has been set on irrational grounds – perhaps due to misunderstood figures – then even a rational implementation will be faulty.

A final save might be that this is about framing and ecological validity. Maybe the politicians are bad reasoners about abstract probabilities but do a decent job with real-world likelihoods. As Leda Cosmides and John Tooby showed with the Wason card task, people are much better at reasoning about a social situation than an equivalent abstract one. So maybe the politicians would not think that a policy requiring projects A and B to both succeed has as good a chance of succeeding as one dependent on just one of the projects. But I have my doubts.
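
To make that last point concrete, here is a minimal sketch with assumed numbers (the 0.8 success probabilities are purely illustrative): a policy that needs two independent projects to both succeed is necessarily less likely to succeed than one that depends on a single project.

```python
# Assumed, illustrative numbers: each project succeeds independently with probability 0.8.
p_a = 0.8
p_b = 0.8

p_needs_both = p_a * p_b   # 0.64: both A and B must succeed
p_needs_a_only = p_a       # 0.80: only A must succeed

print(p_needs_both, p_needs_a_only)

# A conjunction can never be more probable than either conjunct on its own.
assert p_needs_both <= min(p_a, p_b)
```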

Having fallible, irrational and biased agents is not necessarily a problem for the collective rationality of a system if care is taken to design it to filter out these factors (consider how the Condorcet jury theorem, rewards tied to correct decisions, or peer review systems can improve things). But if the system is designed by agents that do not care about its rationality (because they do not see the importance, or because they are not rewarded for doing it), then it will likely just distil certain biases. Powerful systems will then promulgate and enforce those biases onto other individuals, even when the individuals would not have made the same decisions if they could have acted freely.
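
As a rough illustration of the Condorcet jury theorem point, here is a minimal sketch (the competence values, group sizes and trial count are assumptions chosen for illustration, not estimates of any real committee): with independent voters who each get a binary question right more often than not, the majority verdict quickly becomes very reliable – but if individual competence falls below one half, a larger group does worse than a single individual.

```python
import random

def majority_accuracy(competence, group_size, trials=20_000):
    """Estimate how often a simple majority of independent voters gets a binary question right."""
    correct_majorities = 0
    for _ in range(trials):
        correct_votes = sum(random.random() < competence for _ in range(group_size))
        if correct_votes > group_size / 2:
            correct_majorities += 1
    return correct_majorities / trials

# Odd group sizes avoid ties; the competence values are assumed for illustration.
for competence in (0.6, 0.45):
    for n in (1, 11, 101):
        print(competence, n, round(majority_accuracy(competence, n), 3))
# With competence 0.6 the majority's accuracy climbs towards 1 as the group grows;
# with competence 0.45 it falls towards 0 – the group amplifies whatever the individuals do.
```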

So a rational citizen would likely want to either reduce the power of the irrational government (assuming that decision-making capacity would then end up in the hands of people with individual incentives to be rational; there is no use in reducing government if power over you just moves to other irrational institutions), or lobby as hard as he can to make other voters put pressure on politicians to construct pro-rational decision-making procedures. Sounds like that goal would need a very catchy slogan to get anywhere.


6 Comments on this post

  1. Hi Anders,

    It does indeed seem bad that so many MPs got the question wrong. (And I wonder why the Labour/Conservative differential – maybe there’s an education difference? And does this give one some prima facie reason to support Conservatives over Labour?) Still, it might not be as big a problem as you indicate. Any given decision made by an MP is not made on his/her own. There are aides, advisors, civil servants and other MPs all providing input (and indeed the major policies they vote on aren’t really crafted by individual MPs on their own but instead by a large cadre of party officials). Because of this, the Condorcet jury theorem applies very well, I think. I suspect that if you offered them even relatively complex math problems and allowed them to take a couple of days and consult whomever/whatever they like, the results would be significantly more accurate. Biases will still get in the way of rational decision-making, but that’s bound to happen whether or not an MP’s math is good in the first place.

    However, you indicate that being bad at math (one form of irrationality) is correlated with being prone to biases (another form of irrationality). Is there much evidence to support this? If so, then the study is troubling indeed (and, again, prima facie reason to think that Labour MPs are more biased than Conservative MPs, and therefore more reason to trust Conservatives. Though it would be better to test such MP bias directly…). If not, though, for the reasons above I think this study isn’t very worrying.

    1. I like the idea of testing what the collective ability of government would be. That sounds like a very useful research project that ought to be undertaken: just how smart and unbiased is a government committee?

      I have a suspicion that the Condorcet theorem is limited in real cases, since much group decision-making is influenced by group biases (the most extreme members tend to sway opinion, group problem-solving works best for problems where a solution, once found, is obvious, groupthink, etc.).

      In this case math ability was merely a proxy for bias, although this particular example is very relevant: if you think A & B is as likely as A or B separately, you are suffering from the conjunction fallacy. In general, cognitive ability does reduce some biases (not all), and education also affects at least some deviations from economic rationality ( http://aida.econ.yale.edu/seminars/apmicro/am05/benjamin-050414.pdf )

    2. “Any given decision made by an MP is not made on his/her own. There are aides, advisors, civil servants and other MPs all providing input (and indeed the major policies they vote on aren’t really crafted by individual MPs on their own but instead by a large cadre of party officials). Because of this, the Condorcet jury theorem applies very well, I think.”

      Actually, CJT requires independence between voters, otherwise you don’t really get the benefits of the law of large numbers. Furthermore, since it is essentially no more than an application of the law of large numbers, it shows that if average competence is below one half (as seems to be the case here) a large group fares even worse than a random individual!

  2. Interesting post thanks, Anders. The inability of MPs to reason probabilistically could/would be a huge deal *if* policy followed anything like a rational actor model of decision-making. But there are lots of good reasons to expect that it doesn’t**. A huge amount of actual decision-making is about bargaining, which I expect they’re better at than probability theorists, and another huge part is coalition building. These are the somewhat oleaginous arts of politics, and though we might complain about them (easy enough from within our stable hierarchies) they’re actually the things that determine the scope of the art of the possible. I’m sure there are specific policy instances where probabilities matter – in many medical, economic and environmental cases, for instance – and this is just where we often see MPs hand the ball off to experts.

    Regarding your two options – reduce government or lobby for more rational processes – I think you’re right that these are often good ideas, but there’s always hostility from within the bureaucracy (and from Guardian readers) towards the first, and there’s hostility from campaigners (and from Guardian readers) towards the second. [examples of the second include GM foods and nuclear power, where a lot of highly informed opinion is sanguine about new technology, and points out that public perceptions of risk are badly out of kilter with true risk. This gets the response from NGO sorts that the experts are irrelevant because the people should be listened to… democracy is wonderful when it’s a tailwind for your cause.]

    **There’s lots of literature on how policy departs from rationality – Lindblom’s stuff on muddling through, Graham Allison’s beautiful case study of the Cuban missile crisis, etc.

    1. True that bargaining and coalition building are important skills. In fact, one reason we have the politicians we have is that learning those skills takes a lot of effort and training, and that means successful politicians are more likely to have spent their youth learning them rather than probability or other skills. The problem is that there seems to be little reason for them to go for truth-seeking policies if all the rewards (and their outlook) are aimed at coalition building, so although they could hand off policy to experts checking evidence and probability, they will be less motivated to do so than would be rational.

  3. ” Unfortunately there is no reason to assume civil servants are really rational with probability either, and if the policy has been set on irrational grounds – perhaps due to misunderstood figures – then even a rational implementation will be faulty.”

    This is a rather timely comment, given the cock-up of the West Coast Mainline decision.
