Lie detectors and epistemic duty

The British government is about to introduce compulsory lie detector tests for sex offenders released on parole. The British police want to use lie detectors in the detection of crime. Is this the right thing to do?

The answer to that question depends on a complex set of duties. Obviously it is highly desirable to prevent people on parole from committing crimes. Equally obviously it is highly desirable to catch criminals (only when the laws are just, of course). For this reason we are tempted to latch onto anything that promises improvements. The government, and the pilot study on which their proposal is based, are promising us improvements from using lie detectors. Others elsewhere have considered whether lie detectors do in fact offer better outcomes. Even if lie detectors do offer better outcomes, which they might do whether or not they are good at detecting lies, that still wouldn’t make it right to use them.

A highly significant duty that bears on the question is the epistemic duty borne by the law and by the police, the epistemic duty to know the truth of guilt and innocence. Here I am concerned only with that duty. I do not consider other duties, such as protection from harm, that might be served by deterrent effects achieved independently of knowing the truth.

Knowing the truth requires using reliable methods. Consequently the law and the police are required to use reliable methods, both to detect the guilty and not falsely accuse the innocent. So, does the lie detector satisfy this necessary condition on its use?

Contrary to the impression being given in the papers, scientists do not regard the lie detector as part of forensic science:

[Although it] may be useful as an investigative aid and tool to induce confessions, it does not pass muster as a scientifically credible test (Iacono, “Forensic ‘Lie Detection’: Procedures Without Scientific Basis”, Journal of Forensic Psychology Practice)

Even the US Supreme Court agreed on this:

Unlike other expert witnesses who testify about factual matters outside the jurors’ knowledge, such as the analysis of fingerprints, ballistics, or DNA found at a crime scene, a polygraph expert can supply the jury only with another opinion. (United States v. Scheffer, 1998)

The reasons for this are straightforward. The lie detector doesn’t detect lies. It merely detects changes in pulse, blood pressure, respiration and skin conductivity and prints them out as wiggly lines on graph paper. Certainly those wiggly lines are objective evidence of a scientific kind. They are evidence of physiological arousal. But are they objective evidence of lies?

The changes detected by the lie detector are brought about by many circumstances in a person’s life, quite independently of any questions they are being asked. Furthermore, they are easily manipulable by the testee. Consequently the base evidence is unreliable and thoroughly corruptible.

The detection of lies depends entirely on the operator’s interpretation of those changes, based on their timing and the questions the operator asked the testee. This matter of interpretation is not quantifiable, and consequently the detection of lies is a purely subjective matter. Not only is it purely subjective, but, just as we find in any such matter of subjective judgement, there is significant variance from one operator to another.

The testing method itself depends on deceit. Firstly, the testee is told that the detector can detect lies when it cannot. Secondly, the testing method is not confined to the administration of test questions while physiological arousal is being measured, but can extend to further questioning intended to elicit admissions. This last is simply interrogation, yet its outcome will be presented to a wider audience as the outcome of a scientific test.

Attempts to measure reliability were reviewed by the British Psychological Society, whose working committee reported:

There is reasonable agreement between the reviews regarding guilty suspects. Correct classifications were made in 83 per cent to 89 per cent of the cases, whereas incorrect decisions (classifying a guilty suspect as innocent) were made in 10 per cent to 17 per cent of the cases. However, there is less agreement amongst the reviews regarding innocent suspects. Nevertheless, the findings for innocent suspects are less encouraging than for guilty suspects. Depending on the review, between 53 per cent and 78 per cent of innocent suspects were correctly classified and between 11 per cent and 47 per cent were incorrectly classified. (BPS Working Party report 2004)

Despite the apparent strength of an ‘83% to 89%’ success rate, to anyone familiar with statistics these results are not at all encouraging. The first issue is that this is giving us the wrong conditional probability. It is telling us the probability of a positive test result given guilt, whereas we want the inverse probability, the probability of guilt given a positive test result. Just so you can be clear how deceptive this can be, it is possible to have a test with P(positive|guilt) = 99% for which P(guilt|positive) = 1%. So it is possible for the test to correctly detect 99 guilty people out of 100 guilty people and yet to falsely accuse 99 innocent people for each guilty person it correctly identifies. This sounds so counterintuitive as to be mathematically impossible, but consider this example: a lie detector that called everything a lie. It would correctly detect 100 liars out of 100 liars, so would be perfectly reliable in that respect. Unfortunately it would also falsely accuse every truth teller. In a population in which 1 person in 100 is a liar, P(positive|liar) = 100% and P(liar|positive) = 1%. This problem was one of the besetting sins accompanying the introduction of DNA evidence in court, and failure to attend to it is called the prosecutor’s fallacy. In this case we have reports of false positive rates as high as 47%, hardly encouraging.
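To make the arithmetic concrete, here is a minimal sketch of the Bayes’ theorem calculation behind the point above. The base rates are hypothetical; the 86% hit rate and 47% false positive rate are simply illustrative values drawn from the range of figures quoted from the BPS report.

```python
# Minimal Bayes' theorem sketch: how P(liar | positive) depends on the base
# rate of liars, given a test's hit rate and false positive rate.
# Base rates are hypothetical; the 0.86 and 0.47 figures are illustrative
# values taken from the ranges quoted in the BPS report above.

def p_liar_given_positive(base_rate, p_pos_given_liar, p_pos_given_truthful):
    """Posterior probability of lying, given a 'deceptive' chart reading."""
    p_positive = (p_pos_given_liar * base_rate
                  + p_pos_given_truthful * (1 - base_rate))
    return p_pos_given_liar * base_rate / p_positive

# The degenerate detector that calls everything a lie, in a population
# where 1 person in 100 is a liar: P(positive|liar) = 100%, yet only
# 1% of positives are actually liars.
print(p_liar_given_positive(0.01, 1.0, 1.0))    # 0.01

# A detector with an 86% hit rate and a 47% false positive rate, applied
# where 1 in 10 subjects is actually lying: most accusations are false.
print(p_liar_given_positive(0.10, 0.86, 0.47))  # ~0.17
```

The second call shows that even with the flattering hit rate, a 47% false positive rate means only around one in six ‘deceptive’ verdicts would pick out an actual liar at that base rate.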

A further problem with the studies reviewed by the working party is how well they measure what we are interested in. The experimental studies share psychology’s general failure to use single or double blinding. There is also the question of how good a model the experiment is for the kind of lies we are claiming to detect: we have mock crimes, i.e. acts that are not crimes, and subjects who either did or did not do the act and are being tested for guilt. In the field studies we run into notorious difficulties such as the impossibility of single or double blinding, biased interpretation, and the lack of independence of what is being tested from what is counted as the truth of the situation (confessions are not independent of lie detector testing: if you pass the test you are less likely to confess even if you are guilty, which inflates the apparent hit rate, as the sketch below illustrates). For these reasons the outcomes of the studies are generally unreliable and give only weak evidence of the true rates.
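To illustrate that last point, here is a small simulation of confession-confirmation bias. All the rates are made up for illustration; the only assumption doing the work is the one stated in the text, that guilty suspects who fail the chart are more likely to confess, so the ‘confirmed guilty’ cases a field study can score are biased towards the test’s successes.

```python
import random

# Hypothetical simulation of confession-confirmation bias in polygraph
# field studies. All rates below are invented for illustration only.
random.seed(0)

TRUE_HIT_RATE = 0.70          # assumed true P(fail chart | guilty)
P_CONFESS_IF_FAILED = 0.50    # guilty suspects who failed the chart and confess
P_CONFESS_IF_PASSED = 0.10    # guilty suspects who passed the chart and confess
N_GUILTY = 100_000

confirmed_failed = 0   # confessed cases where the chart said 'deceptive'
confirmed_total = 0    # all confessed, i.e. 'confirmed guilty', cases

for _ in range(N_GUILTY):
    failed_chart = random.random() < TRUE_HIT_RATE
    p_confess = P_CONFESS_IF_FAILED if failed_chart else P_CONFESS_IF_PASSED
    if random.random() < p_confess:
        confirmed_total += 1
        confirmed_failed += failed_chart

# The field study only ever sees the confirmed cases, so the hit rate it
# estimates is inflated well above the true 70%.
print(f"true hit rate:     {TRUE_HIT_RATE:.2f}")
print(f"apparent hit rate: {confirmed_failed / confirmed_total:.2f}")  # ~0.92
```

With these made-up numbers, a test whose true hit rate is 70% appears to score over 90% once confessions are used as the ground truth, which is exactly the kind of inflation that makes field-study figures weak evidence of the true rates.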

On this basis I think that for the law and the police to use the lie detector would be a clear derogation of their epistemic duty. Given the significance of that duty for the law and the police, I think it is highly unlikely to be outweighed by the other duties that bear on the question, and consequently the proposed use of lie detectors is very probably immoral.

4 Comments on this post

  1. Taking a single dose of an anti-anxiety pill will even normalize some of the physiological fluctuations that signify “lying.”

  2. The “pipeline to truth” effect is why so many authorities love polygraphs: if the subject believes the machine will correctly detect deception, they will be more likely to confess. But this means that systematic use of polygraphs *requires* spreading misconceptions, making deceit (or at least over-selling, since many people in law enforcement also hold naive views of the technology) towards people in a particularly vulnerable situation part of standard procedure. It also means that spreading true information about the epistemic validity of polygraphs will interfere with the legal procedures: police could argue that telling people the facts would impede many investigations. It hence seems that accepting current deception detection systems might undermine the epistemic quality of society.

  3. Anthony Drinkwater

    Thank you, Dr Shackel, for your post, with which I agree entirely!
    I would suggest that the reason why at least some of us object to the spreading of misconceptions is that it treats us as children rather than adults. And that this is based on an absolute value: regardless of whether there is a net benefit to universal well-being or not, infantilising (dumbing down, some might say) is wrong.

  4. Yes, I think these comments are correct. If, as Anders suggests, a necessary condition on the lie detector being reliable is that there are widespread false beliefs about it then its success requires an epistemic corruption of society which is a further strike against it. Mr Drinkwater’s point is that creating misconception requires government paternalism. For some reason government paternalism is popular but its popularity is in turn based on a false belief: that the government can look after you better than you can. So not only is the government paternalism required by the lie detector a social corruption in itself but the acceptance of government paternalism also depends on a further epistemic corruption. It just gets worse and worse!
