
Evidence on Evidence Based Policy

You were no doubt as surprised as I was when the Blair government announced it was henceforth doing evidence based policy. It was just like when the medical profession said it was going to do evidence based medicine. You mean—they weren’t already? Still, even though the promised reform doesn’t really sweeten the bitter truth, it is a move in the right direction. Or at least, it is a promise to make a move in the right direction. But was the move made? When it comes to the, perhaps minor, example of speed cameras, we have clear evidence of whether the policy was evidence based. And the evidence says, No!

According to official figures today, ‘Speed cameras have failed to cut accidents on many roads and have actually led to a rise in casualties on some routes’ (http://www.telegraph.co.uk/motoring/news/8719263/Speed-cameras-fail-to-cut-accidents.html).

In fact, this shouldn’t be a surprise at all. This evidence has been around for a long time. All the common errors and biases in thinking have been deployed to justify speed cameras, from confirmation bias to ignoring base rates. Statisticians of many stripes have pointed out, on many occasions, the fallacious statistical thinking used to justify their introduction. For example, the claim that speed cameras cut accidents was based on noting that there were fewer accidents at a camera site in the year after it was introduced than in the year before. Sound convincing? It shouldn’t. If you examine the places that had unusually many accidents last year, nearly all of them had fewer this year. This is called regression to the mean. The very incidents used to justify introducing a camera are highly likely to be a random coincidence. Taking this to be evidence is like a lousy archer shooting arrows at a barn and then painting his target round a lucky cluster.
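
To see the effect concretely, here is a minimal simulation sketch (the accident rate, the number of sites and the selection threshold are all invented for illustration, not taken from any real data). Every site has exactly the same underlying accident rate, yet the sites singled out for a bad year still ‘improve’ the following year with no intervention at all.

    import numpy as np

    rng = np.random.default_rng(0)

    n_sites = 10_000
    true_rate = 3.0  # same long-run accident rate at every site (invented number)

    year1 = rng.poisson(true_rate, n_sites)
    year2 = rng.poisson(true_rate, n_sites)  # no cameras installed, nothing changes

    # "Camera sites": locations selected only because their year-1 count looked alarming
    selected = year1 >= 7

    print("selected sites, mean accidents in year 1:", year1[selected].mean())
    print("selected sites, mean accidents in year 2:", year2[selected].mean())
    # The year-2 mean falls back towards 3.0 purely by chance: regression to
    # the mean, not a safety effect.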

A further symptom of evidence based policy would be making it possible to examine systematically whether a policy is succeeding, by publishing all the relevant data. In this case that would not be hard to do, but the fact is that councils are not publishing the data and will not give it to you if you ask—just as bad as the climate scientists who refused to share their data with critics because those critics were trying to prove them wrong! Apparently, the Department for Transport will ‘now conduct a detailed statistical analysis of the data to assess the effectiveness of speed cameras in improving road safety’. So only 19 years after speed cameras were introduced, and only 17 years after independent statisticians demonstrated the inadequacy of the claims that they improve safety, the DfT gets round to checking whether they were a good idea.

Perhaps the most worrying fact is that it is probably naïve to expect politicians to institute real evidence based policy and most especially to expect them to institute evidence based assessment of the success of their policies. Most of the benefit to a politician of a policy is in the announcement and the nice title for the law—if it weren’t so obviously bogus I’m sure we would have the ‘making everything good’ law. Provided the policy sounds like it fits the voters’ prejudices, they will re-elect him, so there is little additional benefit in showing that it worked but a big cost if the evidence shows it didn’t. Or perhaps you thought politicians were motivated by more noble sentiments than the rest of us?


7 Comments on this post

  1. This ramshackle collection of data is not good-quality evidence because it has not been randomised. To trial this properly, you’d need to identify sites where speed cameras were deemed useful, but only deploy them at half of those locations, and compare accident rates at sites with and without cameras. Without this crucial randomisation, correlation and causation cannot be disentangled: even if the non-reduction in accidents turns out to be statistically significant nationwide, it could simply be that the kinds of places where speed cameras have been placed are the kinds of places where accidents would have increased anyway.
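
    A minimal sketch of that design (the site count, accident rate and flagging threshold below are invented purely for illustration): randomise the flagged sites into a camera arm and a control arm; both arms then regress to the mean by the same amount, so the difference between the arms isolates whatever the cameras actually do.

        import numpy as np

        rng = np.random.default_rng(1)

        n_sites = 20_000
        rate = 3.0  # identical underlying accident rate everywhere (invented)
        year1 = rng.poisson(rate, n_sites)

        # Sites flagged for a camera because their year-1 count looked high
        flagged = np.flatnonzero(year1 >= 7)

        # Randomise: cameras at half the flagged sites, the rest kept as controls
        rng.shuffle(flagged)
        camera, control = flagged[: len(flagged) // 2], flagged[len(flagged) // 2 :]

        year2 = rng.poisson(rate, n_sites)  # cameras assumed here to do nothing

        # Both arms fall back towards 3.0, so the camera-minus-control difference
        # (roughly zero) is the camera effect; a before/after comparison at the
        # camera sites alone would credit the cameras with the whole drop.
        print("camera sites, year-2 mean: ", year2[camera].mean())
        print("control sites, year-2 mean:", year2[control].mean())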

  2. In addition to what Andrew said, even if there were proven to be no effect on the number of collisions, speed cameras might be justified: cameras might still have reduced the death and serious injury rates by converting high-speed impacts into low-speed impacts. The recently published data on this goes ignored by the author here and by the journalist.

    And there's no justification at all for the article's headline claim that "Speed cameras have … led to a rise in casualties on some routes, official figures show," which is repeated by the author here. The figures cited do not indicate any causal relationship!

  3. Also, contra the nonsense published by the Telegraph, the best extant evidence is that speed cameras are effective. Here's a gold standard Cochrane meta-analysis (that is: a review of published, peer-reviewed studies):
    http://onlinelibrary.wiley.com/doi/10.1002/14651858.CD004607.pub4/abstract
    Summary:
    "The consistency of reported reductions in speed and crash outcomes across all studies show that speed cameras are a worthwhile intervention for reducing the number of road traffic injuries and deaths. However, whilst the the evidence base clearly demonstrates a positive direction in the effect, an overall magnitude of this effect is currently not deducible due to heterogeneity and lack of methodological rigour. More studies of a scientifically rigorous and homogenous nature are necessary, to provide the answer to the magnitude of effect."

  4. Well, my point is about evidence based policy in general, and speed cameras are just an example of a policy that has been pursued with very little attention to the evidence, very little attention to getting evidence, and with politicians’ claims about safety grounded in known fallacious forms of thinking. I can see that I might have been thought to endorse the Telegraph’s headline. It wouldn’t surprise me if it is true, but it wouldn’t surprise me if it is false either, because claims to the contrary have often been based on the fallacies I mentioned, and this claim may be too. But what I mainly meant shouldn’t be a surprise is to learn that the evidence on evidence based policy is that not much evidence has got into the base. This has been evident for some time, and I continue to think that the case of speed cameras is an excellent example of it.
    I don’t disagree with Mr Steele’s remarks, since they are merely a reiteration of criticisms of politicians’ and governmental claims about speed cameras that statisticians have made all along. Simon is dismissive of the Telegraph headline, but doesn’t seem to have noticed that the claims by politicians that speed cameras reduce casualties have been just as shakily based as the headline to which he takes such exception—but then it is so hard to spot when politicians pander to prejudices of which we approve.
    If the mentioned study is, as Simon says, the best extant evidence, then the best extant evidence is not very good. The summary quoted is inconsistent, since unless you know the magnitude of an effect and the costs of an intervention (including here the costs of all the lengthened journey times) you can’t know whether it is a worthwhile intervention.
    The summary is also misleading. Take ‘Excessive speed (driving faster than the posted limit or too fast for the prevailing conditions) has been found to contribute to a substantial number of crashes.’ That excessive speed defined as ‘too fast for the prevailing conditions’ has been found to contribute to a substantial number of crashes is a merely analytic truth, since here the meaning of ‘too fast…etc’ includes such contributions. So that claim has no empirical content, and its use here is nothing more than to insinuate what hasn’t been shown about driving faster than a posted limit. Such techniques belong to the politician’s propaganda armoury. To find them used in a purportedly peer-reviewed journal is shocking.
    I don’t have time to go over the studies on which this study is based, but it is worth noting that after 20 years they could find only 28 studies that they could include, and they acknowledge that the quality of those studies was ‘moderate at best’. Nothing they say in the summary shows that the 28 studies on crashes they cite aren’t based on data that is regressing to the mean. Finally, their methodology is not (contrary to what Simon says) a meta-analysis because, as they say, ‘due to considerable heterogeneity between and within included studies, a meta-analysis was not appropriate’.

  5. "Perhaps the most worrying fact is that it is probably naïve to expect politicians to institute real evidence based policy and most especially to expect them to institute evidence based assessment of the success of their policies."

    The civil servants I know would by and large agree that they skimp on the review or evaluation phase of the policy cycle. Basically, once a policy is in place, none of those involved in its design or implementation have many incentives to revisit it; data comes in gradually ("the policy will work, but we expect some adjustment costs while it beds in"), the civil service and govt have lots more work to get through, and of course those associated with designing, selling and implementing policies are not necessarily going to be cool about sober ex-post assessment of those policies.

    Given the current enthusiasm for independent institutions, I've often wondered if it would be interesting to have an Independent Commission for Policy Evaluation, populated by wise old heads from the civil service and folks who are good with data. They'd conduct forensic evaluations of policy and report their findings. It wouldn't solve the generic problems in "evidence" and policy but it might be a valuable way of compensating for a predictable pathology in the current system.

  6. Fair enough, Nick, I shouldn't have called the Cochrane study a meta-analysis. It was a systematic review. That said, you wrote: "Nothing they say in the summary shows that the 28 studies on crashes they cite aren’t based on data that is regressing to the mean."
    Since when was it good epistemic practice to criticize an article based only on what's (not) in the summary? Here's part of what they say in the article about RTM: "in regards to risk of bias and potential confounders, our main focus was on three variables known to be particularly important in road safety evaluations i.e. regression to the mean, long term trends and changes in traffic volumes … We checked whether studies had controlled for regression to the mean (RTM) and for those which did not take it into account, we assessed where possible, if it was likely to be an important source of bias. Only ten studies out of a probable twenty four studies, where RTM may have been a sizeable factor, either described and/ or controlled for its effects."
    At the end of the day, the authors of the Cochrane study are careful and methodical – and they agree with you that there is a need for better quality evidence. It's just a shame that people like the Telegraph editors present absence of good quality evidence as if it were evidence of the opposite (and by extension yourself, by clearly endorsing their headline claim as you did in your post – I don't know how else to interpret writing of the form, e.g. "'The holocaust never happened' (Sunday Sport, 2011) In fact, this shouldn't be a surprise. The evidence has been around for a long time.")
    Finally, it's simply not true, as you now suggest in your comments, that the claim that speed cameras reduce casualties is "as shakily based" as the claim that they have led to a rise in casualties on some routes. Where is there any evidence (beyond Telegraph headlines, that is) for the latter claim?

  7. The elephant in the room on this thread is the assumption that evidence-based policy is necessarily what we should be aiming for. That politicians are motivated to a significant (though not exclusive; some of them actually care about their legacy) extent by "the announcement and the nice title" is only "worrying" if we've previously been stuck in some absurdly naïve bubble of wishful thinking (and thought that Yes Minister was a joke). It's just a fact of life.

    What makes this OK (ish) are the checks and balances contained in modern democracies: the balance between political accountability and institutional independence (at least one can hope that short-termist politicians and self-serving mandarins will sometimes cancel each other out), and a degree of public scrutiny (but don't expect full transparency; that's also naïve and often counter-productive).

    But the real problem with the concept of "evidence-based policy" is the implicit moral realist assumption that if we only knew all the facts, we would know what to do. No. First we have to know what we are aiming for. Then the facts (i.e. evidence) can tell us how we are likely to get there. Furthermore, our brains process vast amounts of information, performing cost-benefit analyses against objectives of which we are usually not consciously aware. This is what determines people's actual behaviour. The role of conscious, public policy is to correct some of the more obvious biases and undesirable effects that result from that behaviour.

    None of this is to say that evidence is unimportant, but policy needs to be based on more than just evidence. It needs to be based on clear objectives, and sensitivity to the public's wishes even when they appear irrational or ill-informed. Evidence needs to enter at some point, but successful policy will more likely result from trial and error based around honest objectives than sophisticated analysis that provides a spurious sense of certainty.

    The above discussion of whether speed cameras do or don't reduce accident rates where they are installed is a good example of how an obsessive focus on a particular type of "evidence" can completely miss the point. Many people feel that breaking the law should be punished (except of course when we are the ones who have broken the law; then we come up with all sorts of excuses). Either have speeding laws and enforce them, or just turn them into recommendations and hope for the best. Speed cameras are more than anything else an anti-cheat device. All societies have anti-cheat devices; they are part of the glue that binds them together. Whether cameras also have the effect of reducing accident rates where they are installed is a relevant but to some extent marginal consideration.
