Asbestos kills more people per year than excessive sun exposure, yet it receives far less attention. Tom Douglas (Oxford Uehiro Centre for Practical Ethics) explains why asbestos remains a serious public health threat and what steps should be taken to reduce it. And yes, the snow in The Wizard of Oz was asbestos!
Science and medicine have done a lot for the world. Diseases have been eradicated, rockets have been sent to the moon, and convincing, causal explanations have been given for a whole range of formerly inscrutable phenomena. Notwithstanding recent concerns about sloppy research, small sample sizes, and challenges in replicating major findings (concerns I share and have written about at length), I still believe that the scientific method is the best available tool for getting at empirical truth. Or, to put it slightly differently (if I may paraphrase Winston Churchill’s famous remark about democracy): it is perhaps the worst tool, except for all the rest.
Scientists are people too
In other words, science is flawed. And scientists are people too. While it is true that most scientists — at least the ones I know and work with — are hell-bent on getting things right, they are not therefore immune from human foibles. If they want to keep their jobs, at least, they must contend with a perverse “publish or perish” incentive structure that tends to reward flashy findings and high-volume “productivity” over painstaking, reliable research. On top of that, they have reputations to defend, egos to protect, and grants to pursue. They get tired. They get overwhelmed. They don’t always check their references, or even read what they cite. They have cognitive and emotional limitations, not to mention biases, like everyone else.
At the same time, as the psychologist Gary Marcus has recently put it, “it is facile to dismiss science itself. The most careful scientists, and the best science journalists, realize that all science is provisional. There will always be things that we haven’t figured out yet, and even some that we get wrong.” But science is not just about conclusions, he argues, which are occasionally (or even frequently) incorrect. Instead, “It’s about a methodology for investigation, which includes, at its core, a relentless drive towards questioning that which came before.” You can both “love science,” he concludes, “and question it.”
I agree with Marcus. In fact, I agree with him so much that I would like to go a step further: if you love science, you had better question it, and question it well, so it can live up to its potential.
And it is with that in mind that I bring up the subject of bullshit.
Headlines such as these occur with monotonous regularity. Widespread asbestos use throughout much of the 20th century has ensured that the next contamination scandal is never far off, and asbestos-related legal decisions and personal tragedies often make the news as well. But despite the ongoing media attention, asbestos has not captured the public imagination as a public health threat, at least not in comparison with threats like excessive sun exposure and drink-driving.
Asbestos is a versatile fibrous mineral that can be cheaply mined and has unusual fire resistance and durability. Its use exploded in the twentieth century, when it was included in such diverse products as automobile brake linings, pipe insulation, ceiling and floor tiles, textured paints, concrete, mattresses, electric blankets, heaters, ironing boards and even piano felts. There is no known safe threshold for exposure to asbestos dust; even single exposures have been linked to cancer. Rates of asbestos-related cancer have recently been on the rise in Europe and Japan and look set to climb in many developing countries where asbestos is still widely used, often without safety precautions. According to WHO estimates, asbestos now causes more deaths globally than excessive sun exposure. In the UK it is estimated to cause almost three times as many deaths as road traffic accidents.
In the strange, upside-down world of the Southern Hemisphere, cold and gloomy Winter is quietly slinking away, and raucous Spring in all his glory begins to stir. Ah, Spring! The season of buds and blooms and frolicking wildlife. One rare species of wildlife, however, finds itself subject to an open hunting season this Spring – the anti-vaxxer.
In April this year, the Australian Federal Government announced a so-called “no jab, no pay” policy. Families whose children are not fully vaccinated will now lose childcare subsidies and rebates worth up to AUD$20,000 per child, unless there are valid medical reasons (e.g. allergies). Previously, exemptions had also been made for conscientious and religious objectors, but these have now been abolished.
Taking things a step further, the Victorian State Government earlier this week announced an additional “no jab, no play” policy. Children who are not fully vaccinated, except once again for valid medical reasons, will additionally now be barred from preschool facilities such as childcare and kindergartens.
I should, at this point, declare my allegiances: as a final-year medical student, I am utterly convinced by the body of scientific evidence supporting the benefits of childhood vaccination. I am confident that these vaccines, while posing a very, very small risk of severe side-effects, like any other medicine, reliably prevent or markedly reduce the risk of contracting equally severe diseases. And finally, I believe that the goal of universal childhood vaccination is one worth pursuing, and is immensely beneficial to public health.
Despite my convictions, however, I still find myself wondering whether the increasingly strict vaccination regime in Australia, with its ever-increasing punishments for anti-vaxxers, is really the best means of achieving a worthy goal. It is not clear to me that the recent escalation will have significant positive effects beyond those of a mere political stunt.
Many people are suspicious of being manipulated in their emotions, thoughts or behaviour by external influences, be they drugs or advertising. However, it seems that, unbeknownst to most of us, a considerable number of foreign entities exist within our own bodies. These entities can change our psychology to a surprisingly large degree. And they pursue their own interests, which do not necessarily coincide with ours.
Not long ago the UK introduced an NHS surcharge: an extra fee that must be paid up front by non-EEA nationals applying for leave to remain in the UK (nationals of Australia and New Zealand are also exempt). It costs £200 per person per year. So, for example, if you are applying for a 3-year work visa and you have a family of three, you must pay £1,800 to cover the surcharge for you and your family (on top of other visa costs).
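The arithmetic behind that example can be sketched as follows, assuming a flat rate of £200 per person per visa year, paid up front (the function is a hypothetical illustration, not an official calculator):

```python
# Sketch of the NHS surcharge arithmetic described above.
# Assumes a flat rate of 200 GBP per person per visa year,
# all payable up front at the time of application.
def nhs_surcharge(people: int, years: int, rate_per_year: int = 200) -> int:
    """Total up-front surcharge in GBP for a group applying together."""
    return people * years * rate_per_year

# The example from the text: a family of three on a 3-year work visa.
print(nhs_surcharge(people=3, years=3))  # prints 1800
```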
It is difficult to find much public discussion of this surcharge in the UK, beyond a few recent articles noting that it is unlikely to do what we were told it would do, namely benefit the NHS.
Is the surcharge a just policy?
A recent series of papers has constructed a biochemical pathway that allows yeast to produce opiates. It is not quite a sugar-to-heroin home brew yet, but putting the pieces together looks fairly doable in the very near term. I think I called this development almost exactly five years ago on this blog.
People, including the involved researchers, are concerned and think regulation is needed. It is an interesting case of dual-use biotechnology. While making opiates may be somewhat less frightening than making pathogens, it is still a problematic use of biotechnology: millions of people are addicted, and making it easier for them to get access would worsen the problem. Or would it?
Let us suppose we have a treatment and we want to find out whether it works. Call this treatment drug X. While we have observational data that it works, that is, patients say it works, or it appears to work given certain tests, observational data can be misleading. As Edzard Ernst writes:
Whenever a patient or a group of patients receive a medical treatment and subsequently experience improvements, we automatically assume that the improvement was caused by the intervention. This logical fallacy can be very misleading […] Of course, it could be the treatment—but there are many other possibilities as well.
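One of those other possibilities, regression to the mean, can be illustrated with a toy simulation (all numbers here are hypothetical, chosen only for illustration): patients tend to seek treatment when a noisy symptom score is at a peak, so most appear to "improve" afterwards even when drug X does nothing at all.

```python
# Toy simulation of regression to the mean (hypothetical numbers).
# The underlying illness is stable; only the measured symptom score
# fluctuates. Patients get "treated" when their score peaks, and the
# treatment has zero effect, yet most of them appear to improve.
import random

random.seed(0)
improved = treated = 0
for _ in range(100_000):
    true_severity = random.gauss(5, 1)            # stable underlying illness
    before = true_severity + random.gauss(0, 2)   # noisy symptom reading
    if before > 8:                                # seek treatment at a peak
        treated += 1
        after = true_severity + random.gauss(0, 2)  # drug X does nothing
        if after < before:
            improved += 1

print(f"{improved / treated:.0%} of treated patients 'improved'")
```

With these parameters the large majority of treated patients record a lower score afterwards, despite a treatment with no effect whatsoever, which is exactly why controlled comparisons are needed.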