
Should bio-scientists think about bio-weapons?

Following the September 11 attacks and the subsequent anthrax attacks, the US began introducing new biosecurity regulations as a counter to bioterrorism. The centrepiece of the new regulatory framework has been a list of 'select agents' – pathogens with particular potential for use in weapons of mass destruction. Agents on the list are subject to special regulatory measures limiting how they can be stored, transported and used.

Last week, the Proceedings of the National Academy of Sciences published an analysis of the effects of the new regulations. The authors estimate that, over the relevant period, there has been a two- to five-fold decrease in the ratio of scientific progress to funding for research on select agents. Picking up the story, an article in The Scientist magazine claims that the apparent loss of efficiency is due to the chilling effect of the new regulations on research (though see the comments for some alternative explanations). It quotes scientists bemoaning the huge amount of paperwork imposed by the regulations and noting the difficulties that they create for international collaboration and, given the need for extensive background checks and psychological testing, staff recruitment.

It's interesting to consider the extent to which The Scientist's complaints (and scientists' worries more generally) are an objection to the way that biosecurity is being done, or to the very idea of biosecurity.

No doubt there are genuine concerns with the way biosecurity has been done. It's widely believed, for example, that the regulatory reaction following 2001 was both excessively restrictive and excessively focused on bio-terrorism rather than on state-sponsored biological warfare, which many regard as a much more serious threat. But one suspects that even if more finely grained, better-focused biosecurity regulations had been introduced in a cautious and responsive way, there would still have been a fair amount of resistance from the scientific community. Biosecurity regulations applied to militaries, industry or agriculture might leave science pretty much unscathed. But biosecurity applied to science itself can really only work (by which I mean, reduce the risk of biological attack) by directly or indirectly influencing what kind of scientific work gets done (for example, through restricting funding to 'high risk' areas of research), by limiting the dissemination of scientific knowledge (for example, by restricting publication of 'high risk' research), or by restricting who gets access to the risky pathogens (for example, through security checks on lab personnel). All of these are in tension with strong norms of the life sciences. The first impinges on freedom of inquiry, the second on freedom of expression, and the third on a more general scientific ethos of openness and inclusiveness. We might thus expect any form of biosecurity to provoke significant opposition from life scientists. Biosecurity means violating or suspending widely held professional norms, in much the same way that euthanasia does in medicine.

Regulators and others who support active biosecurity measures thus seem to face a choice between accepting ongoing discontent from scientists, or attempting to change the norms of the life sciences, for example, by questioning those who hold that rights to freedom of inquiry and expression admit of no exception. It's tempting to favour the second approach on the ground that it will ultimately foster a better and more stable relationship between regulators and scientists, and that it will relieve some of the burden on regulators: if scientists themselves are thinking about biosecurity, then there'll be less need for aggressive regulation. But there's something to be said for the first option as well. Tension between regulators and the regulated is not always a bad thing. After all, sometimes there are good reasons for regulators to adopt different norms from the regulated. For example, some believe that economies function best when individual economic agents follow norms of aggressive profit/utility maximization while regulators look out for the social good. Similarly, some believe that doctors should follow the norm 'always do the best for the patient in front of you, without regard to the costs for society', while hospital administrators, insurance companies or public healthcare providers keep a lid on healthcare costs. Perhaps there are good reasons to accept a similar 'division of labour' in the sciences. On this view, scientists should perhaps adopt norms that recommend relentless curiosity and the aggressive pursuit of truth regardless of the social consequences, while regulators concern themselves with issues like the prevention of biological warfare.

This approach will only work, however, if effective biosecurity regulation is possible even without the co-operation of the regulated. It's far from clear that this is the case. It must be extremely difficult for regulators to find individuals with 'terrorist tendencies' working in laboratories handling potentially dangerous pathogens; it may be much easier for scientists themselves to identify such individuals. If it's right that effective biosecurity without the co-operation of scientists is difficult or impossible, there seems to be a strong case for attempting to change scientific norms, for example, by seeking to stimulate thinking by scientists about the security implications of certain pathogens, or certain types of scientific knowledge. Many of those working in one new life sciences discipline, synthetic biology, have been quite willing to start thinking about these issues. And in the physical sciences, norms certainly changed away from 'blind openness' (perhaps too far) following the advent of nuclear fission. This gives us some reason to suppose that norms could change in other disciplines too.

REFERENCES:

Dias et al. Effects of the USA PATRIOT Act and the 2002 Bioterrorism Preparedness Act on select agent research in the United States. Proceedings of the National Academy of Sciences (2010).

Grant, Bob. Biosecurity laws hobble research. The Scientist, 10 May 2010.
