I’m working on a paper entitled ‘Cyborg justice: punishment in the age of transformative technology’ with my colleagues Anders Sandberg and Hannah Maslen. In it, we consider how punishment practices might change as technology advances, and what ethical issues might arise. The paper grew out of a blog post I wrote last year at Practical Ethics, a version of which was published as an article in Slate. A few months ago, Ross Andersen from the brilliant online magazine Aeon interviewed Anders, Hannah, and me, and the interview was published earlier this month. Versions of the story quickly appeared in various sources, beginning with a predictably inept effort in the Daily Mail, followed by articles in The Telegraph, Huffington Post, Gawker, Boing Boing, and elsewhere. The interview also sparked debate in the blogosphere, including posts by Daily Nous, Polaris Koi, The Good Men Project, Filip Spagnoli, Brian Leiter, Rogue Priest, Luke Davies, and Ari Kohen, as well as comments and questions on Twitter and on my website. I’ve also received, by email, many comments, questions, and requests for further interviews and media appearances. These arrived at a time when I was travelling and lacked regular email access, and I have yet to reply to most of them. Apologies if you’re one of the people waiting for a reply.
I’m very happy to have started a debate on this topic, although less happy to have received a lot of negative attention based on a misunderstanding of my views on punishment and my reasons for being interested in this topic. I respond to the most common questions and concerns below. Feel free to leave a comment if there’s something important that I haven’t covered.
There are reports in the press this week that the remains of 86 fetuses were kept in a UK hospital mortuary for months or even years longer than they should have been. The majority were fetuses of less than 12 weeks’ gestation. According to the report, this arose because of administrative error and a failure to obtain the necessary permissions for cremation.
The hospital has publicly apologized and set up an enquiry into the error. It is planning to cremate the remaining fetuses. However, it has decided not to contact all of the families and women whose fetal remains were kept, on the grounds that doing so would likely cause greater distress.
Is this the right approach? Guidelines and teaching in medical schools encourage health-care professionals and institutions to own up to their errors and disclose them to patients. Is it justifiable then to not reveal errors on the grounds that this would be too upsetting? How much transparency is desirable in healthcare?
Public opinion and governments wrestle with a difficult problem: whether or not to intervene in Syria. The standard arguments are well known – just war theory, humanitarian protection of civilian populations, the Westphalian right of states to non-intervention, the risk of quagmires, deterrence against chemical weapons use… But the news that an American group has successfully 3D printed a working handgun may put a new perspective on things.
Why? It’s not as if there’s a lack of guns in the world – either in the US or in Syria – so a barely working weapon, built from still-uncommon technology, is hardly going to upset any balance of power. But that may just be the beginning. As 3D printing technology gets better, as private micro-manufacturing improves (possibly all the way to Drexlerian nanotechnology), the range of weapons that can be privately produced increases. This type of manufacturing could be small scale, using little but raw material, and be very fast paced. We may reach a situation where any medium-sized organisation (a small country, a corporation, a town) could build an entire weapons arsenal in the blink of an eye: 20,000 combat drones, say, and 10,000 cruise missiles, all within a single day. All that you’d need are the plans, cheap raw materials, and a small factory floor.
Andrew Hessel, Marc Goodman and Steven Kotler sketch, in an article in The Atlantic, a not-too-far future in which the combination of cheap bioengineering, synthetic biology and crowdsourced problem solving allows not just personalised medicine, but also personalised biowarfare. They dramatize it by showing how this could be used to attack the US president, but that is mostly for effect: this kind of technology could in principle be targeted at anyone or any group, as long as there existed someone with a reason to use it and the resources to pay for it. The Secret Service appears to be aware of the problem and does its best to sweep away traces of the President, but it is hard to imagine this being perfect, working for old DNA left behind years ago, or being applied by all potential targets. In fact, it looks like the US government is keen on collecting not just biometric data, but DNA from foreign potentates. They might be friends right now, but who knows in ten years…
Professor Paul Keim, who chairs the US National Science Advisory Board for Biosecurity, recently recommended the censoring of research that described the mutations which led to the transformation of the H5N1 bird-flu virus into a form that can be transmitted between humans through droplets in breath (in ferrets, the number of mutations required is frighteningly small – five). His reason is simple: the research would be a recipe book for bioterrorists.
Keim thinks, however, that such censorship will only delay the inevitable. The information will come out sooner or later, but at least governments might by then have developed and prepared sufficient stocks of vaccine and set in place other emergency measures to deal with a global pandemic.
This is not quite closing the stable door after the horse has bolted. It’s more like closing the farm gate, in the knowledge that eventually the horse will jump the gate and escape.
But this raises the question of why the stable door wasn’t bolted in the first place. In an article in Nature, the leader of one of the teams has said that the research was necessary to show that those experts who doubt the human transmissibility of H5N1 are wrong. But given that there is controversy here, governments should of course be doing what they have been doing: treating the possibility as a serious risk. In response to the charge that the research is dangerous, this same research leader’s response is that there is already a threat of mutation in nature. But threats don’t cancel one another, and nature is not revealing its secrets to bioterrorists. The researchers claim that their research was necessary for the development of a vaccine. Keim’s view is that this is quite implausible, since the drugs the scientists were using against their virus were the same ones used against others. If he’s right, a natural conclusion to draw is that the scientists should never have done the research in the first place. And, having done it, they should have kept quiet about its details and destroyed the virus. They might indeed have informed the media of their overall result, or some carefully restricted set of other researchers of the details of their research. But then of course they wouldn’t have been able to publish those details in top scientific journals.
It was probably hard for the US National Science Advisory Board for Biosecurity (NSABB) to avoid getting plenty of coal in its Christmas stockings this year, sent from various parties who felt NSABB were either stifling academic freedom or not doing enough to protect humanity. So much for good intentions.
The background is the potentially risky experiments demonstrating the pandemic potential of bird flu: NSABB urged that the resulting papers not include “the methodological and other details that could enable replication of the experiments by those who would seek to do harm”. But it can merely advise, and is fairly rarely called upon to review potentially risky papers. Do we need something with more teeth, or will free and open research protect us better?
Scientists have made a new strain of bird flu that most likely could spread between humans, triggering a pandemic if it were released. A misguided project, or a good idea? How should we handle dual use research where merely knowing something can be risky, yet this information can be relevant for reducing other risks?
Australia essentially bans sex selection, except to prevent babies being born with serious sex-linked disorders. The National Health and Medical Research Council also prohibits it in its guidelines.
A couple in the state of Victoria is currently appealing to the Victorian Civil and Administrative Tribunal to allow them to access IVF and to deliberately have a girl. The couple have had three boys naturally and lost a daughter soon after birth. They recently had IVF which resulted in a twin pregnancy. The twins were boys. They aborted the pregnancy.
I argued over 10 years ago that there are no good reasons to oppose sex selection in countries like Australia.
Amphetamines and major league baseball are in the news again, with a number of busts made for the prescription drug Adderall, which contains amphetamine stimulants among its active ingredients.
Governments around the world have condemned Wikileaks’ recent release of US diplomatic cables, often while simultaneously denying they matter; the reactions are tellingly similar to the US military’s earlier response, which simultaneously claimed the leaks were highly illegal, dangerous and irrelevant. At the same time, many have defended the release as helping transparency. As David Waldock tweeted: “Dear government: as you keep telling us, if you’ve done nothing wrong, you’ve got nothing to fear”.
Is this correct?