existential risk

The goodness of being multi-planetary

The Economist has a leader, “For life, not for an afterlife”, in which it argues that Elon Musk’s stated motivation to settle Mars – making humanity a multi-planetary species less likely to go extinct – is misguided: “Seeking to make Earth expendable is not a good reason to settle other planets”. Is it misguided, or is the Economist’s reasoning misguided? Continue reading

Moral Agreement on Saving the World

There appears to be a lot of disagreement in moral philosophy. Whether or not these many apparent disagreements are deep and irresolvable, I believe there is at least one thing it is reasonable to agree on right now, whatever general moral view we adopt: that it is very important to reduce the risk that all intelligent beings on this planet are eliminated by an enormous catastrophe, such as a nuclear war. How we might in fact try to reduce such existential risks is discussed elsewhere. My claim here is only that we – whether we’re consequentialists, deontologists, or virtue ethicists – should all agree that we should try to save the world.

Continue reading

Petrov Day

Thirty-one years ago today, the human species nearly came to an end. Lieutenant Colonel Stanislav Petrov was the officer on duty in bunker Serpukhov-15 near Moscow, monitoring the Soviet Union’s early-warning satellite network. If the network reported that it had detected approaching missiles, the official strategy was launch on warning: an immediate counter-attack against the United States. International relations were on a hair trigger: just days before, Korean Air Lines Flight 007 had been shot down by Soviet fighter jets, killing everybody on board (including a US congressman). The Kremlin claimed the jet had been on a spy mission, or even deliberately trying to provoke war.

Shortly after midnight, the computers reported a single intercontinental missile heading towards Russia.

Continue reading

Live from the shooting gallery: what price impact safety?

As I am writing this post, asteroid 2012 DA14 is sweeping past Earth, inside geosynchronous orbit (in fact, I am watching it on a live webcast). Earlier today, an unrelated impactor disintegrated above Chelyabinsk, producing some dramatic footage and some injuries from glass shattered by the sonic boom. It might have been the largest impactor of the last century, clocking in at hundreds of kilotons. It is no wonder people are petitioning the White House to mount a vigorous planetary defense against asteroids and comets. But what is the rational and ethical level of defense we need against astronomical threats?

Continue reading

Terminator studies and the silliness heuristic

The headlines are invariably illustrated with red-eyed robot heads: “I, Revolution: Scientists To Study Possibility Of Robot Apocalypse”. “Scientists investigate chances of robots taking over”. “‘Terminator’ risk to be investigated by scientists”. “Killer robots? Cambridge brains to assess AI risk”. “‘Terminator centre’ to open at Cambridge University: New faculty to study threat to humans from artificial intelligence”. “Cambridge boffins fear ‘Pandora’s Unboxing’ and RISE of the MACHINES: ‘You’re more likely to die from robots than cancer’”…

The real story is that the Cambridge Project for Existential Risk is planning to explore threats to the survival of the human species, including future technological risks. Among them are, of course, risks from autonomous machinery – risks that some people regard as significant, others as minuscule (see for example here, here and here). Should we spend valuable researcher time and grant money analysing such risks?

Continue reading

Ferreting out fearsome flu: should we make pandemic bird flu viruses?

Scientists have made a new strain of bird flu that most likely could spread between humans, triggering a pandemic if it were released. A misguided project, or a good idea? How should we handle dual-use research, where merely knowing something can be risky, yet that knowledge can be relevant for reducing other risks?

Continue reading
