By: Julian Savulescu
Stephen Hawking, the Cambridge physicist, has recently argued in a Discovery Channel documentary that alien life forms probably exist somewhere in the Universe, but that we should avoid contact with them (http://news.bbc.co.uk/1/hi/uk/8642558.stm). His reason is, apparently, that if they are anything like humans, they are likely to be aggressive and will either exterminate us or pillage our resources.
"If aliens visit us, the outcome would be much as when Columbus landed in America, which didn't turn out well for the Native Americans," he said. "We only have to look at ourselves to see how intelligent life might develop into something we wouldn't want to meet."
Of course, aliens might be thoroughly different to us. Whether we should make contact with them will depend on the information available to us about the threat they pose or the benefits they might provide. The tricky bit, however, is deciding whether to seek that information, or more of it, in the first place. Perhaps that issue will be decided for us.
However, Hawking’s interesting claims have another implication for us. He implies that we are the kind of beings one “wouldn’t want to meet”. We are dangerous. Here he is surely right. And our powers will increase exponentially through the relentless progress of technology. We will soon have the superpowers of the aliens Hawking imagines, and we will likely strip our planet of resources and one day make it uninhabitable. We will likely become the very aliens Hawking is imagining.
That is, if we do not destroy ourselves with our technology first. Hawking has discussed this possibility; his solution then was genetic engineering to make ourselves wiser and less aggressive. I have recently been working on human moral enhancement (see references), and I have discussed his claims there. I have argued that if technology, especially cognitive enhancement, continues to increase in power, we must enhance our very limited moral dispositions. We must address the characteristics which have caused us to obliterate peoples and which make us such a threat to ourselves.
But Hawking’s comments provide another reason for human moral enhancement: the threat we pose to the Universe and to other sentient life forms. We have a moral obligation not to become the aggressive, threatening aliens which Hawking rightly fears. We have an obligation not only to ourselves and our world, but to the Universe.
References
Savulescu, J. and Persson, I. (forthcoming). ‘Unfit for the Future? Human Nature, Scientific Progress and the Need for Moral Enhancement’. In Ter Meulen, R., Kahane, G., and Savulescu, J. (eds.), Enhancing Human Capacities. Oxford: Wiley-Blackwell.
Savulescu, J. and Persson, I. (submitted March 2010). ‘The Turn for Ultimate Harm: A Reply to Fenton’. Journal of Medical Ethics.
Persson, I. and Savulescu, J. (forthcoming). ‘Moral Transhumanism’. Journal of Medicine and Philosophy, thematic issue on Transhumanism and Bioethics.
Persson, I. and Savulescu, J. (2008). ‘The Perils of Cognitive Enhancement and the Urgent Imperative to Enhance the Moral Character of Humanity’. Journal of Applied Philosophy 25(3): 162–177.
The television-program image of rapacious aliens coming to take our resources is strongly tied to the idea of aliens as something akin to us, biologically and motivationally. That is likely a mistake. However, there are good reasons to be concerned about expansionist intelligences.
Robin Hanson’s paper “Burning the Cosmic Commons” shows that cultural evolution tends to favour space colonies strongly motivated to use their resources to build more ‘descendant’ space colonies. In the “space ecosystem” this tendency would in the long run lead to the dominance of extreme colonizers that use all available resources to colonize further (there can certainly be sensible colonizers too, but they will tend to be crowded out). Similarly, Stephen M. Omohundro argues in his paper “The Basic AI Drives” that self-improving artificial intelligences will tend to develop tendencies that, if not counteracted, make them dangerously expansionist.
I have argued in one of my own papers that the best defence against any external threat of this kind is a system of self-replicating robotic “police” devices spreading through space and enforcing good behaviour (like not releasing any dangerous replicators). Of course, any aliens encountering these will likely not be pleased, if only because they then no longer have the chance to impose their own policing.
Now, if we somehow manage to become more moral as a species, what ought we to do? It still seems likely that we might send out those police probes to protect ourselves and possibly other, undefended species.
Hawking’s anthropology is doubtful, because his “home-made” anthropological culture of precision is not sufficient. In particular:
1. The analogy with “Native Americans” is probably not scientific; moreover, Columbus-like discoverers cannot be associated with taking “aliens” seriously.
2. To be here [for “alien observers”] is to know how to make contact “with experimental finesse”, etc.