Studying the possible causes of human extinction
Matt Pickles | 24 Apr 13
Anyone skimming the BBC News Online website over breakfast could be forgiven for heading straight back to bed after reading about research into the possible causes of human extinction by Oxford University’s Future of Humanity Institute.
The Future of Humanity Institute is part of the Philosophy Faculty and the Oxford Martin School, and its work has unsurprisingly met with considerable interest – the story was one of the most read on the BBC website today, while an article in Aeon earlier this year attracted a long debate in the comments section beneath.
The Institute looks into existential risks, which Dr Bostrom defines as 'those that threaten the entire future of humanity'. This excludes issues such as pandemics and natural disasters, which could be calamitous but which Dr Bostrom believes humanity would be likely to survive.
Instead, it is experiments in areas such as synthetic biology, nanotechnology and machine intelligence that cause Dr Bostrom to worry.
He says the 'fatal indifference' that machines with artificial general intelligence might have to human life should be considered by researchers in the field, and the Institute held a conference on this issue in December.
'Our species is introducing entirely new kinds of existential risk - threats we have no track record of surviving,' he says. 'Our longevity as a species therefore offers no strong prior grounds for confident optimism.'
The Future of Humanity Institute hopes its warnings about existential risks will reach the ears of decision-makers across the world, and says the issue is not being taken seriously enough at present.
'While millions of pounds are pumped into researching artificial intelligence (AI) and bringing the possibilities of AI closer to reality than ever before, relatively little thought has gone into the ethical and safety implications of AI,' says Dr Stuart Armstrong of the Faculty of Philosophy.
Dr Bostrom's 2002 paper on existential risk and his recent paper, 'Existential Risk Prevention as Global Priority', can be read here.
Top image: Nick Bostrom, director of the Future of Humanity Institute
