A growing number of scientists, mathematicians and philosophers are concerned that humanity may be on the verge of omnicide: that is, human extinction as a result of human actions. A chief factor in overcoming this threat is learning more about human prejudices and biases, and taking them into account before moving ahead with advanced technology projects that could lead to irreversible “unintended consequences.”
Dr. Nick Bostrom, Director of Oxford University’s Future of Humanity Institute, concedes in his paper “Existential Risk Prevention as Global Priority” that pandemics and natural disasters could cause catastrophic loss of life, but argues that humanity would likely survive them. Other threats, such as asteroid impacts or supervolcanoes, he believes pose only a very slight risk to humanity. There is a substantial body of research on “human extinction scenarios,” some of which we’ll cover here.
As a side note, renowned scientist Stephen Hawking recently suggested the human race should move to space in light of the possibility of extinction.
A Brave New World of Advanced Technology
What concerns Bostrom most is the new technological era: synthetic biology, nanotechnology, machine intelligence, genetics, and others. He likens our use of these technologies to a child playing with a loaded firearm. These technologies offer big upsides, but they also carry serious risks if we wield them irresponsibly.
Experiments that carry out genetic modifications, dismantling and rebuilding genetic structures, could have unintended consequences that spread through the population, the environment, or both. Recent examples of complications outrunning human foresight include Japan’s Fukushima nuclear accident and China’s mounting problems with the Three Gorges Dam project. Humans simply cannot anticipate every outcome of their activities, but they could invest more time up front analyzing how a technology might develop before pressing ahead with it.
This is the crux of Bostrom’s argument: he sees the next 87 years as a critical period in which a rapidly advancing technological civilization could end up snuffing out humanity.
Video of Professor Nick Bostrom discussing his research:
Extinction Scenarios
The number of extinction scenarios is already quite large, but unforeseen events are the real concern. Known scenarios include loss of breathable air on the planet; an extreme ice age leading to global drought; a nuclear winter destroying forests and plants; mass deforestation; contamination of all fresh water on Earth; collapse of crops, plants, or critical insects and animals such as bees; and the melting of the ice sheets over the next 300 years, which could raise sea levels by as much as 15 feet worldwide.
A historical anecdote illustrates the position humanity will continually be placed in as it brings new technologies to the world: in the 1940s, before the Trinity nuclear test, scientists were concerned that a fission explosion might trigger a runaway reaction and destroy Earth’s atmosphere. Hans Bethe calculated that such a possibility was theoretically impossible, and the project moved forward. The question is: will each and every technological advancement be evaluated for its potential negative effects on the world by a genius like Hans Bethe?
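The kind of calculation at issue here, estimating how likely it is that uncertain factors combine into a catastrophic outcome, is today often approached with Monte Carlo simulation: sample the uncertain inputs many times and count how often the result crosses a critical threshold. A minimal sketch in Python, where the three input distributions and the threshold are purely illustrative assumptions, not a model of any real risk:

```python
import random

def monte_carlo_risk(trials=100_000, threshold=3.0, seed=42):
    """Estimate the probability that a combination of uncertain
    factors exceeds a critical threshold, by random sampling.
    The inputs and threshold here are illustrative only."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    exceed = 0
    for _ in range(trials):
        # Three hypothetical uncertain inputs, each drawn from a
        # standard normal distribution (mean 0, std dev 1).
        total = sum(rng.gauss(0.0, 1.0) for _ in range(3))
        if total > threshold:
            exceed += 1
    return exceed / trials

print(monte_carlo_risk())
```

The estimate converges toward the true exceedance probability as the trial count grows; the catch, as the anecdote suggests, is that the answer is only as good as the assumed distributions, which is exactly where human foresight tends to fail.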