Humanity may have already created its own nemesis, Professor Stephen Hawking warned last week. The Cambridge University physicist claimed that new developments in the field of artificial intelligence (AI) mean that, within a few decades, computers thousands of times more powerful than any in existence today may decide to usurp their creators and effectively end humanity’s 100,000-year dominance of Earth.
This Terminator scenario is taken seriously by many scientists and technologists. Before Prof. Hawking made his remarks, Elon Musk, the genius behind the Tesla electric car and PayPal, had stated that “with artificial intelligence, we are summoning the demon,” calling it a greater threat to humanity’s existence than even nuclear war.
Aside from the rise of the machines, many potential threats to our species, our civilization, and even our planet have been identified. To keep you awake at night, here are seven of the most plausible.
Our solar system is littered with billions of pieces of debris, from the size of large boulders to objects hundreds of kilometres across. We know that, from time to time, these hit the Earth. Sixty-five million years ago, an object – possibly a comet a few times larger than the one on which the Philae probe landed last month – hit the Mexican coast and triggered a global winter that wiped out the dinosaurs. In 1908, a smaller object hit a remote part of Siberia and devastated hundreds of square kilometres of forest. Last week, 100 scientists, including Lord Rees of Ludlow, the Astronomer Royal, called for the creation of a global warning system to alert us if a killer rock is on the way.
Probability: remote in our lifetime, but one day we will be hit.
Result: there has been no strike big enough to wipe out all life on Earth – an “extinction-level event” – for at least three billion years. But a dino-killer would certainly be the end of our civilization and possibly our species.
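The destructive scale of a dino-killer is easy to sketch from first principles: kinetic energy is half the mass times the speed squared. A rough back-of-envelope illustration (the size, density, and impact speed below are assumed round numbers for a 10 km rocky object, not figures from the article):

```python
import math

# Back-of-envelope impact energy for a dino-killer-class object.
# All inputs are assumed round numbers for illustration only.
diameter_m = 10_000   # ~10 km across
density = 3000        # kg/m^3, typical for a rocky body
speed = 20_000        # m/s, a typical impact speed

radius = diameter_m / 2
volume = (4 / 3) * math.pi * radius ** 3   # m^3
mass = density * volume                    # kg
energy_j = 0.5 * mass * speed ** 2         # KE = 1/2 m v^2, in joules

megatons = energy_j / 4.184e15             # 1 megaton of TNT = 4.184e15 J
print(f"Impact energy: {megatons:.1e} megatons of TNT")
```

Under these assumptions the energy comes out in the tens of millions of megatons, many thousands of times the world’s combined nuclear arsenals.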
Prof. Hawking is not worried about armies of autonomous drones taking over the world, but something more subtle – and more sinister. Some technologists believe that an event they call the Singularity is only a few decades away. This is a point at which the combined networked computing power of the world’s AI systems begins a massive, runaway increase in capability – an explosion in machine intelligence. By then, we will probably have handed over control of most of our vital systems, from food distribution networks to power plants, sewage and water treatment works, and the global banking system. The machines could bring us to our knees without a shot being fired. And we cannot simply pull the plug, because they control the power supplies.
Result: if the web wakes up and wants to sweep us aside, we may have a fight on our hands (perhaps even something similar to the man vs. machines battle in the Terminator films). But it is unlikely that the machines will want to destroy the planet – they “live” here, too.
This is possibly the most terrifying short-term threat because it is so plausible. The reason Ebola has not become a worldwide plague – and will not do so – is that it is so hard to transmit, and that it incapacitates and kills its victims so quickly. However, a modified version of the disease that could be transmitted through the air, or that allowed its host to travel around for weeks, symptom-free, could kill many millions. It is unknown whether any terror group has the knowledge or facilities to do something like this, but it is chilling to realize that the main reason we understand Ebola so well is that its potential to be weaponized was quickly recognized by defence experts.
Probability: someone will probably try it one day.
Result: potentially catastrophic. “Ordinary” infectious diseases such as avian-flu strains have the capability to wipe out hundreds of millions of people.
This is still the most plausible “doomsday” scenario. Despite arms-limitation treaties, there are more than 15,000 nuclear warheads and bombs in existence – many more, in theory, than would be required to kill every human on Earth. Even a small nuclear war has the potential to cause widespread devastation. In 2011, a study by NASA scientists concluded that a limited atomic war between India and Pakistan involving just 100 Hiroshima-sized detonations would throw enough dust into the air to cause global temperatures to drop by more than 1.2C for a decade.
Probability: high. Nine states have nuclear weapons, and more want to join the club. The nuclear wannabes are not paragons of democracy.
Result: it is unlikely that even a global nuclear war between Russia and NATO would wipe us all out, but it would kill billions and wreck the world economy for a century. A regional war, we now know, could have effects far beyond the borders of the conflict.
Before it was switched on, the Large Hadron Collider (LHC) – the massive machine at CERN in Switzerland that detected the Higgs boson a couple of years ago – faced a legal challenge from the German scientist Otto Rössler, who claimed the atom-smasher could theoretically create a small black hole by mistake, which would then go on to eat the Earth.
The claim was absurd: the collisions in the LHC are far less energetic than those caused naturally by cosmic rays hitting the planet. But it is possible that, one day, a souped-up version of the LHC could create something that destroys the Earth – or even the universe – at the speed of light.
Probability: very low indeed.
Result: potentially devastating, but don’t bother cancelling the house insurance just yet.
Many scientists have pointed out that there is something fishy about our universe. The physical constants – the numbers governing the fundamental forces and masses of nature – seem fine-tuned to allow life of some form to exist. The great physicist Sir Fred Hoyle once wondered if the universe might be a “put-up job”.
More recently, the Oxford University philosopher Nick Bostrom has speculated that our universe may be one of countless “simulations” running in some alien computer, much like a computer game. If so, we have to hope that the beings behind our fake universe are benign – and do not reach for the off-button should we start misbehaving.
Probability: according to Professor Bostrom’s calculations, if certain assumptions are made, there is a greater than 50% chance that our universe is not real. And the increasingly puzzling absence of any evidence of alien life may be indirect evidence that the universe is not what it seems.
Result: catastrophic, if the gamers turn against us. The only consolation is the knowledge that there is absolutely nothing we can do about it.
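Bostrom’s headline number rests on a simple indifference argument: if simulated minds vastly outnumber “real” ones, then a randomly chosen observer – you – is almost certainly simulated. A minimal sketch of the shape of that calculation (the population counts are arbitrary assumptions chosen purely for illustration):

```python
# Bostrom-style indifference argument: P(simulated) is simply the
# fraction of all observers who are simulated.
# The counts below are arbitrary illustrative assumptions.
real_civilisations = 1
simulations_per_civilisation = 1000   # assume ancestor-simulations are cheap to run

simulated = real_civilisations * simulations_per_civilisation
total = real_civilisations + simulated

p_simulated = simulated / total
print(f"P(simulated) = {p_simulated:.3f}")   # well above 50% under these assumptions
```

The point is not the specific numbers but that any assumption of abundant simulations pushes the probability far past 50%.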
Almost no serious scientists now doubt that human carbon emissions are having an effect on the planet’s climate. The latest report by the Intergovernmental Panel on Climate Change suggested that containing temperature rises to below 2C above the pre-industrial average is now unlikely, and that we face a future three or four degrees warmer than today.
This will not literally be the end of the world – but humanity will need all the resources at its disposal to cope with such a dramatic shift. Unfortunately, the effects of climate change will really start to kick in just at the point when the human population is expected to peak – at about nine billion by the middle of this century. Millions of people, mostly poor, face losing their homes to sea-level rises (by up to a metre or more by 2100) and shifting weather patterns may disrupt agriculture dramatically.
Probability: it is now almost certain that CO2 levels will keep rising to 600 parts per million and beyond. It is equally certain that the climate will respond accordingly.
Result: catastrophic in some places, less so in others (including northern Europe, where temperature rises will be moderated by the Atlantic). The good news is that, unlike with most of the disasters here, we have a chance to do something about climate change now.
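The “three or four degrees” figure is consistent with the standard back-of-envelope climate calculation: radiative forcing grows with the logarithm of the CO2 concentration (the widely used Myhre approximation), and warming scales roughly linearly with forcing. A sketch, assuming a pre-industrial baseline of 280 ppm and an assumed mid-range climate sensitivity:

```python
import math

# Logarithmic forcing approximation: dF = 5.35 * ln(C / C0), in W/m^2.
# The sensitivity figure is an assumed mid-range value for illustration.
c0_ppm = 280          # pre-industrial CO2 concentration
c_ppm = 600           # the level discussed above
sensitivity = 0.8     # K of warming per W/m^2 of forcing (assumed)

forcing = 5.35 * math.log(c_ppm / c0_ppm)   # W/m^2
warming = sensitivity * forcing             # degrees C
print(f"Forcing: {forcing:.2f} W/m^2, warming: {warming:.1f} C")
```

With these assumptions the answer lands just above 3C – squarely in the range the IPCC report describes.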