Wind-up to end of days

Scientists from the Global Challenges Foundation and the Future of Humanity Institute, Oxford University, have compiled a “scientific assessment about the possibility of oblivion” that sets out the 12 most likely ways human civilisation could end on planet Earth. “(This research) is about how a better understanding of the magnitude of the challenges can help the world to address the risks it faces, and can help to create a path towards more sustainable development,” the study’s authors said.

The likelihood of global coordination to stop climate change is seen by them as the biggest controllable factor in whether the environmental catastrophe can be prevented. They also warn that the impact of climate change could be strongest in the poorest countries and that mass deaths from famines and huge migration trends could cause major global instability.

A mushroom cloud rises from the waters of Bikini Lagoon during the USA’s first series of underwater atomic tests on 7 August 1946. Ships of a “Guinea Pig” fleet can be seen against the huge bank of water at the base of the explosion. While the researchers concede that a nuclear war is less likely than in the previous century, they say evidence suggests “the potential for deliberate or accidental nuclear conflict has not been removed”.


The biggest factor that they say would influence whether one happens would be how relations between current and future nuclear powers develop.

On pandemics, meanwhile, “there are grounds for suspecting that such a high impact epidemic is more probable than usually assumed,” the researchers believe.

The ability of the world’s medical systems to respond to a pandemic was important in preventing a catastrophe, they said, but the biggest threat was simply whether there was an uncontrollable infectious disease out there or not.

They warned that an impact by an asteroid larger than five kilometres across would destroy an area the size of the Netherlands. Such events happen roughly every 20 million years. “Should an impact occur, the main destruction will not be from the initial impact but from the clouds of dust projected into the upper atmosphere… The damage from such an ‘impact winter’ could affect the climate, damage the biosphere, affect food supplies and create political instability.”

Like an asteroid impact, the greatest threat from a super-volcano was a global dust-cloud that would block the sun’s rays and cause a global winter. “The effect of (historic eruptions) could be best compared with that of a nuclear war,” they said.

Humanity either had to conserve the ecosystem or hope that civilisation was not dependent on it. “Species extinction is now far faster than the historic rate,” the study warned.

Global system collapse was a related danger, and humanity had to develop sustainable economies in order to survive it. “The world economic and political system is made up of many actors with many objectives and many links between them,” the study warned. “Such intricate, interconnected systems are subject to unexpected system-wide failures caused by the structure of the network.” Economic collapse could lead to social chaos, civil unrest and a breakdown in law and order.

With plans afoot to artificially replicate the polio virus, the scientists were worried that someone would intentionally build an “engineered pathogen” to wipe out the human race. “Attempts at regulation or self-regulation are currently in their infancy, and may not develop as fast as research does,” they warned.

Nanotechnology’s proponents may tout it as a way to solve problems, but the researchers believe it could present serious problems. “(Nanotechnology) could lead to the easy construction of large arsenals of conventional or more novel weapons made possible by atomically precise manufacturing,” they warned. “Of particular relevance is whether nanotechnology allows the construction of nuclear bombs.”

They also believe that “extreme” artificial intelligences “could not be easily controlled” and would “probably act to boost their own intelligence and acquire maximal resources”. Rather spookily, they said that one of the key factors in our survival was whether “there will be a single dominant AI or a plethora of entities”.

Finally, the researchers warned of “unknown unknowns” and called for “extensive research” into “unknown risks and their probabilities”.

The Independent
