Compiled from an evolving list of the ten most likely causes of the end of civilisation, as evaluated by the top scientific minds and publications in the world, we will use interviews with a highly credible cross-section of experts in various fields to gain further understanding of the direction humanity is taking and its possible ultimate fate.

NUCLEAR WAR:

As of 2020, humanity has about 13,410 nuclear weapons, thousands of which are on hair-trigger alert. While stockpiles have declined since the end of the Cold War, every nuclear country is currently modernizing its arsenal. The Bulletin of the Atomic Scientists advanced its symbolic Doomsday Clock in 2015, citing among other factors "a nuclear arms race resulting from modernization of huge arsenals".

In a poll of experts at the Global Catastrophic Risk Conference in Oxford (17‐20 July 2008), the Future of Humanity Institute estimated the probability of complete human extinction by nuclear weapons at 1% within the century, the probability of 1 billion dead at 10% and the probability of 1 million dead at 30%. These results reflect the median opinions of a group of experts, rather than a probabilistic model; the actual values may be much lower or higher.
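To get a rough sense of what those survey medians imply, one can naively combine them with fatality figures. This is a back-of-envelope sketch only: the point-outcome treatment, the overlap between scenarios, and the world-population figure below are illustrative assumptions, not part of the survey.

```python
# Naive expected-fatalities calculation from the survey's median estimates.
# Treats each "at least N dead" threshold as a point outcome and ignores
# the overlap between scenarios, so it overstates precision considerably.
scenarios = {
    "at least 1 million dead": (0.30, 1_000_000),
    "at least 1 billion dead": (0.10, 1_000_000_000),
    "human extinction":        (0.01, 8_000_000_000),  # assumed world population
}

expected_deaths = sum(p * n for p, n in scenarios.values())
print(f"Naive expected nuclear-war deaths this century: {expected_deaths:,.0f}")
```

Even this crude arithmetic shows why a 1% extinction probability dominates ethical discussion despite being the least likely outcome in the poll.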

Scientists have argued that even a small-scale nuclear war between two countries could have devastating global consequences and such local conflicts are more likely than full-scale nuclear war.

Nuclear war was the first man-made global catastrophic risk, as a global war could kill a large percentage of the human population. As more research into nuclear threats was conducted, scientists realized that the resulting nuclear winter could be even deadlier than the war itself, potentially killing most people on Earth.

BIO WARFARE:

Biological warfare, also known as germ warfare, is the use of biological toxins or infectious agents such as bacteria, viruses, insects, and fungi with the intent to kill or incapacitate humans, animals, or plants as an act of war. It was heavily used by Unit 731 of the Imperial Japanese Army during World War II against the Soviet Union and China. As many as 400,000 Chinese died of bubonic plague, cholera, anthrax, and other diseases planted by Unit 731. An attack called Operation Cherry Blossoms, which would have affected 80,000 people in Southern California, was also planned by Ishii Shiro, the leader of Unit 731.

In the decades to come, advanced bioweapons could threaten human existence. Although the probability of human extinction from bioweapons may be low, the expected value of reducing the risk could still be large, since such risks jeopardise the existence of all future generations. Researchers who have surveyed biotechnological extinction risk, made rough initial estimates of how severe the risks might be, and compared the cost-effectiveness of reducing these extinction-level risks with existing biosecurity work find that reducing human extinction risk can be more cost-effective than reducing smaller-scale risks, even under conservative estimates. This suggests that the risks are not low enough to ignore and that more ought to be done to prevent the worst-case scenarios.
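The shape of that cost-effectiveness argument can be sketched with entirely hypothetical numbers. None of the figures below come from the research mentioned above; they only show how a tiny probability multiplied by an enormous stake can still yield a modest cost per life saved.

```python
# Toy expected-value sketch with HYPOTHETICAL inputs.
extinction_prob = 0.01           # assumed: 1% chance of bio-extinction this century
risk_reduction = 0.01            # assumed: a programme removes 1% of that risk
lives_at_stake = 8_000_000_000   # current population only (ignores future generations)
programme_cost = 250e9           # assumed: $250bn global biosecurity programme

expected_lives_saved = extinction_prob * risk_reduction * lives_at_stake
cost_per_life = programme_cost / expected_lives_saved
print(f"Expected lives saved: {expected_lives_saved:,.0f}")
print(f"Cost per expected life saved: ${cost_per_life:,.0f}")
```

Counting future generations, as the argument above does, would lower the cost per life saved by orders of magnitude.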

Regulations on biological and genetic research vary widely between countries – but making weapons with such techniques is largely illegal under the 1975 Biological Weapons Convention. Some experts worry, however, that recent advances may make it easier to design more effective and lethal new pathogens. In February 2017, Microsoft co-founder Bill Gates warned that a conflict involving such weapons could kill more people than nuclear war.

CLIMATE CHANGE:

Some world governments have ignored the advice of scientists and the will of the public to decarbonize the economy (by finding alternative energy sources), resulting in a projected global temperature increase of 5.4°F (3°C) by the year 2050. According to NASA's Earth Data, at this point the world's ice sheets vanish; brutal droughts kill many of the trees in the Amazon rainforest (removing one of the world's largest carbon offsets); and the planet plunges into a feedback loop of ever-hotter, ever-deadlier conditions.
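A note on the units: temperature differences convert between Celsius and Fahrenheit with the 9/5 factor alone (the +32 offset applies only to absolute temperatures), which is why a 3°C rise appears as 5.4°F.

```python
def delta_c_to_f(delta_c: float) -> float:
    """Convert a temperature *difference* (not an absolute temperature)
    from Celsius to Fahrenheit: multiply by 9/5, no offset."""
    return delta_c * 9 / 5

print(delta_c_to_f(3))  # -> 5.4
```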

Thirty-five percent of the global land area, and 55 percent of the global population, will be subject to more than 20 days a year of lethal heat conditions, beyond the threshold of human survivability.

Additionally, droughts, floods and wildfires regularly ravage the land. Nearly one-third of the world's land surface turns to desert. Entire ecosystems collapse, beginning with the planet's coral reefs (a process that has already begun), along with the deforestation of the rainforest and the rapid melting of Arctic ice sheets. The world's tropics are hit hardest by these new climate extremes, destroying the region's agriculture and turning more than 1 billion people into refugees.

This mass movement of refugees — coupled with shrinking coastlines and severe drops in food and water availability — will eventually stress the fabric of the world's nations, including the United States. Armed conflicts over resources, perhaps culminating in nuclear war, are likely. The result, according to new studies, is "outright chaos" and perhaps "the end of human global civilization as we know it."

PANDEMICS:

Even before 2020, natural and engineered pandemic disease was one of the most-studied global risks. It was an area given new urgency by the controversy over "gain of function" experiments, which involve taking a known pathogen and adding extra, risky functionality.

For example, in 2011, virologists Ron Fouchier and Yoshihiro Kawaoka created a strain of the bird flu virus that could be transmitted between ferrets. This was done in order to better understand the conditions in which the virus might develop transmissibility in the wild.

Such experiments can head off certain risks but create an arguably greater one: the modified organism might escape the lab and cause a global pandemic. We need look no further than Covid-19 to see the plausibility of such a scenario.

The risk of a pandemic is particularly great because it is self-replicating. Whereas a nuclear explosion is localised, in our highly connected world a synthetic, incurable virus could spread around the planet in days. In the past, natural pandemics such as the Black Death have killed millions and effected wholesale social changes. In the 21st century, advanced biotechnology could create something that makes the Black Death look like a nasty cold.
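The "spread in days" point is a consequence of exponential growth. A minimal sketch, using assumed figures (an R of 3 and a 5-day generation time, chosen for illustration rather than taken from any specific epidemic):

```python
# Each case infects R others per "generation" of g days, so the case count
# after t days grows roughly as R ** (t / g) in the early, unchecked phase.
def cases(r: float, generation_days: float, days: float) -> float:
    return r ** (days / generation_days)

for day in (10, 20, 30):
    print(f"day {day}: ~{cases(3, 5, day):,.0f} cases")
```

Real epidemics slow as susceptible hosts run out, but it is this early doubling behaviour that makes a highly connected world so vulnerable.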

The failure to manage a natural pandemic has long been seen as a serious threat: an event that could cause human extinction or permanently and drastically curtail humanity's potential. As 2020 has reminded us, the fragile balance between nature and science keeps us teetering on the edge of the abyss.

Natural pandemics have killed more people than wars. However, they are unlikely to be existential threats: there are usually some people resistant to the pathogen, and the offspring of survivors would be more resistant still. Even so, an aggressive virus may yet prove to be a living enemy that is impossible to beat.

SUPER VOLCANO:

Quietly lurking beneath Yellowstone National Park in the US is a "supervolcano" with the potential to wipe out humanity when it next erupts.

There are several other "supervolcanoes" — volcanoes capable of an eruption that produces more than 240 cubic miles of magma — around the world that scientists and volcanologists are constantly watching, including the one at Lake Toba in Indonesia.

There have been three major volcanic eruptions at Yellowstone in the last 2 million years, according to the National Park Service (NPS), and two of them can be considered supervolcano events. The NPS writes on its website that another supervolcano eruption is possible, and several recent eruptions around the world have experts worried.

The vast quantities of lava and ash that are spewed out of the Earth during supervolcano events have the potential to cause long-lasting climate change that could trigger a life-threatening ice age or global warming.

This would also devastate crops and destroy the power grid. Without sunlight, warmth and food, a major loss of life would follow. How bad? As Bryan Walsh writes in his New York Times op-ed, a 2015 report for the European Science Foundation on extreme geohazards called what might happen "the greatest catastrophe since the dawn of civilization."

NANOTECHNOLOGY:

A recent Harvard University report notes that "atomically precise manufacturing" could have a range of benefits for humans. It could help to tackle challenges including the depletion of natural resources, pollution and climate change. But the report foresees risks too, given how explosively such technology could advance.

“It could create new products – such as smart or extremely resilient materials – and would allow many different groups or even individuals to manufacture a wide range of things,” suggests the report. “This could lead to the easy construction of large arsenals of conventional or more novel weapons made possible by atomically precise manufacturing.”

Nanotechnology is the control over matter with atomic or molecular precision. That is in itself not dangerous – instead, it would be very good news for most applications. The problem is that, like biotechnology, increasing power also increases the potential for abuses that are hard to defend against.

Weapons can also be small, precision things: a "smart poison" that acts like a nerve gas but seeks out victims, or ubiquitous "gnatbot" surveillance systems for keeping populations obedient are not only possible, but likely. Also, there might be ways of getting nuclear proliferation and climate engineering into the hands of anybody who wants it. 

We cannot judge the likelihood of existential risk from future nanotechnology, but it looks potentially disruptive simply because it could give us whatever we wish for… the most dangerous risk of all!

HOSTILE AI:

As with nanotechnology, artificial intelligence (AI) has long been associated with science fiction, but it's a field that has made significant strides in recent years. As with biotechnology, there is great opportunity to improve lives with AI, but if the technology is not developed safely, there is also the chance that someone could accidentally or intentionally unleash an AI system that ultimately causes the elimination of humanity.

The creators of a superintelligent entity could inadvertently give it goals that lead it to annihilate the human race. A survey of AI experts estimated that the chance of human-level machine intelligence having an "extremely bad (e.g., human extinction)" long-term effect on humanity is as great as that of any other threat.

Intelligence is very powerful. Consider: a tiny increment in problem-solving ability and group coordination is why we left the other primates in the dust. Their continued existence now depends on human decisions, not on anything they do.

The problem is that intelligent entities are good at achieving their goals, but if those goals are badly set, the entities can use their power to cleverly achieve disastrous ends. There is no reason to think that intelligence by itself will make something behave nicely and morally. In fact, it is possible to prove that certain types of superintelligent systems would not obey moral rules even if those rules were true.

Software-based intelligence may very quickly go from below human levels to frighteningly powerful. It has been proposed that an "intelligence explosion" is possible when software becomes good enough at making better software. Should such a jump occur there would be a large difference in potential power between the smart system (or the people telling it what to do) and the rest of the world. This has clear potential for disaster.

ASTEROID IMPACT:

Earth would be hit by small asteroids constantly were it not for the atmosphere, which burns up anything less than ten metres across. This is convenient, as even a ten-metre rock carries kinetic energy comparable to that of the Hiroshima nuclear bomb. The planet is hit by an asteroid or comet measuring more than ten metres once or twice every 1,000 years. Every million years or so, an asteroid spanning at least one kilometre will also hit Earth, which can be enough to affect its climate and cause crop failures that would put the population at risk.
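The Hiroshima comparison is a straightforward kinetic-energy estimate. The density and entry speed below are assumed textbook-style figures, so the result is an order-of-magnitude sketch, not a precise equivalence.

```python
import math

def impact_energy_joules(diameter_m: float,
                         density_kg_m3: float = 3000.0,  # assumed: stony asteroid
                         speed_m_s: float = 20_000.0     # assumed: typical entry speed
                         ) -> float:
    """Kinetic energy E = 1/2 * m * v^2 of a spherical impactor."""
    radius = diameter_m / 2
    mass = density_kg_m3 * (4 / 3) * math.pi * radius ** 3
    return 0.5 * mass * speed_m_s ** 2

HIROSHIMA_J = 6.3e13  # ~15 kilotons of TNT

energy = impact_energy_joules(10)
print(f"10 m rock: {energy:.1e} J (~{energy / HIROSHIMA_J:.0f}x Hiroshima)")
```

At these assumed values a ten-metre rock comes out at a few times the Hiroshima yield; at the minimum possible entry speed of about 11 km/s it lands close to a single Hiroshima, consistent with the comparison above.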

The really serious, existential-threat-level strikes, such as the impact that carved the 180 km Chicxulub crater and wiped out the dinosaurs around 66 million years ago, come once every 50 to 100 million years. That may be enough to cause worry, but it's reassuring to know that a) astronomers keep a close eye on larger objects posing a danger to Earth, and b) there is a whole interdisciplinary community of scientists working out what to do if one gets too close. However, the threat is very real, and over a long enough timescale the outcome is inevitable.

“As we gaze into the future, it turns out that the universe is a pretty dangerous place.

“The possibility of one of these wiping us out isn’t just the stuff of Hollywood disaster movies, the threat from asteroids is real.”

– Stephen Hawking, 2010

E.T. THREAT:

Intelligent extraterrestrial life, if it exists, could invade Earth to exterminate and supplant human life, enslave it under a colonial system, strip the planet's resources, or destroy the planet altogether.

Although the existence of alien life has never been proven, scientists such as Carl Sagan have postulated that extraterrestrial life is very likely. In 1969, the "Extra-Terrestrial Exposure Law" was added to the United States Code of Federal Regulations (Title 14, Section 1211) in response to the possibility of biological contamination resulting from the U.S. Apollo Space Program. It was removed in 1991. Scientists consider an invasion scenario technically possible, but unlikely.

More recently, an article in The New York Times discussed the possible threats to humanity of intentionally sending messages aimed at extraterrestrial life into the cosmos, in the context of SETI efforts. Several renowned public figures, such as Stephen Hawking and Elon Musk, have argued against sending such messages on the grounds that technological extraterrestrial civilizations are probably far more advanced than humanity and could pose an existential threat.

Invasion by militarily superior extraterrestrials is often considered a scenario purely from the realm of science fiction, yet professional SETI researchers have given the possibility serious consideration, though they conclude that it is unlikely. Even so, the past several years have yielded more credible video evidence, filmed and witnessed by respected sources all over the globe, along with a statement attributed to the U.S. Pentagon in 2021 that it is "in possession of off-world vehicles and materials not made on this planet", substantiating the likelihood that intelligence beyond human understanding does indeed exist.

MASS STARVATION:

The global population is forecast to hit 9.6 billion by 2050. Experts argue that to avoid mass starvation, we will need to increase food production by 70 per cent in just over 30 years. The challenge is that advances in food-production techniques, which have allowed humans to keep pace with population growth since 1950, largely relied on fossil fuels. In addition, cultivable land is being reduced by factors including topsoil erosion.
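As a quick sanity check on those figures, a 70 per cent increase over roughly 30 years corresponds to a compound annual growth rate in food production of just under 2 per cent:

```python
# Implied compound annual growth rate: (1 + total_increase)^(1/years) - 1
years = 30
total_increase = 0.70
annual_growth = (1 + total_increase) ** (1 / years) - 1
print(f"Implied annual growth in food production: {annual_growth:.2%}")
```

Sustaining roughly 1.8 per cent growth every year for three decades, without the cheap fossil-fuel inputs that powered past gains, is what makes the target so daunting.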

There are also risks associated directly with the nature of the foods we eat. It's widely believed that humans will need to eat less meat and more grains. However, while advances in crop development have produced varieties that can grow in inhospitable places, they have also increased vulnerability to disease. Whole tracts of wheat, the world's third-most popular cereal crop, could be wiped out by fungal infections, and synthetic viruses can only increase the risk of catastrophe.

Experts have predicted that the impact will be felt through sharp price rises around 2020 and 2021, with the situation becoming critical in developing countries by the middle of the century.