21 July 2023

How Likely Is A Catastrophe Or Extinction Level Event?

I trust superforecasters over domain experts because a substantial body of evidence shows that superforecasters without domain expertise have the better track record.

While the two groups' assessments of the risk to humanity from different sources differed, both indicate that AI turning on humans is a genuine concern that ranks high on the threat list.

Superforecasters predict a 9% chance of a catastrophe by 2100 (defined as an event killing 10% or more of all humans within a five-year period, roughly 100 times worse than COVID-19), including a 1% chance of an extinction event by 2100 (defined as an event reducing the human population to 5,000 people or fewer). While lower than the percentages estimated by "experts" and the general public, these are still troublingly high.

The main disagreements were over the likelihood of a catastrophe by 2100 due to nuclear war (domain experts said 8%, superforecasters said 4% - about 2-1), due to a bioweapon (domain experts said 3%, superforecasters said 0.8% - about 4-1), and due to an artificial intelligence singularity in which artificial intelligences turn on humans (domain experts said 12%, superforecasters said 2.13% - about 6-1). This led to an overall predicted risk of catastrophe by 2100 of 28.95% from domain experts v. 9.05% from superforecasters - about 3-1.
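As a quick cross-check of the ratios quoted above, here is a minimal Python sketch; the figures are simply the ones cited in this post, not pulled directly from the underlying report:

```python
# Catastrophe-by-2100 estimates quoted above, in percent:
# (domain experts, superforecasters). Figures as cited in this post.
catastrophe_estimates = {
    "nuclear war": (8.0, 4.0),
    "bioweapon": (3.0, 0.8),
    "AI": (12.0, 2.13),
    "overall": (28.95, 9.05),
}

for threat, (experts, superforecasters) in catastrophe_estimates.items():
    ratio = experts / superforecasters
    print(f"{threat}: experts {experts}% vs. superforecasters "
          f"{superforecasters}% (about {round(ratio)}-1)")
```

Running it reproduces the rough ratios in the paragraph above: 2-1 for nuclear war, 4-1 for a bioweapon, 6-1 for AI, and 3-1 overall.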

They also differed in the rank order of the threats. Superforecasters saw a nuclear war catastrophe as twice as likely as an AI catastrophe, while domain experts saw an AI catastrophe as 50% more likely than a nuclear war catastrophe. But both agreed that, among the five possibilities presented, these two threats were the most serious.

Domain experts ranked a bioweapon as the #3 risk, while superforecasters saw a natural pathogen as more likely than a bioweapon.

They agreed that a catastrophe not caused by humans was the least likely among the options presented, with domain experts estimating a 0.09% chance and superforecasters estimating a 0.05% chance. Both agreed that the overwhelmingly more likely threats of mass death come from human-caused events.

The extinction risk (defined as the human population on Earth falling below 5,000 by 2100) was similarly split, with the biggest divides again on nuclear war risk (domain experts 0.55% v. superforecasters 0.074% - about 7-1), bioweapons risk (domain experts 1% v. superforecasters 0.01% - about 100-1), and AI risk (domain experts 3% v. superforecasters 0.38% - about 8-1). This led to an overall predicted risk of extinction by 2100 of 6% from domain experts v. 1% from superforecasters - about 6-1.
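The same quick cross-check works for the extinction figures; again, this sketch uses only the numbers cited in this post:

```python
# Extinction-by-2100 estimates quoted above, in percent:
# (domain experts, superforecasters). Figures as cited in this post.
extinction_estimates = {
    "nuclear war": (0.55, 0.074),
    "bioweapon": (1.0, 0.01),
    "AI": (3.0, 0.38),
    "overall": (6.0, 1.0),
}

for threat, (experts, superforecasters) in extinction_estimates.items():
    ratio = experts / superforecasters
    print(f"{threat}: experts {experts}% vs. superforecasters "
          f"{superforecasters}% (roughly {round(ratio)}-1)")
```

This gives roughly 7-1 for nuclear war, 100-1 for a bioweapon, 8-1 for AI, and 6-1 overall, matching the ratios above.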

Both superforecasters and experts, however, see an AI event as the single most likely potential cause of human extinction by the year 2100.

From here.
