Swapnil Garg

Existential Risk: Good or Bad?

In Effective Altruism, a central question is what the biggest priority should be. Nick Bostrom is a well-known advocate for decreasing existential risk. In his paper “Astronomical Waste: The Opportunity Cost of Delayed Technological Development,” he estimates that the Virgo Supercluster could support 10^23 human lives at any one time. Thus, decreasing the probability of existential catastrophe, or increasing the probability of eventual colonization of the universe, by even a very small amount could massively increase total expected utility. However, Phil Torres argues that space colonization would cause suffering on an astronomical scale, and therefore contends that it is most desirable to delay space colonization or prevent it from happening (https://www.lesswrong.com/posts/zwpYGNQXvdD64FE3z/could-the-maxipok-rule-have-catastrophic-consequences-i). It seems that either decreasing or increasing existential risk would do the most good or prevent the most suffering for humans, so one of the two should be the cause area with the highest priority. But which one? I argue that the highest priority should be given to determining whether decreasing or increasing existential risk is better.
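To see why Bostrom’s argument is so forceful, it helps to make the arithmetic explicit. Below is a minimal sketch in Python: the 10^23 figure is his, while the one-in-a-billion risk reduction is a purely hypothetical number chosen for illustration.

```python
# A rough sketch of Bostrom's opportunity-cost arithmetic. The 10^23
# figure is from "Astronomical Waste"; the risk reduction below is a
# hypothetical number chosen only for illustration.

POTENTIAL_LIVES = 1e23  # Bostrom's estimate for the Virgo Supercluster

def expected_lives_secured(risk_reduction: float) -> float:
    """Expected future lives secured by lowering the probability
    of existential catastrophe by `risk_reduction`."""
    return risk_reduction * POTENTIAL_LIVES

# Even a one-in-a-billion reduction in risk has a huge expected payoff.
print(f"{expected_lives_secured(1e-9):.2e}")  # prints 1.00e+14
```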

I believe that astronomical suffering would be far worse than astronomical colonization would be good. For one thing, extreme pain can be inflicted on the human body, while I do not think the body is capable of comparably extreme pleasure. In addition, Bostrom notes that some utilitarians adopt a person-affecting view, on which they primarily try to maximize the utility of past and present human lives, so that a loss of merely potential future lives is not bad. However, while these utilitarians might not care about an astronomical number of future human lives, surely they would object to those lives being lived in near-constant suffering. So even if Phil Torres’s prediction of astronomical suffering has only a very small probability of coming true, the expected amount of suffering is still very large, while the expected benefits of space colonization may not be as large. Therefore, more research on “suffering risk” should be the highest-priority cause area.
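Here is a toy version of this expected-value comparison. All of the probabilities and moral weights below are hypothetical assumptions, chosen only to illustrate the asymmetry claim that a life of extreme suffering outweighs a good life.

```python
# A toy expected-value comparison. Every number here is an illustrative
# assumption, not an estimate from Bostrom or Torres.

LIVES = 1e23             # Bostrom's Virgo Supercluster figure
P_SUFFERING = 0.01       # hypothetical chance of Torres-style astronomical suffering
P_FLOURISHING = 0.99     # hypothetical chance colonization goes well
W_SUFFERING = -300.0     # illustrative weight: extreme suffering counts heavily
W_FLOURISHING = 1.0      # illustrative weight: a good life counts as one unit

expected_value = LIVES * (P_SUFFERING * W_SUFFERING
                          + P_FLOURISHING * W_FLOURISHING)
# Per life: 0.01 * -300 + 0.99 * 1 = -2.01, so the total is negative
# even though suffering is 99-to-1 unlikely under these assumptions.
print(f"{expected_value:.2e}")  # prints -2.01e+23
```

The point of the sketch is not the particular numbers but the structure: if suffering is weighted heavily enough, even a small probability of astronomical suffering dominates the calculation.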

Phil Torres also notes that throughout history there have been individuals who intended to cause as much suffering as possible. It seems almost certain that many more such individuals will exist in the future if humanity does not go extinct. With enough intelligence, drive, and access to large-scale weapons, they could conceivably cause astronomical suffering. If the proportion of such humans remains constant, then a vastly larger future population would contain vastly more people intent on causing massive suffering, and if even one of them obtained access to the most advanced weapons, catastrophic suffering could result. So more research into methods of decreasing such tendencies in people is also of the utmost priority.
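A back-of-the-envelope calculation shows how this scales. The one-in-a-million fraction below is a hypothetical assumption; the population figure is Bostrom’s again.

```python
# If the proportion of people intent on causing mass suffering stays
# constant, their absolute number scales with the population. The
# one-in-a-million fraction is a hypothetical assumption.

MALEVOLENT_FRACTION = 1e-6   # hypothetical: one person in a million
FUTURE_POPULATION = 1e23     # Bostrom's concurrent-lives figure

print(f"{MALEVOLENT_FRACTION * FUTURE_POPULATION:.0e}")  # prints 1e+17
```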