Chapter 353 - Astronomical Waste: The Opportunity Cost Of Delayed Technological Development

Arguably, however, we ought to assign a non-negligible probability to some current people surviving long enough to reap the benefits of a cosmic diaspora. A so-called technological "singularity" might occur in our natural lifetime, or there could be a breakthrough in life extension, brought about, perhaps, as a result of machine-phase nanotechnology that would give us unprecedented control over the biochemical processes in our bodies and enable us to halt and reverse the aging process. Many leading technologists and futurist thinkers give a fairly high probability to these developments happening within the next several decades. Even if you are skeptical about their prognostications, you should consider the poor track record of technological forecasting. In view of the well-established unreliability of many such forecasts, it would seem unwarranted to be so confident in one's prediction that the requisite breakthroughs will not occur in our time as to give the hypothesis that they will a probability of less than, say, 1%.

The expected utility of a 1% chance of realizing an astronomically large good could still be astronomical. But just how good would it be for (some substantial subset of) currently living people to get access to astronomical amounts of resources? The answer is not obvious. On the one hand, one might reflect that in today's world, the marginal utility for an individual of material resources declines quite rapidly once his basic needs have been met. Bill Gates' level of well-being does not seem to dramatically exceed that of many a person of much more modest means. On the other hand, advanced technologies of the sorts that would most likely be deployed by the time we could colonize the local supercluster may well provide new ways of converting resources into well-being. In particular, material resources could be used to greatly expand our mental capacities and to indefinitely prolong our subjective lifespan. And it is by no means clear that the marginal utility of extended healthspan and increased mental powers must be sharply declining above some level. If there is no such decline in marginal utility, we have to conclude that the expected utility to current individuals of successful colonization of our supercluster is astronomically great, and this conclusion holds even if one gives a fairly low probability to that outcome. A long shot it may be, but for an expected utility maximizer, the benefit of living for perhaps billions of subjective years with greatly expanded capacities under fantastically favorable conditions could more than make up for the remote prospects of success.
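The expected-utility reasoning above can be stated compactly. The particular magnitudes below are illustrative assumptions chosen only to show the structure of the argument, not figures taken from the essay:

```latex
% Expected utility of a long-shot astronomical payoff (illustrative sketch).
% Let p be the probability that current people survive to benefit from
% colonization, and U the utility of that outcome.
\[
  \mathbb{E}[U] = p \cdot U_{\text{colonization}}
\]
% If, for illustration, U_colonization is on the order of 10^{30}
% (billions of subjective years at high well-being) and p = 0.01, then
\[
  \mathbb{E}[U] \approx 0.01 \times 10^{30} = 10^{28},
\]
% which remains astronomically large: multiplying an astronomical good
% by any non-negligible probability leaves an astronomical expectation.
```

The point being illustrated is structural: so long as the marginal utility of resources does not sharply decline, the low probability factor cannot bring the expectation down to an ordinary scale.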

Now, if these assumptions are made, what follows about how a person-affecting utilitarian should act? Clearly, avoiding existential calamities is important, not just because doing otherwise would truncate the natural lifespans of six billion or so people, but also – and given the assumptions this is an even weightier consideration – because it would extinguish the chance that current people have of reaping the enormous benefits of eventual colonization. However, by contrast to the total utilitarian, the person-affecting utilitarian would have to balance this goal with another equally important desideratum, namely that of maximizing the chances of current people surviving to benefit from the colonization. For the person-affecting utilitarian, it is not enough that humankind survives to colonize; it is crucial that extant people be saved. This should lead her to emphasize speed of technological development, since the rapid arrival of advanced technology would surely be needed to help current people stay alive until the fruits of colonization could be harvested. If the goal of speed conflicts with the goal of global safety, the total utilitarian should always opt to maximize safety, but the person-affecting utilitarian would have to balance the risk of people dying of old age with the risk of them succumbing in a species-destroying catastrophe.