Friday, February 23, 2007

Utilitarian Suicide Pacts

I found Ran’s comment on my utility of life extension post sufficiently interesting to merit a post of its own. In response to my statement that someone maximizing the average utility of his life would commit suicide immediately if the expected utility of future years were lower than the average up to the present, Ran said:
I'm not so sure it makes sense for maximization of average utility to cover the past; it seems like the average-utilitarian's goal should be to maximize his average future utility.
That sounds sensible, but it also creates a problem of time inconsistency. Suppose you’re 30, and your life up to now has been “great.” You expect the next 20 years (up to age 50) to be “fantastic.” And you expect all years after that to be “pretty good.” Every year after age 50 will lower the average quality of future years. So from your current 30-year-old perspective, if you want to maximize the average quality of future years of life, you should plan to commit suicide at age 50. (Indeed, you should bind yourself to commit suicide at 50, if there’s any way to do so.) However, by the time you reach age 50, the intervening decades will have become the past and thus irrelevant to the goal of maximizing the quality of future years. You should therefore repudiate your plan to commit suicide.
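
To make the time inconsistency concrete, here is a quick sketch in Python with invented utility numbers (nothing in the argument turns on the particular values; say a "fantastic" year is worth 9 and a "pretty good" year is worth 6):

    # Illustrative numbers only: "fantastic" = 9, "pretty good" = 6.
    def avg_future_utility(current_age, death_age, utility_at_age):
        """Average utility over the years from current_age up to (but not including) death_age."""
        years = [utility_at_age(a) for a in range(current_age, death_age)]
        return sum(years) / len(years) if years else None

    # Hypothetical utility profile: ages 30-49 are "fantastic", age 50 and beyond "pretty good".
    profile = lambda age: 9 if age < 50 else 6

    print(avg_future_utility(30, 50, profile))   # 9.0 (plan: die at 50)
    print(avg_future_utility(30, 80, profile))   # 7.2 (plan: live to 80)
    print(avg_future_utility(50, 80, profile))   # 6.0 (viewed from age 50)

Viewed from age 30, dying at 50 maximizes the average of future years (9.0 beats 7.2); viewed from age 50, the fantastic decades are already in the past, and living on at "pretty good" (6.0) beats having no future years at all, so the 50-year-old repudiates the plan.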

This time inconsistency is, to me, further evidence that maximizing average utility of life is not the right goal. An analogous inconsistency occurs for average utilitarianism of the interpersonal variety, once we consider multiple generations. Right now, present and foreseeable future generations of humanity seem likely to have higher expected utility than the average of prior generations, since the mass of human beings throughout history has lived in conditions of squalor and deprivation. So average utilitarianism would recommend continued existence for humanity. But what if we reached a point in history at which humankind’s condition declined rapidly and seemed likely to continue that way (perhaps as in Battlestar Galactica)? In that case, an average utilitarianism that considered all lives, past and future, would recommend immediate mass suicide. An average utilitarianism that only looked to the future, on the other hand, could allow continued existence for humanity for a period of time – but if humanity’s condition were expected to drop even further at some point in the more distant future, it would make sense to adopt a time-delayed suicide pact.

6 comments:

Gil said...

Glen,

I agree with what you've written, but I suspect that the uncertainty involved in very-distant utility estimates will overwhelm the calculation.

So, even if you did want to maximize future expected average utility, you'd be foolish to take drastic actions now (like contracting with someone to kill you when you're 50) because of your long-term estimate.

You should probably try to do what will make the shorter-term, less uncertain, future as good as possible, while avoiding doing anything really stupid that would be likely to screw up your long-term future.

Gil said...

On the other hand, maybe I shouldn't be writing things like that, because my expected utility will go up if those who are certain of our impending horrific future (I'm looking at you, Al Gore!) would end their lives earlier.

I'm only kidding (mostly).

Ran said...

I think the time inconsistency is realistic; anyone who buys into the "live fast, die young, and leave a good-looking corpse" ethic is someone who thinks he still has some time to live fast before he proceeds to the part where he dies young. One could attribute the policy change to new information ("hey, being old isn't so bad!"), to cold feet, to maturity, or to a change in values (or to a combination of these), but I think one could equally well attribute it to a continual re-calculation of future utility.

(I think economists usually analyze sunk costs in terms of change in available information -- one wouldn't have accepted the cost if one had known that the expected benefits wouldn't materialize -- but I don't think that's necessarily the only interpretation in every case of policy change over time.)

Anonymous said...

This is the first time I've stumbled upon your blog, being directed here from the "Longevity Meme" site. That gives you a sense of where I'm coming from.

At any rate, the thing I think you're missing in the analogy is that life extension needn't be a zero sum game. In other words, the amount of "enjoyment" for a lifetime needn't asymptotically decrease. Technology will probably make it so that each successive year maintains quality of life, if not increasing it.

Glen Whitman said...

Anon -- you may be right. In fact, I hope you are! The purpose of my post wasn't really to consider the merits and demerits of life extension. It was to address an interesting philosophical question about whether to maximize the total or the average utility of your life -- and by extension, the total or the average utility of society. In the case you describe, where quality just keeps getting better, there is no conflict between the total and the average; living longer increases both. The interesting philosophical question emerges when you can only increase the total by decreasing the average, or vice versa.

Gil -- I agree, you might want to avoid commitments to suicide in order to preserve option value.

Seth Baum said...

Hi,

I found this blog Google searching for ("repugnant conclusion" nutrition). Yinz were #1. Congrats, I guess.

My quick thoughts on this topic:
Re: "squalor and deprivation". Word is hunter-gatherers had a 20-hour work week (link) I think we should be more careful evaluating the lives of our ancestors. They might not have been as bad as we think.

Re: Future generations will be better off: if they exist. At least some of the proposed existential risks are very plausible. See Bostrom/Oxford; Wikipedia

...I help run Felicifia, an online utilitarianism community. Yinz can post your own writing there if you'd like. This discussion here would fit in nicely. I'll try to keep an eye out here too.