
Comments


Matt,

While I do not agree with retributive punishment (as it's been defined in this forum), I don't see that your argument actually strikes at it.

If the convict truly and retributively deserves 15 years in actual prison, then they must deserve all that's embodied in that 15-year penalty -- the loss of (free) lifespan and all those foreclosed opportunities included. Thus the loss of lifespan would not be otiose, and criticisms of the loss of opportunity would be ill-founded.

If the retributivist truly believed that only the experience of 15 years away from society were deserved, and that the other things were simply unavoidable consequences, they would actually be satisfied with your afternoon-in-a-box, or even Saul Smilansky's funishment. Thus I don't see that you have shown retributivism to be inapt.


Matt, that's an interesting scenario.

I'm not sure whether the machine would be giving the criminal his just deserts (assuming 15 years of imprisonment is just), because there may well be some relevant differences.
For example, in the experience machine, does he interact with images that are essentially P-zombies?
If so, does he know that he's not interacting with real people?
If not, are those characters self-aware programs, who can also suffer?
I get the impression that it's P-zombies and he knows that (because that is publicly known in a scenario where there are such punishments), so I'll go with that, but please let me know if you have something else in mind.

On the question of whether adding the drug would make it a just desert, it seems to me that that would be excessive punishment - always assuming that 15 years of imprisonment were deserved.
On that note, we may consider the alternative without the experience machine: just the drug that takes away 15 years. Would that be less punishment than 15 years in prison, or more?
Assuming no prison rape, torture, etc. (which are not elements meant to be included in a punishment, on a retributive approach), it seems to me it would be more punishment to take away the 15 years without the prison time.

In fact, even though the loss of freedom is (again, on a retributive approach) a punishment, not all of his experiences in prison are meant to be negative. The intended punishment would be (as I understand it) the loss of freedom and its necessary consequences, but the person still gets to live those years, and can do some things he finds meaningful during them (i.e., I would disagree with point 2 in your argument, at least in the way the retribution is meant to work; in practice, the time in prison may well be worse).

Granted, the experience machine would give him 15 years of experience anyway. But it's not the same (though as I mentioned, I have limited info about the experience machine).

For example, suppose people with an expected remaining life of about 60 years were given the choice between living a normal life and spending 60 days in an experience machine that lets them experience not just 60 but even 300 years' worth of life, burns out their organs, and then kills them painlessly and instantly. I think most people would choose the normal life, and given what they value (including interaction with other people, not with P-zombies), that choice would be proper in my assessment.

So, it seems to me the drug + the machine would be too much. It's harder to address the machine alone, without further info, but it seems to me it would be very different from real experiences, because of the criminal's knowledge that he's not interacting with actual people - and hiding it from the criminal might be problematic for a number of reasons, but please let me know if you had that alternative in mind.

All that said, I don't find the "debt to society" metaphor very convincing, but I don't think that's problematic for retribution per se.

This is a provocative but really useful post. Thanks Matt.

It seems to me that an essential part of the adequacy of a concept of punishment concerns not just the role of the punished as somehow served or corrected, but also those to whom punishment otherwise matters -- victims and society at large. Short-cuts, even of severe punishment, don't engender confidence that the lasting memory of harms is apportioned a balance of justice across the lives of those harmed. Justice isn't merely a function of the retribution as perceived by the punished him or herself -- it is an important function of how victims and society at large perceive punishment to work. Punishment that does not serve the diachronic psychological and moral perspectives of victims and society will simply not work. Unless victims and society are included in the illusions of time-compressed punishment -- and there you have Big-time Big Brother -- it is not workable as a system of punishment. The temporal illusion of punishment is not workable unless all parties involved are subject to the illusion, and that is only workable in some dystopian society under some mantle of control devoted to this collective vision.

Alan and Matt:

With regard to the adequacy of a punishment, I think Alan gives good reasons why the alternative would not be workable in practice, but on the other hand, it seems to me that the objection does not work from a retributivist perspective.

On that note, it seems to me that the answer to the question of what punishment is deserved depends only on the mind of the perpetrator, not on how victims or other members of society feel.
For example, let's say Joe fires a few shots in the air to celebrate the New Year. One of the shots falls on Alice's head, killing her instantly.
On a distant planet (let's assume an infinite or sufficiently large universe), Joe2 is a counterpart of Joe who has the same state of mind as Joe has on Earth (same experiences, everything), and fires shots in exactly the same manner. But as it happens, the shots do not hurt anyone. The counterpart of the shot that killed Alice on Earth weirdly quantum tunnels away on Earth2, and Alice2 is not hurt and doesn't even know there was a bullet heading her way - no one does.

Reasons such as social peace, limited resources, limited knowledge of states of mind, etc., may justify passing laws that impose punishments at least sometimes based on harm actually done; but that aside, and from the point of view of what is deserved, it seems to me that Joe and Joe2 deserve exactly the same punishment.

Similar considerations apply to even weirder scenarios in which there is no society in the first place (e.g., the perpetrator is placed in an experience machine by some AI, but chooses freely anyway, since the AI only interferes with perceptions, not choices).

So, as I see it, actual differences in outcome do not affect what is deserved - of course, others may support retributive punishment but disagree with my assessment regarding what punishment is actually deserved, so perhaps Alan's objection may work from a perspective like that.

Thanks to all who have so far responded.

I should have been clearer upfront that my aim in the post is to scrutinize the metaphor of "paying one's debt to society". It is certainly a familiar phrase, in ordinary practice if not in moral theory.

There may be a number of reasons to reject the metaphor. (For instance, one might think that since punishment is received by the one punished, they couldn't thereby pay anything back.) So it may be that I'm shooting fish in a barrel, as it were.

But I didn't take myself to be critiquing retributivism directly. One could always hold, as Mark does, that if actual punishment in real time is what the guilty deserve, then that is what retributivism requires giving them (at least on some versions of retributivism).

Still, I thought the thought experiment useful for at least trying to articulate what it is the guilty deserve if they deserve punishment. Is it the suffering? The experience of imprisonment? Actual losses of a different kind? I'd certainly be interested in thoughts along those lines.

I think addressing this question is relevant more broadly, since it isn't implausible to think that punishment and blame are at least related.

Alan,

That's interesting. But I'm not sure which of two claims you're making. The first is that what the guilty deserve is partially a matter of our attitudes or experiences regarding their treatment. The second is that an adequately justified punishment scheme requires more than just giving the guilty their due; it also requires engaging with victims and co-citizens in various ways.

Of course, you might have in mind a third kind of predictive claim, which is that society would only buy into a scheme of punishment that was temporally extended in a way the experience machine proposal is not. But I'm not sure that's right. It might take time to come to terms with the scheme or adjust our attitudes, but I'm not sure it would be impossible.

Matt,

Don't any forms of retributivism allow society to specify - within some range - the exact form which punishment should take?

Angra,

Just allow multiple prisoners to serve time at the same time and interact with each other through the experience machine interface. A guard might also be paid to use the experience machine and interact with them. So, no P-zombies required. But on the *other* other hand, if experience machines allow interaction and also a massive speed-up, then everyone's going to have one (at least at the workplace; in their off-time, people might prefer some contact with ground-level reality).

Matt,

With regard to what exactly the guilty deserve, my impression is that they deserve a certain loss of freedom (though it's hard to tell how much), and that there are alternative punishments also deserved (i.e., I think there isn't a single punishment X such that agent A deserves X and all other punishments would be unjust; it's more like A deserves either X1, or X2, ..., or Xn). So even before there were any prisons, proper punishment could be imposed on, say, a serial killer, a rapist, etc.

With regard to punishment and blame, it seems to me that the punishment that is deserved depends on the degree of immorality of the behavior, which goes hand in hand with the amount of blame the perpetrator deserves.
However, I think that when it comes to the question of which punishment a person should support (e.g., whether a lawmaker should vote for a specific proposed bill establishing a certain punishment), there are also many other issues at play (i.e., other than whether the punishment is deserved), since plausibly there are plenty of immoral behaviors not warranting state intervention even though they warrant some social punishment, and also a lot of variables like availability of resources, other consequences of supporting a specific law in a specific case (even if the punishment is deserved), etc.


Paul,

If the experience is fully realistic - and it seems to me that's how it's supposed to work - and interaction is also possible, I'm not sure most people would want any contact with ground-level reality. In fact, it seems to me the alternative would be preferable for most.

Interaction with ground-level reality could be implemented by means of robots, and that would give humans the capability to operate very quickly at that level. For example, this sort of machine (regardless of whether it allows for humans interacting with each other) would allow a human brain to experience in an hour what it would normally experience in a year. That means science, math, philosophy, etc., would advance extremely fast.
If it also allows human interaction, that seems to be more or less similar to extending human lifespan by a factor of 24*365=8760 (not counting leap years, but that makes only a slight difference), at least in terms of experience.
Also, in that case, it seems to me the question of whether 15 hours in the machine is an adequate punishment is relevantly similar to the question of whether 15 years in prison would remain an adequate punishment (assuming it is now) if people were to live on average 8760 times longer than they do now.
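To make the scaling explicit, here is a minimal worked restatement of the arithmetic above, on the assumption that one clock hour in the machine yields one subjective year of experience (leap years ignored, as noted):

\[
\text{speed-up} \;=\; \frac{1\ \text{subjective year}}{1\ \text{clock hour}} \;=\; \frac{24 \times 365\ \text{hours}}{1\ \text{hour}} \;=\; 8760
\]
\[
15\ \text{clock hours in the machine} \;\approx\; 15\ \text{subjective years}, \qquad 60\ \text{remaining years} \times 8760 \;=\; 525{,}600\ \text{subjective years}
\]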

(I don't think human brain processing can be nearly as fast in our universe, but it's a thought experiment, so I think potential nomological impossibility is not a problem.)

Angra,

Yep, that's basically where I wanted to go with it. I still think some people wouldn't want robots to do all their direct experiencing for them, but that's a pretty minor worry.

Thanks to everyone who contributed to this thread. I found it helpful. I don't have much in the way of specific comments, and am moving on to new territory. Hope others found the discussion interesting as well!

