Rescuing the miners and the babies

by Ingrid Robeyns on August 29, 2014

On Monday I was having dinner with Robin Celikates and a bunch of PhD students who were attending a summer school on Dirty Hands and Moral Dilemmas this week. Someone came up with the following case (none of us was quite sure about the author, but Derek Parfit seems a likely candidate):

Case A: Rescuing the miners:
Imagine 100 miners who are stuck in a mine. They are divided into two groups. You can rescue 50 with certainty, but then the other 50 will be lost (strategy 1). Or you can try a different rescue strategy, which may potentially save all of them, but only with 50% probability; there’s another 50% chance that all will die (strategy 2). Which strategy would you choose?

The people around the table had conflicting views, and the reasons we believed we had for a given view did not convince the others at all. My choice was strategy 2, since it gives everyone an equal chance of being rescued, and thus treats the miners as moral equals in a certain sense. But Robin said that the miners themselves would choose strategy 1, since they have a strong collective ethos/identity which includes the principle that you save whom you can save. He claimed that we can infer this empirical claim from accidents in which miners were actually trapped in a mine. (This is my recollection of the discussion, but Robin is very welcome to correct me!)

In the case of the miners we are dealing with adults, and respecting their agency could plausibly be taken to override other reasons for choosing a certain strategy. But what if agency didn’t play a role? We could change the example by turning the people-to-be-rescued into babies, who are too small to have anything resembling group identity or agency:

Case B: Rescuing the babies:
Suppose 100 babies are stuck in a mega-crèche which is on fire. There are two floors, with 50 babies on each floor. There are two rescue strategies. Under strategy 1, you can rescue 50 babies for sure, but the other 50 will die. Alternatively, you can try another strategy in which all 100 babies have a 50% chance of being rescued (strategy 2).
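For concreteness: on the all-or-nothing reading of strategy 2 (the reading case A makes explicit), the two strategies save the same number in expectation and differ only in spread. A minimal simulation sketch, assuming the stated probabilities are exact:

```python
import random

def strategy_1() -> int:
    # 50 rescued with certainty; the other 50 are lost.
    return 50

def strategy_2() -> int:
    # A 50% chance of rescuing all 100, otherwise all are lost.
    return 100 if random.random() < 0.5 else 0

trials = 100_000
outcomes = [strategy_2() for _ in range(trials)]
print(sum(outcomes) / trials)        # ~50: same expectation as strategy 1
print(min(outcomes), max(outcomes))  # 0 100: but maximal spread
```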

Which strategy do you choose, and why? And if you choose differently in case A and case B, then why so?

{ 109 comments }

1

Steven Hope 08.29.14 at 11:18 pm

I’d ask the miners. If they chose strategy 1, I’d ask them to randomly assign themselves to the certain-rescue 50 (or to decide collectively on some other basis).

If they were babies, I’d choose strategy 2.

2

A H 08.29.14 at 11:31 pm

Risk aversion says you should choose strategy 1 both times. Same expected outcome, less risk.

What about a choice between saving 10 children for sure, or saving all 100 babies 10% of the time and losing them all otherwise?

3

Collin Street 08.29.14 at 11:38 pm

Scenario 1, because I think it holds up better once you allow for the probability assessments potentially being off.

4

sPh 08.29.14 at 11:38 pm

You are in charge of making the decision. You do so, and bring out 50 babies. Do you tell the parents which strategy you used?

[the other option is to run down to the mine railroad and throw the switch sending the 775,000 lb steam locomotive smashing into the side of the mountain, killing the obese engineer (driver) and brakeman but opening the path to the 100 babies]

5

Vladimir 08.29.14 at 11:38 pm

I agree with comment 2: choose the solution that guarantees saved lives, especially when the probable outcome of either option is the same. As for Parfit, a simple Google search shows that his example is quite different. He gives us the option of flooding shaft A or B, which would either kill or save all the miners depending on which shaft they are trapped in. Alternately, we could flood both shafts and save 90.

6

Priest 08.29.14 at 11:48 pm

50 babies have been rescued. The commander of the Nazi occupation forces will allow you to return them safely to their homes if you abandon the other 50 to the flames. Or you can take a 50% chance to rescue the others, but if you fail, troops will execute the 50 babies. What do you do?

7

MPAVictoria 08.30.14 at 12:03 am

I would (or I think I would. You probably never know until you are in the situation for real) gamble and pick strategy 2. Everything might just work out…

8

MPAVictoria 08.30.14 at 12:10 am

I would probably even pick strategy 2 if the percent chance of success was only 33%…

/ I should probably not be put in charge of emergency responses….

9

David of Yreka 08.30.14 at 12:12 am

Claim that the probabilities are not well enough established, and that opinions differ on whether there are any miners down there at all; continue to do so until the miners are all dead, then wring your hands and solemnly decry someone else’s failure to provide timely and accurate information.

Or did I miss something?

10

Thornton Hall 08.30.14 at 12:21 am

The trolley problem has led me to give up on the search for a 100% internally consistent moral theory. John Dewey looks better every day.

11

Thornton Hall 08.30.14 at 12:23 am

From a human perspective in this situation, by far the most important thing is to pick one strategy and go all out.

12

Medrawt 08.30.14 at 12:39 am

I agree with Thornton Hall, but as part of my distrust of the mechanics of the trolley problem stories, I’ll volunteer that the claim of “100% probability” in saving 50 people suggests to me that they are in a circumstance where some of the miners/babies are in a less immediately dangerous circumstance than the others, and taking strategy 1 involves intentionally choosing to consign a specific group to death in order to save the others (rather than it being unpredictable which 50 are saved). In that circumstance I would choose strategy 1, at the last feasible moment, which is what they do in movies when a ship’s hull has been breached but some of the crew can be saved by sealing off compartments where other crewmembers are still alive.

13

CP Norris 08.30.14 at 1:11 am

In disaster fiction, somebody always suggests Strategy 1, and then the hero says “No, we can’t give up on the others! We have to take the chance!” So they go with Strategy 2.

You could call that the heroism bias. I suspect it balances out the risk aversion.

14

Lindsay 08.30.14 at 1:30 am

From a probability standpoint, both scenarios are equal.

For any given baby, the odds are 50% under either scenario.

This question is more about one’s appetite for risk. The risk-averse take the certain outcome; the risk-takers, the second.

15

Rajiv Sethi 08.30.14 at 1:39 am

Like you, I’d choose strategy 2 in both cases, and for much the same reason. Moral equality trumps risk aversion. But ask yourself this. What if strategy 2 had only a 45% chance of success? Or 20%? At some point presumably you would switch to strategy 1. But at what point and why? I think the answer reveals something about the subjective weight given to equity relative to efficiency.
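One way to make the switch-point concrete: suppose, purely for illustration, that equity enters as a fixed additive bonus e measured in expected lives (the additive form and the name e are assumptions, not Sethi’s). A sketch:

```python
# If an option is worth (expected survivors) + e whenever everyone gets
# the same chance, indifference between the strategies requires
# p * 100 + e = 50, so the switch comes at p = (50 - e) / 100.
def switch_probability(e: float, n: int = 100, sure: int = 50) -> float:
    return (sure - e) / n

for e in (0, 5, 10, 20):
    print(e, switch_probability(e))  # 0.5, 0.45, 0.4, 0.3
```

The heavier the equity bonus, the lower the success probability at which one still prefers the fair gamble.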

16

Bernard Yomtov 08.30.14 at 1:40 am

A H @2

Risk aversion? Are you saying that saving 100 miners/babies is less than twice as good as saving 50?

17

Kevin 08.30.14 at 1:48 am

I’ll take risk aversion over CP Norris’s heroism bias any day. The difference between certainty and uncertainty has a stronger psychological pull on me than the difference between two numbers. Another interesting question: say it’s certainty of ‘x’ lives, or 50/50 odds on 100 lives? I would probably switch to the 50/50 odds around 40 lives.
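Kevin’s indifference point pins down a degree of risk aversion. A sketch assuming power utility u(n) = n**a (the functional form is an assumption, not Kevin’s):

```python
import math

# Indifference between 40 lives for certain and a 50/50 shot at 100
# means 40**a == 0.5 * 100**a, i.e. (40/100)**a == 0.5.
a = math.log(0.5) / math.log(40 / 100)
print(a)                        # ~0.756: a < 1, mild risk aversion
print(40 ** a, 0.5 * 100 ** a)  # both ~16.2, confirming indifference
```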

18

Adam 08.30.14 at 2:01 am

Losing all 100 babies is more than twice as bad as losing “only” 50. I believe that the argument from utility is for strategy 1.

However, I am somewhat of a risk taker (and not a devout utilitarian) and would probably take strategy 2 – depending on many variables, especially my internal “gut” feelings about the 50% chance estimate. I also should probably not be in charge of such decisions.

What if 25 saved and 75 lost is strategy 1,
and a 25% chance of saving all 100 is strategy 2?

19

urple 08.30.14 at 2:09 am

I don’t think there’s a realistic scenario where we’re so certain about the probabilities involved. So in the real world, we wouldn’t perceive this as choosing one over the other of two probabilistically equal strategies. Instead, we’d judge one strategy (rightly or wrongly) as a better or worse bet, based on the facts on the ground, and then go with it.

20

JimV 08.30.14 at 2:36 am

As previous comments have mentioned, the details which determine the probabilities must be specified so we nitpickers can be sure of the exact probabilities (maybe an evil villain offers a coin flip) before we pick a case. Assuming the probabilities are in fact exactly as stated (which is unlikely to happen), I would pick case 2, as in, “Do ya feel lucky, punk?” – yes, I do, and if I ever deserved a break it’s now, and if I’m wrong at least it’s more evidence for the faith-heads that there is no god for whom humans amount to a hill of beans in this crazy world.

21

Alan White 08.30.14 at 2:48 am

urple: Think of the scenarios as dispensed by an evil manipulator who guarantees the odds; then defer the question of ultimate responsibility and focus on what sub-moral strategy is available to us.

If we weigh a certain expected value for some individuals against an uncertain one for all, A wins, because a valued outcome for some is certain while the outcome for all is not.

If we invoke equal value for all involved favoring no ensemble for expected value, the meta-strategy says flip a coin, heads for A and tails for B.

BTW I’m just spit-balling here.

22

joebco 08.30.14 at 3:00 am

How do you know that the probability is 50% in strategy 2? I would rather have the problem stated more realistically as: “We have a second strategy. Either everyone lives or everyone dies, but we really don’t know the probability because we’ve never done this before, at least not under these circumstances. For what it’s worth, Alice says it is probably going to work, but Bob says the chances are slim to none.” So the relevant question is not “At what percentage, if any, would you switch strategies?” but “What evidence for the possible success of strategy 2 would cause you to consider it over strategy 1?”

OTOH, a problem that I read about today may be close to an actual instance of the problem as stated: whether or not to dilute the doses of the ZMapp Ebola treatment to treat more people, but with less certain efficacy. The efficacy is at least theoretically knowable to a reasonable precision, but is of course not known right now.

23

ZM 08.30.14 at 3:01 am

Case A is a good example of why safety standards in mines need to be strongly regulated and inspections carried out routinely. Is there really a way of calculating the probability of rescue in mine collapse incidents? What would the confidence be?

The choices for cases A and B are different. A you can supposedly either 1. rescue 50% or 2. have a 50% chance 100% will either live or die. B you can supposedly 1. Rescue 50% or 2. Give all a 50% chance of living or dying.

Lots of people on the climate change thread preferred 450ppm co2e over trying to return to 350ppm, but the probabilities of 450ppm stabilizing the climate at 2 degrees are not reassuring:

“The temperature increase: Analysis for the 2006 Stern report (p. 195) shows that, taking uncertainty about climate sensitivity into account, a 450ppm CO2e target has:
• A 26–78% probability of exceeding 2 degrees Celsius (˚C) relative to pre-industrial
• A 4–50% probability of exceeding 3˚C
• A 0–34% probability of exceeding 4˚C
• A 0–21% probability of exceeding 5˚C”

Current irresponsible practices look set to exceed 450ppm. This is a moral dilemma/hand-staining situation we are all taking part in now.

24

Ted Lemon 08.30.14 at 3:15 am

Scenarios like this are generally used to justify moral absurdities. They presuppose an omniscience that doesn’t exist: that you actually know you are making a choice like this. We never do. If we had that kind of ability to know, we would have known that the situation was going to happen before it did, and prevented it. The only purpose of such scenarios is to reinforce our erroneous belief that we have some kind of control over the future. We do not. The cost is that it is this very type of reasoning that routinely gets us into wars, which never seem to turn out the way we’d planned.

An ethical system that relies on the ability to be omniscient is like a mathematics that relies on the ability to get a sensible result when dividing by zero.

25

Cian 08.30.14 at 3:19 am

Look strangely at the person who figured that you could accurately gauge the success of strategy 2, decide that they are clearly insane, and go for strategy 1.

Do I win?

Alternately, decide that because I will be attacked in the media if I choose strategy 1, strategy 2 is really the only viable option if I wish to continue to have a career/feed my children. “We tried and failed” is politically savvier than “we chose to let them die.”

On RN ships it’s always strategy 1. I imagine most politicians choose strategy 2.

26

Jason 08.30.14 at 3:19 am

I am with urple — there is no way to know the probabilities with that kind of certainty, so all kinds of other issues enter (e.g. framing effects).

Heck, even defining the probabilities is problematic — if you adhere to a Bayesian view of probability, scenario 2 means your belief in the outcome is completely uncertain (you have no idea if everyone will live or die, giving your prior probability of each as 50%). Basically a Bayesian is saying “either we save 50 or we try something where we have no idea if the miners (babies) will live or die”.

A frequentist view of probability sees the 50% as what happens in multiple rolls of the dice, in which case the two scenarios are equal (both save on average 50 people). That means something else besides logic makes you decide between the two scenarios.

I like the heroism bias of CP Norris above, and what I see happening there is a frequentist view (doesn’t matter) followed by the Bayesian view (we don’t really know with scenario 2, so maybe we’ll get lucky).

@Alan White: in a sense that puts the moral onus on the evil manipulator and changes the ethics of the problem.
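Jason’s two readings can be simulated. For a one-shot gamble they coincide in the marginal chance of success; the flat prior just adds a layer of uncertainty about the probability itself. A sketch, assuming a uniform prior for the Bayesian reading:

```python
import random

def one_shot(bayesian: bool) -> bool:
    # Frequentist reading: the success probability is exactly 0.5.
    # Bayesian flat-prior reading: the probability itself is unknown,
    # uniform on [0, 1]; its mean is 0.5 but it carries no information.
    p = random.random() if bayesian else 0.5
    return random.random() < p

trials = 100_000
for label, bayesian in (("frequentist", False), ("flat prior", True)):
    successes = sum(one_shot(bayesian) for _ in range(trials))
    print(label, successes / trials)  # both ~0.5 for a single gamble
```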

27

Steko 08.30.14 at 3:24 am

Why limit the scenario to 100 babies?

Chance to save 200 babies = 25%

Chance to save 5,000 babies = 1%
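Presumably the point is that each variant holds the expected number saved fixed at 50 while widening the spread; a quick check:

```python
# Expected number saved is the same in every variant:
for n, p in ((100, 0.50), (200, 0.25), (5000, 0.01)):
    print(n, p, n * p)  # 50.0 in each case
```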

28

Alan White 08.30.14 at 3:29 am

Jason: my post meant to bracket the problem, putting concerns about whether a 50% probability is realistic aside by positing that the manipulator guarantees it, nothing more. What’s left is on us.

29

Steko 08.30.14 at 3:29 am

Stuck on a faraway space colony with 5000 sick babies and a limited amount of medicine. We can split the dosage but are reliably informed that the chance of survival is perfectly linear in the dosage.
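If survival really is linear in dose, splitting the medicine leaves the expected number of survivors unchanged and only narrows the spread. A sketch (the function name and numbers are hypothetical):

```python
def expected_survivors(doses: float, babies: int, split: bool) -> float:
    if split:
        # Each baby gets doses/babies of a dose; survival chance is linear.
        return babies * min(doses / babies, 1.0)
    # Or treat one baby per full dose, each surviving with certainty.
    return min(doses, babies)

print(expected_survivors(2500, 5000, split=True))   # 2500.0
print(expected_survivors(2500, 5000, split=False))  # 2500
```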

30

derrida derider 08.30.14 at 3:33 am

Like the trolley problem, I think these sorts of exercises give us little insight into what we should actually do (still less what we WOULD actually do) simply because we live in a world where absolutely nothing is mathematically certain (even, as once famously pointed out, that the sun will rise tomorrow). Which is basically another way of stating Ted Lemon’s point @24 – we are always uncertain because we are never omniscient.

Of course in this world even estimates of probability are almost always uncertain and known by all to be uncertain – that’s why we do Bayesian inference.

31

Dean Eckles 08.30.14 at 3:54 am

This seems like a question about social welfare functions. Neither policy stochastically dominates the other. Note that answers could very well change if these 100 people are the only people in your society (i.e. you may want someone to live).

32

David 08.30.14 at 3:55 am

Reminds me of a report on NPR today, which mentions Ukrainian forces in Mariupol dividing themselves into two groups, one of which will fight to the death so that the other can get away.

33

Bruce Baugh 08.30.14 at 4:18 am

Sources of Power is a really fascinating book studying how people actually make decisions. One of the factors in successful crisis decision-making is, according to Gary Klein’s research, the ability to construct an internal story of how things need to and can go that leads to a successful outcome. Experienced people can assemble theirs with more reference to prior attempts, both successful and un-, and use this to skip distracting clutter. (At the risk, of course, of overlooking distinctive features that are highly relevant even though they lack precedent, but spotting that is also part of experience.)

I really find this kind of study more interesting and more useful.

34

cassander 08.30.14 at 4:21 am

Seems to me that the obvious answer, if it’s possible, is to defer to the wishes of the miners or the parents of the babies. To me, the more interesting question is how one would go about resolving differing opinions about the correct course among the miners/parents in question. For example, should the members of the “at risk” group be given more or less weight than those in the “safe” group?

35

Bruce Baugh 08.30.14 at 4:25 am

Whoops, knew I left a bit out: an important element of Klein’s work is emphasizing how much of the chances of successfully resolving a crisis is a property of the rescuers’ insight and judgment, rather than of a static fixed objective probability.

I can play a couple of musical instruments, a bit, but any instrument I take up to play will sound better in the hands of a skilled musician. They don’t magically make the instrument better; they simply take advantage of its existing features better. Likewise, if there’s a rescue situation where I look at it and can’t see better than 50/50 odds, my smart and moral move is probably to turn it over to someone who can see better odds thanks to stuff I’ve missed.

36

Donald Johnson 08.30.14 at 5:03 am

I’ve always worked really hard to avoid situations where I might face these exquisitely balanced and seemingly artificial moral dilemmas. So far it seems to be working.

37

bad Jim 08.30.14 at 5:10 am

In case B, I’d wonder whether the roasted babies would be edible; if so, that would be an argument for strategy 1. Along with fifty saved there would be dinner.

38

bad Jim 08.30.14 at 5:36 am

I like Donald Johnson’s comment better than mine, and in fact I’ve dealt with moral conundrums mostly by avoiding them.

We used to have a local opera company. After prolonged applause the audience would depart en masse to the adjoining parking structure. The eldest tended to arrive earliest and park nearest the exits. The younger typically arrived later and parked on the higher levels, and were quicker to reach their cars. The situation was like the rivers in Siberia which run north to the Arctic, flooding in the spring when their sources thaw before their outlets and in the fall when the outlets freeze before their sources.

Making my way through the ensuing deadlock entailed endless repetitions of a dilemma: should I be kind to a few olds, who have just reached their cars and have not had to wait, or to the hundreds of cars behind me? A straightforward utilitarian calculation would say that those late to their cars should have to endure what might seem to be an interminable delay. People tended to let them go ahead, though, to the detriment of the majority.

I chose to make my mother walk a short block (it was good for her!) and park where such problems didn’t arise.

39

Kien 08.30.14 at 5:36 am

Hi, I expect that in strategy 1 it is not possible to tell in advance who will be rescued. So from the perspective of the miners, they each have an equal chance of survival under either strategy. I vacillated between the two strategies but would ultimately select strategy 2 under the “3 Musketeers principle” – all for one and one for all. If we survive, let’s all survive. If we perish, let’s all perish. So I disagree with Robin.

40

js. 08.30.14 at 5:51 am

I expect that at least a small minority will be at least mildly offended by the following, but when this sort of thing comes up, I find it impossible to resist quoting this:

[I]f you want to corrupt people by direct propagation of ideas, moral earnestness is pretty well indispensable. […]* A third point of method I would recommend to the corrupter would be this: concentrate on examples which are either banal: you have promised to return a book, but … and so on; or fantastic: what you ought to do if you had to move forward and stepping with your right foot meant killing twenty-five fine young men while stepping with your left foot would kill fifty drooling old ones. (Obviously the right thing to do would be to jump and polish off the lot.)

Elizabeth Anscombe, “Does Oxford Moral Philosophy Corrupt the Youth,” Human Life, Action and Ethics, pp. 162–63.

Obviously, the last sentence is the offensive one, but for what it’s worth, I think Anscombe’s thought is deeply right.

*The bracketed ellipsis marks my own elision (ed.); the later ellipsis is in the original text.

41

Chris Bertram 08.30.14 at 6:34 am

The way the example is set up, we have an assumption of the equal expected goodness of outcomes with certainty of outcome weighing on one side and equality (the equal right of each person to be rescued) on the other. One might think about whether each of these two “overbalancing” considerations itself contributes to the goodness of the outcome. This doesn’t look plausible for certainty, which seems to have at best instrumental value, whereas a more equal treatment of the victims might be good in itself. If it is an additional factor contributing to the good of the outcome then we might be required to save *fewer* expected people whilst giving everyone an equal chance of rescue. I’m not sure what the tradeoff should be, but if we inflate the numbers, saving a million people with certainty (out of a population of 2 million) might be morally worse than pursuing a strategy giving everyone an equal chance of being rescued but where the expected number rescued is only 999,999.
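A sketch of that trade-off, assuming purely for illustration that equal chances enter as a flat bonus E measured in expected lives (the additive form is not Bertram’s claim):

```python
def option_value(expected_saved: float, equal_chances: bool, E: float) -> float:
    # Expected number saved, plus a bonus when everyone gets the same chance.
    return expected_saved + (E if equal_chances else 0.0)

E = 2.0  # hypothetical: equal chances "worth" two expected lives
print(option_value(1_000_000, False, E))  # certain rescue of half: 1000000.0
print(option_value(999_999, True, E))     # fair lottery, lower expectation: 1000001.0
```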

42

John Holbo 08.30.14 at 7:14 am

My kneejerk reaction is to try to save them all. That approach has a whiff of Kantian-Hollywood nobility and all-around egalitarianism. All or nothing! Kant AND Hollywood can’t be wrong. Right?

43

evren 08.30.14 at 7:21 am

From a moral point of view I don’t see any difference between the cases. People can think illogically in dire situations, so the miners’ decision about sacrificing themselves should not change the action we are going to take.

No one would want to make a choice in those situations, but I think it is about what kind of world we want to live in. Do we want to live in a world where we can effectively kill 50 people to save 50 people, or in a world where you can be sure that everything in our power will be used to save all of them? I’d choose the latter.

44

Moz in Oz 08.30.14 at 7:24 am

The presumption that the miners/babies/rescuers are all identical, interchangeable units of value is perhaps better suited to an economics discussion than a philosophy one. Likewise the burbling about probability, certainty and accuracy.

Take Pike River Mine, for example. There one question was: do we send rescuers to their likely deaths in an effort to find out whether there are any miners still alive after the explosion? After the second explosion the question became even more acute.

So in real life the philosophy questions very rarely come up, and when they do they’re very much more personal. Do I, the mine safety manager, drive out to the mine when I hear the first report? Or should I ring my lawyer first? When I get to the mine, should I attempt to take charge? Should I volunteer to go in to investigate?

Similar situations occur not just for those immediately involved, but for anyone who hears about the events. Should I send money? Well-wishes? Put a post on Facebook saying “20 ferals die in mine, more to follow with any luck”?

45

Moz in Oz 08.30.14 at 7:30 am

FWIW, I very strongly adhere to the principles of first responders:

1: ensure your own safety
2: ensure the safety of those immediately around you
3: attempt to reduce the risk of the problem spreading
4: attempt to solve the problem

IN THAT ORDER.

I almost failed my driver’s license back in the day when there were oral questions, because one was a scenario where I found a crash site and “what would you do”. My answer was “turn on my hazard lights and reverse away from the crash”. Apparently I was supposed to rush in, open the driver’s door and kill everyone involved when the cabin light set off the petrol fumes resulting in a Hollywood-style explosion.

46

Moz in Oz 08.30.14 at 7:39 am

Sorry for the repeated posts, but I thought it worth pointing out that a ridiculous number of drownings occur when some idiot jumps into the water to save someone who is in distress. This study from Turkey found: “In this 4-year period, 88 “rescuer” drowning incidents occurred in which 114 “rescuers” and 60 Primary Drowning Victims died from drowning in Multiple Drowning Incidents; 114 drowned “rescuers” rescued 47 victims before they died from drowning.”

The statistics for other situations are not as bad, but it’s still distressingly common to hear of cases where (for example) someone stops at a car crash and is run down by a passing car that didn’t see the crash or the “rescuer”. Or simply runs into the back of what is now a multi-vehicle pile-up, crushing people between the cars in front.

In the above thought experiment, how many rescuers are required and what are their chances of survival? What risk is there that the “mine collapse” will spread and put other people at risk? If they are evacuated, can the proposed rescue still take place or is it evacuation instead of rescue? Are the rescuers volunteers, or are they employed in roles that make it difficult or impossible for them not to “volunteer” for dangerous rescue roles?

47

actio 08.30.14 at 8:09 am

In realistic versions of such scenarios both rescue strategies are likely very costly. In almost every such case the true moral answer is the always-available strategy 3: abandon the costly rescue, let all 100 children die, and use the funds for medical interventions that save many more children elsewhere through vaccination programs or distribution of anti-malaria-treated sleeping nets. But people are unlikely to choose 3 and often get angry even at the suggestion of 3. Closeness bias, loss bias, in-group bias, …

48

Dr. S 08.30.14 at 10:40 am

I’ve thought there must be a class of questions I would call meaningless hypotheticals. My inspiration is the question we must have all asked ourselves (?), “What if I stood up in the middle of this theater and yelled….” (or whatever outrageous action you choose). There is no answer, because there is no question.

Do you philosopher folks have such a category?

49

Brett Bellmore 08.30.14 at 11:46 am

“[I]f you want to corrupt people by direct propagation of ideas, moral earnestness is pretty well indispensable. […]* ”

Pretty much my view of trolley problems: Little more than an exercise in learning how to construct situations where you’d have an excuse to kill somebody. Nobody who’s practiced in them should be laying out trolley lines. Metaphorically speaking…

50

Karsten 08.30.14 at 12:07 pm

Option 2 is unequivocally better. The expected number of survivors is 50 in both cases, but the survivors are much happier in the second case, so utility is far greater. Neither having lost half their mates nor being fraught with survivor’s guilt, they will happily return to their lives in no time at all. The survivors in option 1, not so much. This is at the root of the three musketeers ethic. Who wants to live without his mates?

51

Karsten 08.30.14 at 12:11 pm

I’m assuming that 100 is a small fraction of the total population. As others have pointed out, you don’t want to gamble with the future of the species. If it’s the last 100, you save 50 who will be deeply traumatized, but as a species we’ll get over it.

52

Ted Lemon 08.30.14 at 12:19 pm

Of course, if we want to be real, most mine disasters are, in fact, predictable, and so discussions of this sort are actually excellent propaganda tactics that help to draw peoples’ attention away from that fact. (I am sure the author of this post did not intend this as a propaganda tactic; I’m just saying that we should be careful not to play into the hands of propagandists with our philosophical inquiries, and hence should categorically reject questions of this sort.)

53

Lynne 08.30.14 at 12:43 pm

I’d choose to try to save them all. I think I could live with myself better than if I deliberately let 50 people die.

54

NomadUK 08.30.14 at 12:53 pm

Since none of them are foetuses, our moral obligation to protect them is dispensed with; let God sort them out.

55

Akshay 08.30.14 at 2:31 pm

Agree with Moz’s rule-utilitarian approach @45. Tried and tested, and robust when thinking meta-probabilistically; reducing the problem step-by-step by saving individuals is “likely” to make you do better than you expected, whereas attempts to play the hero by tackling a big problem will likely make you do worse than expected in your probability model.

I also think there is a strong argument that for many types of communities, losing everybody will be more than twice as bad as losing half.

Strategy 1 it is. (There might be a threshold of people saved which is so low that you might as well try strategy 2, but for me this would be low, perhaps <<25…)

56

MPAVictoria 08.30.14 at 2:37 pm

“Since none of them are foetuses, our moral obligation to protect them is dispensed with; let God sort them out.”

Ha!

57

Layman 08.30.14 at 2:59 pm

In practice I think we nearly always choose strategy 1, though in fairness we may not be conscious of the choice.

Consider a burning building with a number of people trapped inside. Brave fire fighters go in to rescue them. They come upon one person. What do they do? They immediately detail some labor to carry that person out, even though so doing means reducing the effort to find more people. They’re placing a higher value on the life they know they can save than on those they can only speculate about, and this decision may mean that those not found will die.

Other examples present themselves. Hostage negotiators trade value for individual hostages, opting for a sure rescue while perhaps dooming the rest. Air-sea rescue helicopters stop their search for more survivors in order to rescue the ones they’ve found.

58

Bill Harshaw 08.30.14 at 3:57 pm

Tim Armstrong has written about cannibalism among sailors surviving shipwrecks, going back to the biblical story of Jonah. Apparently there’s a real-life choice between sacrificing the weakest and drawing lots. http://www.amazon.com/Slavery-Cambridge-Studies-American-Literature-ebook/dp/B008FOSL5E/ref=la_B001H6KLU0_1_2?s=books&ie=UTF8&qid=1409414051&sr=1-2

59

Chris Armstrong 08.30.14 at 5:29 pm

I agree with Ingrid: we should pick option 2. And I agree with Chris Bertram’s analysis of why this is so: the outcomes are on average equally good, but we have also safeguarded, in a sense, the value of equality. The complication Chris raises is in one sense right: if we think that equality has some value, we should be prepared to go for option B even if there is a 50% chance of saving somewhat fewer than 100: say, 99 or 95 or whatever. On the other hand, somewhat paradoxically, we then lose the value of equality which drove us to accept a lower number in the first place, because we are denying some people a chance of being saved. So I think that the egalitarian would not compromise on the numbers saved, but rather on the percentage chances: he or she would accept a *somewhat* less than 50% chance of saving everyone. In sum, I think we trade off in the direction of percentage chances, Chris, not percentage saved.

Here’s an interesting variant, Ingrid: we have a lifeboat which we know can save 2 people, and which *might* save 4, but which might just as likely sink and not save anyone. Do we let the last 2 people in or not?

60

Jamie Dreier 08.30.14 at 7:34 pm

Ingrid, this is (in large part) what John Broome’s Weighing Goods is about.
It’s interesting that some commenters have posed the problem as a choice between equality and risk aversion. One of Broome’s points is that Harsanyi’s Theorem shows a surprising connection between attitudes toward risk and attitudes toward equality — a logical connection, that is, not a psychological one.

61

Zamfir 08.30.14 at 9:27 pm

A baby in hand is worth two in the mine.

62

Lord 08.30.14 at 9:28 pm

I agree reality is more similar to the first, though the probabilities are highly uncertain. But we are discussing hypotheticals, so that should not deter us from considering the second. Still, the chance of ever facing such a situation is so fantastical that I doubt anything useful can ever be learned from considering it. Angels… pins…

63

Thornton Hall 08.30.14 at 9:47 pm

@Holbo. When Kant and Hollywood agree only a moral monster goes along!!!

64

Jim 08.30.14 at 10:04 pm

Human beings exist as parts of a network, and the value of a network increases superlinearly with size. So the expected value of 100 saved humans should actually be more than double the expected value of 50, shouldn’t it? On this basis I would take the 50/50.

Drop it to a 50/50 shot at 98 humans vs. a sure-bet 50, though, and I’m unsure again – since while I think there is a greater-than-zero ‘network value’, I don’t know how to weigh it against the value of a human life.
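A sketch of Jim’s premise, reading ‘network value’ as quadratic in group size (Metcalfe-style; the exponent is an assumption):

```python
def network_value(n: int) -> float:
    # Value grows superlinearly: here, quadratically in survivors.
    return n ** 2

sure_thing = network_value(50)               # 2500
gamble = 0.5 * network_value(100) + 0.5 * 0  # 5000.0
print(sure_thing, gamble)                    # the gamble doubles expected value

# Jim's second case: a 50/50 shot at 98 vs a certain 50.
print(0.5 * network_value(98))  # 4802.0: still ahead on this metric alone
```

On the pure network metric the gamble still wins at 98, so the hesitation there must come from weighing lives separately from network value.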

65

Ben 08.30.14 at 10:21 pm

I’m Batman, so I’d save all the babies and the miners all in one spectacular series of confusing edits.

66

Evil Jim 08.30.14 at 10:34 pm

In the case of the miners, losing all of them represents a heavy hit to society. 100 families will lose their paychecks. Losing only half of them means that (if unionized) the survivors can adopt the families of the dead men, leading to an effective loss of only half the family paycheck, a better result than a total loss. Choose option 1: Rescue 50, write off the other 50.
In the case of the babies, they have contributed nothing to society or to the economy. Being non-productive workers, they represent a drain and an expense on both. They are therefore less important. Flip a coin. Saving any of them represents a risk: we cannot know if one of them will be the next Einstein or the next Hitler. Choose option 2: try to rescue all. If you succeed, you’ll be a hero, if they all die, they can be quickly replaced.
:)

67

Val 08.30.14 at 10:37 pm

I tend to agree with suggestions that there is something morally and intellectually wrong with posing such problems. They give the impression (on reflection, perhaps like much ‘scientific’ epistemology) that situations are static, that there are a limited range of choices, and that we know what they are. Better to talk about the real, complex problems we face, and what we can do about them, perhaps?

68

J Thomas 08.30.14 at 10:46 pm

I strongly agree with Urple and disagree in detail with Karsten.

Say the species is extinct except for me and 100 babies. (Ignore any problems I will have raising 100 babies by myself.) This is a narrow genetic bottleneck and there’s a strong chance we’ll die out from inbreeding depression etc. But 100 is better than 50, so if it looks like the same outcome on average, I’d try for 100.

On the other hand, say you’re selling babies to people who want to adopt into good families but can’t get them legally. And say that 50 gives you a decent profit while 100 gives you a fantastic profit. Then it’s better to settle for 50 and prepare for your next batch, because if you lose everything then you are sunk into poverty and can’t rescue any more babies, but if you are a going concern you have the chance to hit the home run next time.

If logistics is the important thing then you should call in that you will rescue 50, and then other people can have supplies ready for 50. If it’s either 0 or 100 then they don’t know what to prepare for and they are likely to allocate resources in some suboptimal way.

The basic theme is that outside factors may change the payoff, so that the payoff for rescuing 100 is not twice the payoff of rescuing 50.

But as Urple and others pointed out, in reality we hardly ever know the odds with any reliability. You can figure you can probably get half of them out, maybe more or less. Or you can try a strategy that has a one half chance to rescue them all, as best you can tell. But you know that you don’t know very exactly at all.

It’s a fire. Maybe you can rescue around half of the babies. Or maybe you can put out the fire before it gets bigger. If it’s worth trying to put out the fire then that’s worth the efforts of everybody who can contribute much to it. Because if it works, it works. If it won’t work then you need to salvage what you can.

Put that way, my instinct is to assign everybody who looks useful for putting out the fire, to putting out the fire. If it looks like it will work. And put everybody else to work rescuing babies, while that can be done safely. Even apart from the babies we’re better off if we can put out the fire.

Externalities are important. They can change the odds of success. They can change the value of different kinds of success.

But that’s in the real world. In Lady-or-the-Tiger questions, where the intention is to balance things exactly so there’s no known reason to choose one side over the other and then see what people choose, if we agree that the value of 100 babies is exactly twice the value of 50 babies and the probability is exactly 50%, then A H has a point. Other things equal, you and everybody else can plan better for a known outcome than for an unknown outcome. But if the average of 100 and 0 is not 50 for you, then you could get any result.

Cian has a point, if you need to think about your own rewards. If you get rewarded for success with no penalty for failure, then you should go for the sure thing. Unless that does not count as success.

If you get punished for failures with no reward for successes, then you might as well try to rescue them all and hope for the best.

If you get punished for violating rules, traditions, laws, etc then you should do whatever looks closest to the existing rules.

If you get rewarded for looking good to the media, then you should be good at telling the media what you’re doing, why it’s vitally important, why each next step is critical and could lead to ultimate success or failure, whether each step has succeeded and how everybody needs to feel about it, and finally announce victory or defeat. It isn’t whether you win or lose, it’s how you play the game. And trying to rescue all of them is the better story.

69

KDH 08.31.14 at 12:52 am

Captain Kirk would try to save them all. Spock would complain that it was not rational. CP Norris is correct that there is a heroism bias in fiction that probably affects how some of us react to this dilemma. Kevin (and Spock) are probably correct that this is pretty stupid behavior.

70

Roger 08.31.14 at 1:05 am

Societies face variations of this problem on a daily basis. The solutions are guided by the local legal or institutional framework, religious influence, cultural / societal norms, financial considerations, potential political ramifications, and emotion. (Within emotion I’m including everything from physical attraction to empathy to racial prejudice).

For both scenarios presented, assuming all points are exactly as presented, I’m not aware of any society that would choose to save 50% and doom 50%.

Who would make the selection? How would the selection be made? Coin toss? Would the “selector” be isolated to avoid receiving bribes to save a certain group? Who selects the selector? Would the selector attend the funerals of those he had assigned to die? Would families of the dead be permitted to sue the selector? Would families of the living pay his legal fees?

Some additional angles:

The 100 miners are in 2 groups. The 1st group are humanitarian award-winners who are on a fact-finding mission to improve mine safety. The 2nd group are criminals who had selected mine work instead of kitchen duty. An informant reported that they were planning to dig a tunnel to escape.

Solution: option 2. try to save both groups

Among the babies in the 1st group is one who is believed to be the newest incarnation of the Dalai Lama. Millions of Buddhists are following the story. Among the babies in the 2nd group is a newly born English prince, in line to be King.

Solution: option 2. try to save both groups

On an everyday level though, in real life, societies frequently choose to save some and doom others. Health Departments and funded medical researchers choose which diseases to focus on. Regulators choose which safety requirements to enforce and which operations (including mining) to target. Foreign aid organizations choose which countries will receive emergency food support.

You are a new Ambassador to an impoverished country. You have a remaining budget for the year of $5,000. You’re presented with 2 requests for approval:

1. End-of-year party for the diplomats, embassy staff, and their families. They have had a difficult year; morale is low; they are all looking forward to the party. They are planning a series of arbitration exercises to prevent conflict between remote villages. These exercises have been effective in the past. The party will include honoring local staff who were injured during exercises earlier in the year. The budget was $8,000, but the planning team has trimmed the food and presents, bringing it down to $3,500.

2. The northern provinces have had another bad harvest, and food supplies are low. A village leader has sent a message to the Embassy that food and supplies to support his village would require $5,000 to get them through the winter. They estimate 100 lives can be saved. Your advisors say that you will receive frequent such requests, that other villages will quickly ask to support them as well, and that there may be a violent backlash if the neighboring village sees you supported only their neighbor, with whom they have a land dispute.

Additional funds have not been approved.

71

Andrae 08.31.14 at 5:25 am

What is being missed here is that this problem is a repeated game. The fault for the 50% expected casualties lies with the managers, officers, directors, and shareholders of the mining company who refused the resources necessary to avoid the dilemma in the first place.

My solution: Select 200 of the aforementioned culprits. Explain that any resources they might make available to improve the odds of success will be gratefully accepted. Also explain that for each miner casualty, two of them will be shot.

Then choose option B.

72

Alan 08.31.14 at 6:03 am

I have a question.

When in history has anyone faced a dilemma much like this? What did they do and why?

73

Peter T 08.31.14 at 6:34 am

Something of this sort is fairly common in war. As noted above, if you have to close the watertight doors with people in the compartment or crash-dive the sub with crew on deck, you do so (and all the navy people I have known accept this). The first priority is the survival of the ship. In land warfare things are a bit more problematic, because there are many more unknowns. The general course is to rescue as many as possible, even if that means putting an equal or even greater number in danger (and an army that abandons its troops will not long survive). But there are exceptions, again mostly to do with the survival of the greater part/larger cause. Historians of World War II note that Manstein’s relief effort towards Stalingrad was abandoned when the Red Army threatened to envelop the whole southern wing – he would risk an Army to save an Army, but not an Army Group.

74

Person 08.31.14 at 8:14 am

Yeah, it’s not just subs crash-diving either. What if there’s an issue with emergency bulkheads and a hull breach? Do you wait for a proper evac or immediately seal the doors to decrease risk of further damage or complete loss of the ship? I don’t know any specific real world historical scenarios, but it has to have happened or at least be possible and not unreasonable to assume it could happen without being contrived.

75

Stunfisk 08.31.14 at 8:31 am

@Ingrid: The plural of dilemma in English is dilemmas, not dilemma’s.

76

Peter T 08.31.14 at 10:35 am

On further reflection, in military situations calculation along the lines of the post seems to happen at higher levels of command. At lower levels, it’s “all for one and one for all”. That is, a lieutenant will risk a platoon to save one or two, but a general will sacrifice a battalion in the hope of saving an army. Or the battalion will sacrifice itself, as the Theban Sacred Band did so movingly at Chaeronea.

77

Shatterface 08.31.14 at 10:46 am

Among the babies in the 1st group is one who is believed to be the newest incarnation of the Dalai Lama. Millions of Buddhists are following the story. Among the babies in the 2nd group is a newly born English prince, in line to be King.

Let the first group die. The Dalai Lama will instantly be reincarnated anyway, so in real terms that’s one less death.

78

Shatterface 08.31.14 at 10:50 am

Captain Kirk would try to save them all. Spock would complain that it was not rational. CP Norris is correct that there is a heroism bias in fiction that probably affects how some of us react to this dilemma. Kevin (and Spock) are probably correct that this is pretty stupid behavior.

Spock will act on impulse then rationalise his decision afterwards. Kirk and McCoy will laugh about it.

http://en.m.wikipedia.org/wiki/The_Galileo_Seven

79

Shatterface 08.31.14 at 10:58 am

You have two groups of miners in different tunnels. The leaders of each group are married. There’s a trolley full of dynamite rolling down the tunnels. The leaders can divert the cart down the opposite tunnel but the trolley will pass the last intersection at midnight. The only other options are to block the tunnel with a fat man or torture him for the deactivation codes.

80

engels 08.31.14 at 11:02 am

Push the fat man off the bridge. It might not save them, but it will buy some time.

81

Shatterface 08.31.14 at 11:08 am

Consider a burning building with a number of people trapped inside. Brave fire fighters go in to rescue them. They come upon one person. What do they do? They immediately detail some labor to carry that person out, even though so doing means reducing the effort to find more people. They’re placing a higher value on the life they know they can save than on those they can only speculate about, and this decision may mean that those not found will die.

Not really, because if they can’t be arsed rescuing the first guy they have no reason to rescue the next guy either. If they’re only concerned with hypothetical people round the next corner they’ll never rescue anyone.

Firefighters go into burning buildings whether they know anyone is there or not. Risking their lives for hypothetical people is in the nature of the job. If a firefighter dies rescuing a kid he’s regarded as a hero, not condemned for wasting his life and the resources that went into his training.

Also, consider the Marines’ motto never to leave a man behind.

And how about a Saving Private Ryan scenario where several lives are risked saving one – because the sacrifice his family has already made is considered too high.

82

Shatterface 08.31.14 at 11:11 am

Push the fat man off the bridge. It might not save them, but it will buy some time.

The fat man consumes more air than the others. Plus you could feed more people with his corpse if you have to resort to cannibalism.

83

Shatterface 08.31.14 at 11:19 am

Actually, the fat man/torture scenario could be stated more realistically. In the classic example you are faced with throwing an innocent man on the track; but what if he has the codes to divert the trolley? You might be justified in throwing him on the track and saving lives for certain but if you threaten to throw him on the track – in effect, psychologically torturing him – is that less moral than killing him for certain because torture is never, ever justified?

84

Ingrid Robeyns 08.31.14 at 12:57 pm

@stunfisk: Thanks, I’ll fix that! As you may know, I’m not a native speaker, and while I try to write without spelling and grammar mistakes, I often fail.

85

Layman 08.31.14 at 12:58 pm

Me: “Consider a burning building with a number of people trapped inside. Brave fire fighters go in to rescue them. They come upon one person. What do they do? They immediately detail some labor to carry that person out, even though so doing means reducing the effort to find more people. They’re placing a higher value on the life they know they can save than on those they can only speculate about, and this decision may mean that those not found will die.”

Shatterface @ 80: “Not really, because if they can’t be arsed rescuing the first guy they have no reason to rescue the next guy either. If they’re only concerned with hypothetical people round the next corner they’ll never rescue anyone.”

I think you said what I said, except for the ‘not really’ bit.

86

Ingrid Robeyns 08.31.14 at 1:06 pm

Chris @59: I believe there was a recent real case like this (the lifeboat); I read that the survivors took turns in the water and in the boat. I am also quite convinced that, as some suggested, what we *think* we would do (or should do) is not necessarily what we would do. In the lifeboat case an additional feature is that we *see* the victims, and I believe there’s evidence from moral psychology that this makes a difference.

87

dsquared 08.31.14 at 2:28 pm

The guy who says “we can certainly save 50 miners” might be telling the truth, but the guy who says “there’s a 50% chance we can save everybody” is very likely to be bullshitting. So option 1 in both cases.

Asking questions about exactly where these incredibly precise probability estimates come from (in contexts, as I’ve whined on numerous occasions, where there’s no strong reason to believe that the expectation exists at all) tends to get you docked marks in philosophy classes for “avoiding the question”, but IMO the fault is with utilitarian moral philosophers for avoiding the question of getting a rigorous treatment of probability for decades.

88

J Thomas 08.31.14 at 3:10 pm

#87 dsquared

The guy who says “we can certainly save 50 miners” might be telling the truth, but the guy who says “there’s a 50% chance we can save everybody” is very likely to be bullshitting. So option 1 in both cases.

I think one of the most interesting things about this sort of question is the variety of external factors people choose to bring to it, that were not included in the problem.

I like to imagine it’s some sort of orphanage or nursery. They should not have 100 babies together, an infection could spread through the whole population easily. But they did. They should not have them in a building that’s designed so that with the fire blocking the main entrance, all the evacuation stuff has to go through two narrow doors and two narrow corridors. Considering the fire hazard, they shouldn’t have built that building in the first place. But they did.

Now you have a choice. Try to put out the fire, or try to rescue babies while the fire burns. It’s your professional judgement that your people can probably get about half the babies out before the fire gets too bad to get any more. And you have a chance to put the fire out before it kills anybody, but once it gets too bad you won’t be able to stop it at all.

And my natural instinct, unless the details prevent it, is to do both at once as best we can. Put the guys who’re best at putting out fires to work putting out the fire, particularly working from the outside near the main entrance. Put the people who don’t have those skills to work going into the narrow entrances and passing out babies. A baby-bucket brigade, if possible. If one has to take priority then decide which has priority, otherwise do both as well as you can. Anything the anti-fire guys do to delay the fire gives time to save more babies.

But the story is designed to make it as evenly balanced as possible. Like the Lady or the Tiger.

And the second interesting thing I see from that, is that when it’s balanced as well as the originator knew how, still sometimes people take strong definite stands for one over the other.

Like, say you have a choice between two vanilla ice cream cones. The cones are almost identical except:

The first cone has a slight crack that extends almost all the way down. If you eat it slowly it might eventually start to leak, and if you push on it hard enough at just the wrong angle it might crack open. It has your initials engraved in the ice cream on top where it’s easy to see them.

The second cone has somebody else’s initials engraved in the ice cream on top where it’s easy to see them.

Which would you choose?

Not unlikely they could get people arguing about it.

“Take the one with the initials! The other one will leak and crack open.”

“No, man, the second one was *meant* for somebody else. You don’t want the one with the wrong initials.”

“That’s stupid. It makes no difference and it will be gone after the first lick. The first cone has this big physical flaw that you can avoid.”

And not unlikely after a few rounds they’ll be arguing like Republicans and Democrats even though in this particular case there’s no significant difference between the choices.

When instead they could be arguing about the Democratic Party and the GOP where the vital differences will determine the future of the whole world.

89

Shatterface 08.31.14 at 4:24 pm

And my natural instinct, unless the details prevent it, is to do both at once as best we can. Put the guys who’re best at putting out fires to work putting out the fire, particularly working from the outside near the main entrance. Put the people who don’t have those skills to work going into the narrow entrances and passing out babies. A baby-bucket brigade, if possible. If one has to take priority then decide which has priority, otherwise do both as well as you can. Anything the anti-fire guys do to delay the fire gives time to save more babies.

From my limited experience of the matter, and by ‘limited experience’ I mean watching Chicago Fire, fire fighting squads are split into those who fight the fire and rescue teams responsible purely for dragging people to safety: two different jobs.

90

Quite Likely 08.31.14 at 4:25 pm

I have no strong opinion one way or the other on these. The correct moral reasoning is to maximize the projected number of lives saved. If it was a sure thing to save 51 then that would be the right thing to do. If it was only a sure thing to save 49, then you should take the risk. In the contrived exactly even scenario it just comes down to whether you’re more risk averse or risk tolerant.

I don’t think the scenario is any different than if it was a choice between getting $50 for sure or a 50% chance of getting $100. Of course in the scenario with the miners we should let them decide which to go for themselves. As for the babies, who knows who in this hypothetical scenario is responsible for making such a decision. I wouldn’t second guess either decision.

91

nat 08.31.14 at 5:28 pm

I find it hilarious that so many answers are avoiding the question with quibbles about the math or the phrasing. Y’all are like the physics students who refuse to answer questions about frictionless spherical chickens. It’s a question, you morons. Within the question is an answer. Avoiding it by saying the question is impossible is not an answer.

Within the logic of the world you have the odds.
NOW MAKE A CHOICE!

Failure to make a choice is a failure to save anyone.

My other note is that this whole thing brings up the 100% death vs. 1% success question.

It’s like what I refer to as the “Stupid Star Trek Dilemma”:
We face certain death!
The chances of this thing saving us are very low.
Do we do it?
Of course we do, you idiot Kirk: if it’s 100% death vs. “maybe kinda live”, you always go for the non-100%-death route, even if it’s only 97% death.

92

J Thomas 08.31.14 at 7:36 pm

#89 Shatterface

From my limited experience of the matter, and by ‘limited experience’ I mean watching Chicago Fire, fire fighting squads are split into those who fight the fire and rescue teams responsible purely for dragging people to safety: two different jobs.

That seems like the obvious way to me.

But somehow the way the question was stated, you are supposed to do one or the other.

I can imagine that, like everybody has to go through one narrow door and either team will interfere with the other team’s functioning so if you try to do both then neither gets done.

But it doesn’t come natural to me. If it looks like there’s a chance to do both, I want to do both.

93

Witt 09.01.14 at 2:11 am

When I see a hypothetical like this, it makes me wonder (as dsquared and others note above) why on earth people would trust the probability estimates in a (perhaps artificially urgent) situation like this. People are fallible, subject to all sorts of biases and preconceptions. There is no reason to think that imperfect human beings are somehow equipped to generate perfect predictions or calculations.

In 1793, Philadelphia was suffering a horrific epidemic of yellow fever. Noted physician Benjamin Rush believed that black people were immune to the disease.

As a result, black Philadelphians were asked to take on significant responsibility for caregiving. Rush was wrong, and black Philadelphians died.

Rush was a learned man, and I’m sure his contemporaries saw him as speaking with the voice of Science. But he was wrong.

I don’t like thought experiments in general, because I actually believe that the specifics matter — and that most of the them, when I see these kinds of situations framed, it’s by someone who is trying to trick me. (Often, a legislator trying to pass a new law.) So my heuristic is “most calculations of probability are seriously screwed up, unintentionally if not maliciously, and you ought to make decisions that rest as little as possible on the confidence that somebody else’s calculations are right.

94

Witt 09.01.14 at 2:12 am

s/b “most of the time”. Apologies.

And I should have closed quotation marks at the end of the comment.

95

Bruce Baugh 09.01.14 at 2:31 am

Nat@91: For many of us, this kind of exercise is like demanding we give orders to the captain of a ship in distress based on its proximity to John Cleves Symmes’ proposed holes into the inner Earth(s) and the ensuing consequences for currents. No good decision the captain might make takes the polar holes into consideration, because they don’t exist and have no effect on currents.

Likewise, surgeons never plan operating room use with allowance made for sudden temporary cessation of gravity. (Shipboard surgeons and others in potentially unstable situations do plan for them, and the adaptations involved are often really cool.) Anyone demanding that they go ahead and make and carry out a plan for it anyway is simply wasting everybody’s time. Or, alternatively, they are very successfully and productively demonstrating their power by making an important task that much more likely to fail for a reason that has nothing to do with the realities.

I guess what I really wonder is: is there any evidence that doing these kinds of thought experiments leads to people making better moral choices when it comes to reality? Are people who think about this stuff a lot less likely to vote for enthusiastic torturers, for instance, more likely to help neighbors and strangers in need, anything like that? Has anyone even tried measuring outcomes, and if so, what are they? People have been doing them for long enough that it seems reasonable to inquire.

96

bianca steele 09.01.14 at 3:31 am

I’m having a lot of difficulty imagining any situation where both A and B could be a possibility, outside special situations like mines. The helicopter on the roof waits for the elevator to reach both levels of the building and return? Maybe. The one ambulance in town doesn’t leave until the building is empty? Ambulance drivers probably have an ethos preventing that. Everyone ignores the babies while trying to fix a mechanical problem, or blow a hole in the side of the building, that will render the fire moot? As the chances grew that they would fail, would they really just stand there while babies died behind them? The only person with the keys to the building drives across town instead to cut the power, even though there’s a lot of traffic and he might not get there in time? Everyone goes out to construct a batsignal?

The only thing I could think of was a situation where there were enough people to definitely rescue 50 if each carried one, but if each carried two, there was a chance both would die (because the first one picked up got carried around for longer). And the opposite seemed more interesting: what if it was standard to try to rescue two, but it turned out that always leaving with one improved the survival rate?

97

J Thomas 09.01.14 at 11:13 am

#93 Witt

When I see a hypothetical like this, it makes me wonder (as dsquared and others note above) why on earth people would trust the probability estimates in a (perhaps artificially urgent) situation like this. People are fallible, subject to all sorts of biases and preconceptions. There is no reason to think that imperfect human beings are somehow equipped to generate perfect predictions or calculations.

Agreed. And yet, when we make choices we have to depend on our estimates of the probabilities, even though we know those estimates are not very accurate. Sometimes, by experience, we learn to bias our estimates, because we have learned that they are naturally biased the other way round. Computer programmers asked to estimate time-to-completion have a rule of thumb: make your very best estimate of the time, then multiply by four.

Akshay in #55 pointed out that the probabilities for big projects are probably worse than you think.

Still, after you apply whatever corrections you know to apply, you have to go with your best estimates. If you have to choose, and you think there’s a 30% chance you can save the whole thing but a 90% chance the other way you can get at least 70 of the 100 out, then it sure makes sense to get out what you can. If you think there’s an 80% chance you can save the whole thing, while the other way you could probably get only 10 babies out, then you try to save them all.
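
Worked out, taking those guessed probabilities at face value (a back-of-the-envelope sketch, not a claim about real rescues):

    # First guess: 30% chance of saving all 100 vs. 90% chance of at least 70.
    print(0.30 * 100, "vs", 0.90 * 70)  # 30.0 vs 63.0 -> get out what you can
    # Second guess: 80% chance of saving all 100 vs. about 10 for sure.
    print(0.80 * 100, "vs", 1.00 * 10)  # 80.0 vs 10.0 -> try to save them all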

When you can’t tell which is better, you still have to choose right away or the odds get worse for either one.

Unless you don’t in fact have to choose one over the other.

It’s an unrealistic hypothetical problem, but I think the whole point of it is to balance out all the objective criteria so that it’s exactly 50:50, and then see what special biases crop up. They specify that you know the odds exactly so that you won’t be tempted to argue, as Akshay did, that the odds are really not equal, and decide on that basis.

It isn’t supposed to be real; it’s supposed to be exactly balanced.

98

Joshua W. Burton 09.01.14 at 2:36 pm

Priest @6 finds the way home:

50 babies have been rescued. The commander of the Nazi occupation forces will allow you to return them safely to their homes if you abandon the other 50 to the flames.

Obviously, you round up 50 miners with pickaxes, and attack the Nazis. Since armchair posers of trolley problems are much easier to physically intimidate than Nazi occupiers [citation needed], this approach resolves the original problem as well.

99

TM 09.01.14 at 5:33 pm

This (I mean the OP) passes for moral thinking these days?

100

Jim Bracher 09.01.14 at 8:03 pm

Fascinating. The question is interesting because it motivates me to consider my moral values. Option 1 is repugnant because not trying to save everyone is bad. Option 2 is repugnant because risking half on the chance of saving all is bad. After reading the fascinating comments, I find I’m afflicted with the heroism bias, and would try to save them all because Heroism.

Reading the comments was very educational. So many didn’t want to address the question as stated, but wanted to quibble and avoid it. I agree the question is not practical. It’s clearly not intended to be. This is not about rescue math and decision making, it’s about morality.

In a real situation, I’d go with option 2 and work like hell to change the math.

Thank you all for the mental candy.

101

Matt Steinglass 09.01.14 at 9:41 pm

Well, I don’t know. But three weeks ago I was driving from Haarlem to my in-laws’ place in Ede, and I saw on my iPhone that there was a serious traffic jam on the A1 north of Amersfoort. I had two choices: either I could get off at an exit before Amersfoort and try to work my way to Ede on back roads, which would be slower than the highway but avoid the traffic jam; or I could stay on the highway and brave the traffic jam, which would probably only last a few minutes. I knew that if I didn’t get off the highway and ended up sitting in traffic for a few minutes, my wife would scold me for failing to avoid the traffic jam, whereas if I took the back roads, even if it in fact took much longer, my wife would not get angry at me, since I had at least tried. What would you do?

In the end, I got off the highway and took the back roads, but as it turned out there were further road closures that forced us onto a strange detour, down a narrow road that would have taken us the wrong direction, and I decided to make a U-turn and try a different back road. To make the U-turn I pulled into the driveway of a strange, large, modern retreat or institute that popped up out of nowhere in the middle of the fields. The institute, according to the sign on the gateway, was called the Onderzoekschool voor Wijsbegeerte (OZSW), or “Research School for Philosophy”. Exactly three weeks later, Ingrid Robeyns would attend a weekend Summerschool on Dirty Hands and Moral Dilemmas at this very institute, and would post a question on “Crooked Timber”: Imagine you are trying to rescue some miners…

(The detour, of course, ended up taking about 45 minutes extra, vastly longer than the traffic jam would have.)

102

Greg 09.02.14 at 12:35 pm

Maybe it’s just my poor little head, but these problems underline why I hate frequentist probability approaches to one-off situations.

Anyway, option B may average out to saving 50 lives if you do it a hundred times, but the whole point is that you only get to do it once. You’d have to get up to a subjective 80% or 90% probability of success before it’s worth even considering the risk of option B.

Certainty wins every time. Option B basically forces you to gamble with 50 lives. Imagine if you had 50 babies already safe, would you put them back in danger for a coin flip of rescuing 50 more? That would surely be immoral.
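
One way to make that 80% or 90% threshold precise is expected utility with a concave (risk-averse) utility function. The square-root utility below is an illustrative assumption of mine, not something Greg specifies:

    import math

    def u(lives):
        # Concave utility: extra lives always help, but certainty is prized.
        return math.sqrt(lives)

    # Probability p at which the all-or-nothing gamble on 100 lives matches
    # the certain rescue of 50: solve p * u(100) = u(50).
    break_even = u(50) / u(100)
    print(round(break_even, 3))  # ~0.707

A more sharply concave utility (a fourth root, say) pushes the break-even probability to about 0.84, in the neighborhood of Greg’s 80-90% figure.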

103

J Thomas 09.02.14 at 1:06 pm

Certainty wins every time. Option B basically forces you to gamble with 50 lives. Imagine if you had 50 babies already safe, would you put them back in danger for a coin flip of rescuing 50 more? That would surely be immoral.

Imagine you have worked out a hostage deal, and you get 100 babies back. But there is a catch. There are two roads they can use to return the babies; one of them is mined by terrorists who oppose both sides, and nobody knows which one. The bad guys who have agreed to return the babies give you a choice. They can put the babies on one bus, and you choose which route it takes. Or they can split them onto two smaller buses that take both routes.

Would you prefer a 50% chance to rescue all of them, or the certainty that half of them must die?

(If you had the choice, you could insist that they first send an empty bus and then send 100 babies by whichever route is known safe. But being bad guys who are not demanding much of anything in exchange for the babies, they aren’t willing to do you this favor.)

104

TM 09.02.14 at 8:55 pm

Your inventiveness is impressive. Still I don’t think this has much to do with moral reasoning.

105

J Thomas 09.02.14 at 9:59 pm

Still I don’t think this has much to do with moral reasoning.

First they set up an artificial situation with two choices, and the two choices are equally balanced as best they can arrange it. Then they ask which choice is better.

People could say that they are equally balanced and there’s nothing to choose between them. But usually instead people come up with reasons why one is better than the other. The difference could be tiny, because they were balanced so well that a tiny difference would show up. But often people say the difference is large.

What decides those choices? It might have something to do with their values, with what they think makes one choice better than another.

Also, the way it’s framed makes a big difference. Greg, for example, made the reasonable argument that it’s like rescuing 50 babies for sure and then putting them at risk again for a 50% chance to get 50 more. But I can frame it the opposite way: you can give them all a chance, or you can kill 50 of them so the other 50 will survive. By the numbers it’s the same outcome on average, but it feels different.

It depends partly on details. If you personally had to decapitate half the babies to make sure the other half survived, versus doing nothing so that they all get the same 50% chance, I suspect a lot of people would say to let the 50% chance play out.

106

James Marcus Bach 09.03.14 at 3:34 am

How much time do I have to make this decision? Can I put it off for an infinite amount of time? If so, I will postpone it until the minors grow up enough to replace the miners.

Are these human babies? Because I would not worry much about baby ants. Also the ants could certainly not replace the miners except on a very small scale.

The situation wherein I must make a rushed decision about human babies on this basis is intolerable. No human would accept it. Here’s what would happen in real life: we do one or the other based primarily on conventional wisdom (what does the manual say we should do?), or, if there is no such wisdom, we choose based on some transient irrational factor, such as being a tiny bit closer to the magnet of imagining all the babies saved, or the magnet of “I am rescuing this baby NOW!” Having chosen one course, we immediately experience the endowment bias and a continually increasing sunk-cost bias that reinforces our decision (we stick ourselves to the magnet). This serves as a sort of moral painkiller.

At the end of it we sue whoever it was who put us in this situation. Fire codes improve, future lives are saved, and we start to think the Triangle Shirtwaist fire was a good thing.

BTW, we are in this situation right now as a species. Do we sacrifice now to rescue our children from global warming, or do we bank on the possibility that no sacrifices will have to be made and some miracle will save us? Humanity as a collective is obviously choosing the latter, accompanied by a whole lot of probability denial.

On second thought, I am going to save the ants. They will do well in the far future and I will need some friends.

107

Zamfir 09.03.14 at 5:11 am

And yet, when we make choices we have to depend on our estimates of the probabilities, even though we know those estimates are not very accurate.

You’re right that people often estimate probabilities for one-off events that do not have a reasonable distribution to base that probability on. That should not be encouraged.

Probabilities are not a fact of the world. Uncertainty is. Probability is an attempt to put a mathematical grip on uncertainty. It’s a suitable approach for some kinds of uncertain situations, and less so, or not at all, for others.

It has become surprisingly common to assume that every uncertain outcome must have a true probability attached to it that we just happen not to know. There are economic-ethical theories that take the existence of such probabilities as an axiom. It’s sad.

108

bad Jim 09.03.14 at 7:49 am

So, on an adjoining thread, John Quiggin is asking for enough donors to pony up $1K so that he can enjoy a refreshing ice bucket bath after his upcoming triathlon. Last I checked we weren’t there yet.

Charity is a slow-motion performance of moral decision. I’m a man of means, and although I’m rather generous with friends and family, and respectably generous in the conventional way, I’m strongly inclined to hold tight to my millions. My acquaintances tend to upbraid me for my unwillingness to spend money on myself.

Professors love such Gedankenexperimente because their students are obligated to serve as subjects. They’re easy to carry out, but not especially informative, because not only are their subjects WEIRD, they’re also young. We can find out what they think they think, but they haven’t had the time or wherewithal to do anything.

We can, though, measure what people actually do, noting how they vote or donate, how far out of their way they’ll go to put their trash in a suitable receptacle (not very far), and so forth. We seem to be more horrified by the beheading of a reporter by a terrorist group in Iraq than by the beheading of a housekeeper in Saudi Arabia or botched executions by injection in the U.S.

Compared to the daily news, such undergraduate puzzles are unrealistically bloodless. It might be better to wonder whether nuclear weapons are the right way to handle Ebola or drones in Ukraine.

109

J Thomas 09.03.14 at 1:21 pm

#107 Zamfir

Probabilities are not a fact of the world. Uncertainty is. Probability is an attempt to put a mathematical grip on uncertainty. It’s a suitable approach for some kinds of uncertain situations, and less so, or not at all, for others.

I’m not sure how useful it is to separate our lack of knowledge about the world from possible indeterminacy of world events.

Physicists have apparently given up making the distinction. Quantum theory is about what we can measure. It definitely applies to the statistical results of many repeated trials. Does it apply to individual events? That doesn’t seem to matter to the results. It appears that in circumstances where something people consider a particle behaves like a wave, it does apply to that individual event. When particles behave like particles, probably not.

If at some time we find ways to look deeper at particles, we might find deterministic rules that apply to them and then QM turns into a measure of uncertainty that’s useful whenever we don’t measure the events but instead are uncertain. Until then it might as well be a probability.

So when we don’t know how well strategies will work, we still need to estimate how likely they are to work in order to choose among them.

Say we restate the problem. By your best estimation, there’s a 40% to 60% chance to rescue all the babies. Or you can try a different approach that will probably rescue 40% to 60% of them.

Does that change it much?
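
In expectation it does not, at least if those ranges are read as bounds on otherwise identical gambles (the interval reading is my assumption); a quick sketch:

    # Option A: save all 100 with some probability in [0.40, 0.60].
    # Option B: save some fraction in [0.40, 0.60] of the 100 outright.
    option_a_bounds = (0.40 * 100, 0.60 * 100)  # bounds on expected lives saved
    option_b_bounds = (0.40 * 100, 0.60 * 100)  # bounds on lives saved for sure
    print(option_a_bounds, option_b_bounds)     # (40.0, 60.0) both times
    # The intervals coincide, so expected value still cannot separate the
    # options; only an attitude toward risk can.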

It has become surprisingly common to assume that every uncertain outcome must have a true probability attached to it that we just happen not to know. There are economic-ethical theories that take the existence of such probabilities as an axiom. It’s sad.

Comments on this entry are closed.