If a machine can predict what choice you will make, do you still have a choice?
Suppose that a machine exists that can predict what choice someone will make in a given situation with nearly 100 percent accuracy. That machine has predicted what choice you will make in the following situation:
A room contains two boxes: Box A, which is transparent and contains $1,000, and Box B, which is opaque and contains either $1,000,000 or nothing. You have the option of choosing both boxes or choosing only Box B, and you get to keep the money in the box or boxes you choose.
The contents of Box B are determined in the following manner. If the machine predicted that you would choose Box B only, then Box B contains $1,000,000. If the machine predicted that you would choose both boxes, then Box B contains nothing. The contents of Box B have already been arranged by the time you enter the room.
What choice should you make?
This problem was invented by William Newcomb and popularized by Robert Nozick, Martin Gardner, and William Poundstone, who discusses it in his book Labyrinths of Reason. Nozick observed that most people find the right answer obvious and the other choice silly; unfortunately, they split about evenly between those who think you should choose both boxes and those who think you should choose Box B only. Philosophers continue to disagree, and much of the disagreement seems to come from the mysterious nature of the predicting machine.
A straightforward argument for choosing only Box B is that if you choose it, then it is likely that the “nearly 100 percent accurate” machine will have predicted that choice, so Box B will contain $1,000,000.
Something about that “will have predicted” reasoning bothers the other camp, who argue that, regardless of the machine’s abilities, the contents of Box B are already fixed by the time you enter the room, so whatever Box B holds, choosing both boxes increases your haul by $1,000.
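The pull of each argument is easier to see with a little arithmetic. The sketch below works out the expected payoff of each choice and then checks the two-boxer’s dominance point; the 99 percent accuracy figure is an assumption standing in for “nearly 100 percent.”

```python
# A rough expected-value comparison for Newcomb's problem.
# Assumption: the machine is right 99% of the time,
# a stand-in for "nearly 100 percent accuracy".
ACCURACY = 0.99
SMALL, BIG = 1_000, 1_000_000

# One-boxer's reasoning: if you take only Box B, the machine most likely
# predicted that, so Box B most likely contains the $1,000,000.
ev_box_b_only = ACCURACY * BIG + (1 - ACCURACY) * 0

# If you take both boxes, the machine most likely predicted that too,
# so Box B is most likely empty and you usually get only the $1,000.
ev_both_boxes = ACCURACY * SMALL + (1 - ACCURACY) * (SMALL + BIG)

print(f"Expected value, Box B only: ${ev_box_b_only:,.0f}")  # $990,000
print(f"Expected value, both boxes: ${ev_both_boxes:,.0f}")  # $11,000

# Two-boxer's reasoning (dominance): whatever is already in Box B,
# taking both boxes yields exactly $1,000 more than taking Box B alone.
for contents_of_b in (0, BIG):
    assert (contents_of_b + SMALL) - contents_of_b == SMALL
```

Read this way, the expected-value calculation says take Box B only, while the dominance argument says take both; the paradox is that each line of reasoning looks unassailable on its own.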
The paradox raises questions about free will. If there really were an entity that could predict your behavior so accurately, then would your choice of boxes really be a choice?
The Poison Puzzle
Here is a variation that raises similar questions: There is a machine that can measure your intentions accurately. A wealthy individual has promised to give you an enormous sum of money if you form the intention to drink a poison tomorrow. The poison will have no permanent effects but will make you extremely ill. The money will be paid today, as soon as you form the intention to drink the poison tomorrow. It must be your actual intention. Having formed the intention and received the money, do you have to actually drink the poison tomorrow? And does thinking about the possibility of not drinking the poison tomorrow affect your intention today?
Poison puzzle loophole: if you only agree that you will drink it “tomorrow” and don’t agree on a specific date (such as being paid on 5/6 and drinking the poison on 5/7), then technically tomorrow never comes; since today is never tomorrow, you never have to drink the poison.
Ah, the old “free beer tomorrow” trick!