Monday, 31 July 2017

Two Envelopes and the Hypergame

The two envelopes problem goes a little like this:

Say I offer you one of two identical envelopes.  I tell you nothing other than that they both contain cash, that one contains twice as much cash as the other, and that you can keep the contents of whichever envelope you eventually choose.  Once you make a choice, I offer you the opportunity to switch.  Should you switch?

Note that it makes no difference whether I know what is in each envelope (so this is not a Monty Hall type situation, in which my knowledge affects your decision).  Note further that each time you make a choice (including the choice to switch), I could conceivably offer you another opportunity to switch.  So if switching were the right decision, the same logic that prompted the first switch would still apply each time I made the offer, and you should therefore switch back to the original envelope, and switch again, and again, and again.

To me, this indicates that the only valid answers to the question "should you switch?" go along the lines of "it doesn't really matter whether I switch or not because, statistically speaking, there is zero value in a switch; I don't gain or lose anything by switching".

---

So, why might you think that there is value in switching?  A presentation of the problem on Quora put it this way:

Well, think about the expected payoffs here. You had a 50% chance of choosing the better envelope from the start, right? So that means half the time, the second envelope will have twice as much money as the first one, and half the time it'll have half as much. So say the first envelope has $10 in it; then if you switch, you have a 50% chance of losing $5 and a 50% chance of gaining $10. So it's objectively better to switch!

This is one of those rare situations in which using actual numbers can make understanding slightly more difficult, so let's think about this in terms of X.

You choose an envelope and assign that envelope a value of X dollars.  In the other envelope is either 2X dollars or X/2 dollars, each with an equal likelihood.  Therefore, a switch has the following value:

0.5 * (2X-X) + 0.5 * (X/2-X) = 0.5X - 0.25X = 0.25X
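With the $10 from the Quora example, that works out at 0.5 * $10 - 0.5 * $5 = $2.50, which is indeed 0.25X for X = $10.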

So, when thought about this way, there's apparently a positive value associated with the switch rather than a zero value.

This is clearly the wrong way to think about it.  First and foremost, it skews the total value of the envelopes.  On one hand, you are presuming that there is $30 in total value and that you have the wrong envelope, while on the other, you are presuming that there is only $15 but that you have the right envelope.  Naturally, it's going to look like swapping is better, since you currently only have $10 if you are right and stand to gain $10 if you swap, while only risking the loss of $5 if you are wrong.

A better way is to think about this in terms of X and 2X only.  One envelope has X dollars and the other has 2X dollars.  Once you have selected an envelope, there is a 50% chance that you have X dollars and a 50% chance that you have 2X dollars; therefore, the value of your envelope is:

0.5 * X + 0.5 * 2X = 1.5X

Given that the total value of both envelopes is 3X, the other envelope must be worth 1.5X dollars as well, and there is therefore zero value in switching.

The value of the switch can also be calculated this way - there is a 50% chance that you will give up X dollars in exchange for 2X dollars and a 50% chance that you will give up 2X dollars in exchange for X dollars:

0.5 * (2X-X) + 0.5 * (X-2X) = 0.5X - 0.5X = 0

So the "paradox" resolves down to a simple misrepresentation of the problem (related to the old error of counting chickens before they hatch).

---

Naturally, there is a slight twist.  Say I give you an envelope (with X dollars in it), then I toss a coin and fill another envelope with an amount of money based on the result.  All you know is that there is a 50% chance that the second envelope has 2X dollars in it and a 50% chance that it has X/2 dollars.  On this basis you should in fact swap, because the second envelope has a value of 1.25X dollars (and therefore the value of switching is 0.25X dollars, as calculated above).
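A simulation bears this out too.  A minimal sketch, again assuming X = 10 for illustration:

import random

def second_envelope_value(x=10, trials=100_000):
    # A coin toss puts either 2X or X/2 dollars in the second envelope.
    total = 0
    for _ in range(trials):
        total += 2 * x if random.random() < 0.5 else x / 2
    return total / trials

print(second_envelope_value())  # ~12.5, i.e. 1.25X, so the switch is worth 0.25X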

In this instance, however, it initially seems as if, were I to ask you whether you wanted to swap again, you should say no: the first envelope has a value of only X dollars while the one you switched to has a value of 1.25X, and therefore the switch back would have a value of -0.25X dollars.

However, the second switch actually has this value:

0.5 * (2X-1.25X) + 0.5 * (X/2-1.25X) = 0.375X - 0.375X = 0

In other words, there is no value or cost associated with a second swap, or a third swap, and so on.  This is further indication that using the X/2 and 2X values is problematic.

---

Since I am already responding to something written by Leon Zhou, I want to take the opportunity to address his hypergame paradox.

It goes a bit like this:

You and I play a game.  The rules are that I choose a finite two-player, turn-based game.  We play that game.  You get the first move and whoever wins the game wins the hypergame.

A finite game is a game that ends after a finite number of moves (it doesn't matter how many though).

Can I choose, as my finite game, this very game, the hypergame?

It seems that I can since, under the rules, the game chosen must be finite, and the hypergame then lasts the same number of moves as that game, plus one (the move in which the game is chosen), making it finite as well.  But if I can choose the hypergame as my game, then you can choose the hypergame too, and we can go backwards and forwards forever, choosing to play the hypergame … in which case the hypergame is not finite after all.

So the hypergame is both finite and not finite and we have a paradox.
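One way to make the regress concrete is to model a game by the number of moves a play of it takes, with the hypergame taking one move (the choice of a game) plus however many moves the chosen game takes.  This is only an illustrative Python sketch, and all the names in it are mine:

def hypergame_length(choose_game):
    # One move to choose a game, plus the length of the chosen game.
    return 1 + choose_game()

tic_tac_toe = lambda: 9                # a finite game: at most nine moves
print(hypergame_length(tic_tac_toe))   # 10, so still finite

self_choice = lambda: hypergame_length(self_choice)
# hypergame_length(self_choice)        # RecursionError: the play never bottoms out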

I agree that this is a paradox, but I disagree with Zhou's claim that this paradox is not an example of "self-referential trickery".  It's quite clearly an example of self-reference, in which the game itself is called from within the game.  He also suggests that it's not related to Russell's paradox (although he qualifies this with the term "direct reference"), but it is.  Within the hypergame is a call to the set of all finite games, Y.  If you put the hypergame in Y, then a path to an infinite loop opens and, by virtue of being placed in Y, the hypergame becomes ineligible as a member of Y.  Take the hypergame out of the set of games that can be called by the hypergame and it becomes a finite game again, and thus qualifies for membership of Y.

This is similar to (but not exactly the same as) Russell's set R which is the set of all sets that are not members of themselves.  As a set which is not a member of itself, R becomes a candidate for being a member of R, but is thus disqualified.  And by not being a member of R, R becomes a candidate for membership of R.

The hypergame is both a member of Y and not a member of Y in the same sort of way that R is a member of itself and also not a member of itself.
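A crude way to see the shared structure is to reduce the question "is R a member of itself?" to a single boolean and check whether any truth value satisfies Russell's condition.  This toy check is my own framing, not Zhou's:

# Let r stand for "R is a member of R"; Russell's condition demands r == (not r).
for r in (True, False):
    print(f"r = {r}: consistent? {r == (not r)}")  # prints False for both values

Neither truth value satisfies the condition, which is the contradiction in miniature.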

We can avoid the hypergame paradox, perhaps naïvely, with a minor clarification.  We simply stipulate that the game chosen within the hypergame cannot be infinite.  Not "is not" but rather "cannot be".

This sort of clarification leaves Russell's paradox untouched.  Say we were to define R as the set of all sets that cannot be members of themselves: if R can be a member of itself, then it cannot be a member of itself, which in turn qualifies it for membership of itself, which immediately disqualifies it again … and so on.

Somewhat unsurprisingly, Russell's paradox seems to be more fundamental than the hypergame paradox.


