A better explanation of the Monty Hall problem
I know I said no newsletter this week, but this just hit me and it's small enough to write up, so I'm fitting it in. The Monty Hall problem is a famous probability puzzle. From Wikipedia:
Suppose you're on a game show, and you're given the choice of three doors: Behind one door is a car; behind the others, goats. You pick a door, say No. 1, and the host, who knows what's behind the doors, opens another door, say No. 2, which has a goat. He then says to you, "Do you want to pick door No. 3?" Is it to your advantage to switch your choice?
Most people say it doesn't matter: regardless of whether you switch or not, you have a 50% chance of getting the car. But switching actually wins 2/3rds of the time!
I've never been satisfied with the many, many explanations I've read. I get the math, but every explanation feels like a sleight-of-mind, where if you poke just a little bit it'll all fall apart. The other day I came up with a variation:
Behind one door is a car, behind one is $1, and behind one is $2. You choose two doors, say #2 and #3, and the host truthfully says "#3 is more valuable than #2". Then you can pick any one of the three. Which do you pick?
So your choices are effectively "#1" and "the best of #2 and #3", and it's pretty natural that "the best of two doors" will win twice as often as "one door". If you want the exact odds, here are all the arrangements consistent with what the host told you:
#1 | #2 | #3 |
---|---|---|
$1 | $2 | car |
$2 | $1 | car |
car | $1 | $2 |
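If you'd rather check that by brute force, here's a minimal Python sketch (my own framing, not anything from the puzzle itself) that enumerates the arrangements where the host can truthfully say "#3 is more valuable than #2" and counts how often each choice wins the car:

```python
# Enumerate the arrangements consistent with the host saying
# "#3 is more valuable than #2", then compare "keep #1" with
# "take the better of #2 and #3". Purely illustrative.
from itertools import permutations

prizes = ["$1", "$2", "car"]                # ordered least to most valuable
value = {p: rank for rank, p in enumerate(prizes)}

keep_1_wins = take_best_wins = 0
for d1, d2, d3 in permutations(prizes):
    if value[d3] < value[d2]:
        continue                            # host would have named #2 instead
    keep_1_wins += d1 == "car"
    take_best_wins += d3 == "car"           # here "the best of #2 and #3" is #3

print(keep_1_wins, take_best_wins)          # prints 1 2: two doors beat one, 2/3rds of the time
```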
Now let's show that this is identical to the canonical Monty Hall.
- First, instead of letting you pick two doors, the host makes you pick one door, and then tells you which of the other two is more valuable. But the door you choose forces the other two, so you can learn "the best of #2 and #3" by picking #1.
- Now, instead of letting you pick any of the three doors, you can only stay or switch to the more valuable of the other two. But in the "pick any of the three doors" model, there's no reason to ever pick #2, since you know it's worth less than #3. So you only had two choices anyway.
- Instead of telling you "#3 is more valuable", the host says "#2 is less valuable", and proceeds to show you by opening door #2.
But wait, now we have changed things! Because if the host reveals $2 in #2, you know the car is in #3! You have a 100% chance of winning if you switch! So this isn't like Monty Hall at all...
... except that Monty Hall has the same problem too! It doesn't specify which door the host opens if both have goats. Imagine the host has a strict strategy of "always choose the spotted goat". Then if you see the plain goat, you know the host was forced: the last door can't have the spotted goat (he'd have shown it), so it must be the car. Conversely, if you see the spotted goat, switching metaparadoxically doesn't matter: you have even odds either way.
(Incidentally, if you run through the math, you'll discover that "always switch" still wins 2/3rds of the time overall. Spooky!)
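Here's a rough Monte Carlo sketch of that host, assuming my "always show the spotted goat if you can" strategy and that you start on door #1; it should land near 100% when you see the plain goat, 50% when you see the spotted one, and 2/3 overall:

```python
# Monte Carlo sketch of a host who always reveals the spotted goat when he
# has a choice. You always start on door #1 (index 0) and then switch.
import random

def play():
    doors = ["car", "spotted goat", "plain goat"]
    random.shuffle(doors)
    goats = [i for i in (1, 2) if doors[i] != "car"]
    if len(goats) == 2:
        opened = doors.index("spotted goat")   # both hide goats: show the spotted one
    else:
        opened = goats[0]                      # only one goat available: host is forced
    switched_to = 3 - opened                   # the other unopened door
    return doors[opened], doors[switched_to] == "car"

trials = 100_000
outcomes = {"spotted goat": [], "plain goat": []}
for _ in range(trials):
    shown, switch_won = play()
    outcomes[shown].append(switch_won)

for shown, wins in outcomes.items():
    print(f"host showed the {shown}: switching wins {sum(wins) / len(wins):.2f}")
print(f"overall: switching wins {sum(map(sum, outcomes.values())) / trials:.2f}")
```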
When people present MH, they usually assume that the host's strategy is "pick a goat door at random". In that specific case, switching wins 2/3rds of the time no matter which door the host opens. I'm weaker on how to show this is equivalent to "you know #3 is more valuable but not what specifically is in #2"; my only good argument is that you wouldn't expect it to be different just because the two goats are identical.
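For comparison, here's the same sketch with a host who opens a random goat door when he has the choice; under that assumption switching wins about 2/3rds no matter which goat you saw, which at least matches the identical-goats intuition:

```python
# Same sketch, but the host opens a random goat door when he has the choice.
import random

def play_random_host():
    doors = ["car", "spotted goat", "plain goat"]
    random.shuffle(doors)
    goats = [i for i in (1, 2) if doors[i] != "car"]
    opened = random.choice(goats)              # no preference between the two goats
    return doors[opened], doors[3 - opened] == "car"

trials = 100_000
outcomes = {"spotted goat": [], "plain goat": []}
for _ in range(trials):
    shown, switch_won = play_random_host()
    outcomes[shown].append(switch_won)

for shown, wins in outcomes.items():
    print(f"host showed the {shown}: switching wins {sum(wins) / len(wins):.2f}")  # ~0.67 either way
```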
I was gonna extend this to a few more Monty Hall variations, but this is already 600 words and if I keep going this will become a whole multihour newsletter thing etc etc
(No, I don't think it's actually better, just more intuitive for me, but I'm not making the title "An explanation of the Monty Hall problem that's more intuitive, in my personal opinion", that's a terrible email title)
If you're reading this on the web, you can subscribe here. Updates are once a week. My main website is here.
My new book, Logic for Programmers, is now in early access! Get it here.