Dan Meyer’s Money Duck vs. Pepsi’s Play for a Billion

Written in a little bit of a hurry, but hopefully the ideas make sense . . . .

In 2003 and 2004 Pepsi ran a summer promotion called “Play for a Billion.”  There’s not much on YouTube about the contest, but one of the commercials did show up in a search:

It was a fun promotion in which many Pepsi drinkers won trips to Orlando (2003) and Los Angeles (2004), where they had a chance to win $1 billion on a game show broadcast on the WB network (2003) and on ABC (2004).  As far as I know it was the last billion-dollar promotion before this year’s bracket contest with Quicken Loans.  Tim Chartier interviewed me about those contests here:

http://blogs.siam.org/perfect-billion-dollar-madness/

In the Play for a Billion game show, the contestants picked 1,000 6-digit numbers from 000000 to 999999, and if any of them picked the winning number selected on the show they could have won the billion dollars.  But there was one pretty interesting twist on the show that reminds me a little bit of the direction people have gone with Dan’s Money Duck problem.  The conversation about that problem is happening here:

http://blog.mrmeyer.com/2014/confab-money-duck/#featuredcomments

Once the winning number in the Pepsi contest was selected, the 10 contestants with the “closest” numbers were brought on stage.  “Closest” was defined as having the most digits correct (plus tiebreakers that I don’t remember).  These 10 contestants then played the following game:

There will be 10 rounds of the game.  During any round you can ring a bell and receive the prize money offered in that round.  The catch, though, is that by taking that prize money you give up your number and have to leave the game.  If no one rings in, the person whose number is “farthest” from the winning number is revealed, and that person is eliminated from the game with no prize.  After 9 rounds there will be 1 person remaining.  That person gets $1,000,000 automatically AND, if any of the 10 numbers from the people who were on stage matched the winning number, that person also gets $1,000,000,000.

The prizes for the rounds were something like this (I don’t remember the exact details):

Round 1: $20,000

Round 2: $30,000

Round 3: $40,000

Round 4: $50,000

Round 5: $60,000

Round 6: $70,000

Round 7: $80,000

Round 8: $90,000

Round 9: $100,000

So, the relationship to the Money Duck problem is through the concept of expected value.  In the first round of the game described above, each of the 10 people has a chance at winning a guaranteed $1,000,000 prize.  Since the contestants don’t know the winning number, they have no reason to believe they are any more or less likely to win than anyone else, and hence the expected value for each contestant at the start is $100,000 (a 1-in-10 chance at $1,000,000).  Why would you give that up for $20,000?
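To put some numbers behind that question, here is a rough sketch of the pure expected-value view (my own illustration, not anything from the show).  It ignores the billion-dollar long shot and assumes exactly one contestant leaves each round, so the field shrinks from 10 down to 2; under those assumptions the naive expected value of staying in always exceeds the bell prize on offer.

# A minimal sketch (my illustration): the naive expected value of staying in,
# ignoring the billion-dollar long shot, assuming exactly one contestant
# leaves (by bell or elimination) in each round.
GUARANTEED_PRIZE = 1_000_000
bell_prizes = [20_000, 30_000, 40_000, 50_000, 60_000,
               70_000, 80_000, 90_000, 100_000]      # rounds 1 through 9

for round_number, offer in enumerate(bell_prizes, start=1):
    remaining = 10 - (round_number - 1)              # contestants still on stage
    # With no information about the winning number, each remaining contestant
    # is equally likely to be the last one standing.
    ev_of_staying = GUARANTEED_PRIZE / remaining
    print(f"Round {round_number}: offer ${offer:,} "
          f"vs. EV of staying about ${ev_of_staying:,.0f}")

Even in round 9, with only 2 people left, the offer of $100,000 is far below the $500,000 expected value of staying in, so pure expected value says never ring the bell.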

Well . . . the answer is that the decision isn’t just abstract math / probability / expected value.  At least one of the commenters on Dan’s site suggested that running Dan’s activity together with an economics teacher might be a good idea, and I completely agree with that sentiment.

So, the first year the rounds played out like this (again, I forget the exact details, but this is close to what happened):

Round 1 – no one rings in and one person is eliminated.

Round 2 – no one rings in and one person is eliminated.

Round 3 – someone rings in and walks away with $40,000.

Round 4 – someone rings in and walks away with $50,000.

Round 5 – someone rings in and walks away with $60,000.

Round 6 – no one rings in and someone is eliminated.

Round 7 – someone rings in and walks away with $80,000.

Round 8 – someone rings in and walks away with $90,000.

Round 9 – no one rings in.  One person wins one million, and “2nd place” leaves with nothing.

Watching this live was one of the most amazing experiences I’ve ever had.  Rarely, if ever, do you see this kind of tension.

The decisions of the contestants came down to two ways of thinking about the problem they were facing.  One way was expected value, and if you thought about it that way you’d never ring in.  The other way was more along the lines of: I unscrewed a bottle cap, won a trip to Florida, and now someone is offering me $50,000.  Why wouldn’t I take it?  Why not indeed.  I find it hard to believe that the behavior of the contestants in this game was irrational, and I’ve debated the point with a lot of folks who understand probability far better than I do (for example, one of the companies involved in the promotion is run by a multiple-time world bridge champion).

However, I’m in the minority on the rational / irrational behavior point, and just about every article about the show talked about how silly the contestants were to take the smaller prizes.

So, what happened in year 2?  No one rang in to take the prize money in any round.  How very rational!

The “irrational” contestants who rang in during the first year walked away with $320,000 in “small” prizes, and the “rational” ones who never rang in during the second year walked away with nothing.
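For the record, that $320,000 is just the sum of the prizes taken in rounds 3, 4, 5, 7, and 8 of the first year, as listed above:

# Quick check of the first-year total, using the round prizes listed earlier.
year_one_bell_prizes = [40_000, 50_000, 60_000, 80_000, 90_000]  # rounds 3, 4, 5, 7, 8
print(f"Total taken in year one: ${sum(year_one_bell_prizes):,}")  # $320,000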

In part because of my experience with these promotions, I do not like the idea of approaching these problems purely on the basis of expected value and probability.  At minimum the ideas of economic utility come into play, but many other things do as well.
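To illustrate what I mean by utility, here is a minimal sketch using a concave (log) utility function and an assumed $10,000 baseline wealth.  Both choices are mine, purely for illustration, and not anything from the show: for a risk-averse contestant in round 4, the sure $50,000 can beat a roughly 1-in-7 shot at the guaranteed $1,000,000, even though the expected value of staying in is much higher.

# A minimal sketch (my illustration): with a concave utility function, a sure
# prize can beat a gamble with a higher expected value.  The log utility and
# the $10,000 baseline wealth are assumptions made purely for illustration.
import math

def log_utility(wealth):
    return math.log(wealth)

baseline_wealth = 10_000   # assumed starting wealth of a contestant
sure_prize = 50_000        # the round-4 bell prize from the list above
remaining = 7              # roughly the contestants left at that point
big_prize = 1_000_000      # the guaranteed prize for the last one standing

expected_value_of_staying = big_prize / remaining   # about $142,857, well above $50,000
utility_of_taking = log_utility(baseline_wealth + sure_prize)
utility_of_staying = (log_utility(baseline_wealth + big_prize) / remaining
                      + log_utility(baseline_wealth) * (remaining - 1) / remaining)

print(f"EV of staying:      ${expected_value_of_staying:,.0f}")
print(f"Utility of taking:  {utility_of_taking:.3f}")
print(f"Utility of staying: {utility_of_staying:.3f}")
# Expected value says stay; expected (log) utility says ring the bell.

None of this settles the rational / irrational debate, but it does show that a perfectly coherent way of valuing money can lead a contestant to ring in early.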