rbt wrote:
> The TV show on NBC in the USA running this week during primetime (Deal
> or No Deal). I figure there are roughly 10, maybe 15 contestants. They
> pick a briefcase that has between 1 penny and 1 million bucks and then
> play this silly game where NBC tries to buy the briefcase from them
> while amounts of money are taken away from the list of possibilities.
> The contestant's hope is that they've picked a briefcase with a lot of
> money and that when an amount is removed from the list it is a small
> amount of money, not a large amount (I categorize a large amount to be
> more than 100,000).
Well, if the contestants' choices are truly random, and they stick with their first choice all the way to the end, each contestant wins, on average, $131,477.54 (sum(amounts)/len(amounts)). Assuming the buyout offer is always less than or equal to the average of the still-available amounts, NBC will (on average) never have to pay out more than ~$132k per contestant. Most likely they'll pay out less, because most people get nervous before the very end and take the lowball offer NBC is fronting.

What I would really like to know is how they calculate the offer. Obviously they cap it at the average of the still-standing amounts, but I wonder if and how they take subsequent rounds into consideration. Is there a "Monty Hall" (http://en.wikipedia.org/wiki/Monty_Hall_problem) type consideration that needs to be taken into account as cases are eliminated?
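Here's a minimal sketch of that expected-value argument. The amounts below are the standard 26-case US board and the round structure is assumed, not taken from the post, and cases are opened uniformly at random:

import random

# Standard 26-case US board (an assumption; the post doesn't list the
# amounts, but these give the quoted $131,477.54 average).
AMOUNTS = [
    0.01, 1, 5, 10, 25, 50, 75, 100, 200, 300, 400, 500, 750, 1000,
    5000, 10000, 25000, 50000, 75000, 100000, 200000, 300000,
    400000, 500000, 750000, 1000000,
]

def expected_value(amounts):
    """Average payout when a case is picked uniformly at random."""
    return sum(amounts) / len(amounts)

print(expected_value(AMOUNTS))        # 131477.538... ~= $131,477.54

def offer_ceilings(amounts, opens_per_round=(6, 5, 4, 3, 2, 1, 1, 1, 1)):
    """Open cases uniformly at random (assumed US round structure) and
    return the average of everything still unopened after each round,
    i.e. the cap on the banker's offer under the assumption above."""
    cases = list(amounts)
    random.shuffle(cases)
    held = cases.pop()                # the contestant's own case
    ceilings = []
    for n in opens_per_round:
        del cases[:n]                 # eliminate n cases this round
        in_play = cases + [held]      # unopened amounts, incl. the held case
        ceilings.append(sum(in_play) / len(in_play))
    return held, ceilings

held, ceilings = offer_ceilings(AMOUNTS)
print(held, [round(c, 2) for c in ceilings])

Running offer_ceilings a few times shows how much the cap swings from round to round depending on which large amounts happen to be knocked out early.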