You Bet Your Life!!

The very nature of a bet is that you either get something, or else you get nothing and lose something. This makes betting one’s life a very dicey (pardon the phrase) situation. Even if there is something you would give your life to achieve, that’s not the same. If you bet your life and lose, not only do you die, but whatever you wanted doesn’t come to pass either.
We’ve all been asked if there’s anything we would give our lives for. Yeah yeah yeah. But is there anything, any potential prize or reward (be it world peace, a billion dollars, the recovery of a sick child, super-powers, whatever) for which you would bet your life at even odds? 50/50 chance you get it, 50/50 chance you get nothing and die.
Who’s game?

This is funny for me…

Anyways the answer is yes, but I feel like I’ve already made the bet. Not so much for my physical life, but for my mental life. I wouldn’t say I received super powers… I wouldn’t say I even had a ‘reward’ in mind but I certainly received something…

I’ll have to pass on that. I get along just fine without gambling.

Here is something related to gambling that is quite interesting, and something I don’t fully understand. Check this out.

If I have three cups on a table and under one of them is a coin, which one you don’t know, you pick a cup, and then one of the cups that doesn’t have the coin under it is taken away, leaving you one last chance to pick the right one (you start with two chances, obviously). If on that second chance you change your mind and choose the other cup instead of the one you originally picked, you increase the odds of picking the right one.

How does that work? It doesn’t make sense. How could the odds predict beforehand which cup you were going to choose, before you have the opportunity to change your mind and choose the other one?

I don’t know how this works, although I saw it on a show on the Discovery Channel a while ago… a show about gambling and whatnot. Discovery is a credible source for these kinds of subjects, I believe. I don’t see why it would be a fiction.

Russian roulette eh?

Detrop:
I’m not 100% clear about the experiment that you’re asking about but I think it goes like this:

  1. There are three cups on a table; one contains a coin. You are asked to pick the cup you believe contains the coin.

  2. One of the cups that does not contain a coin is removed.

  3. You are asked if you would like to change the cup you picked or stay with the original one.

Is this the game you are describing?

The reason that you are better off picking the other cup at point 3 is that the odds of picking the right cup change over the course of the game.

Consider this: at point 1 you have a 1 in 3 chance of picking the correct cup. At point 3, the other cup now has a 1 in 2 chance of being the one that contains the coin. As such, you are better off committing to the 1 in 2 chance than the 1 in 3 chance.

My response when I first learned this was that since the probabilities must add up to 1, both cups must have a probability of 0.5. That is the case, but you aren’t engaging in the same kind of choice as if you were picking between two cups. You are deciding to pick at 1 in 2 or stay with the pick you made at 1 in 3.

Does this make sense? I had such a huge argument with a friend of mine when she told me this, but I eventually saw it. Probability is such a head fuck sometimes.

Clear as mud perhaps?

cheers,
gemty

Your odds change over the course of the game. The 1 in 3 chance is only your chance of picking the right one right off the bat, and it has no effect on the outcome of the game because your chances got boiled down to a 1:2 ratio. Plus, think of it this way: switching which one you choose would also be 1:3, BECAUSE the one you switched to was one out of those 3.

pascal was game

-Imp

thesun1:

You’re incorrect. The probabilities don’t boil down to 1 in 2 on the second round. They don’t boil down because it is a different type of probability event than event one.

After round one there is a 1 in 3 chance that you picked the right cup; however, when you come to round two it is still only a 1 in 3 chance that that cup is the right one, while there is a 1 in 2 chance that the other cup is the right one.

The removal of a cup that was not the one containing the coin does not alter the probability value of the cup you have chosen because it cannot retroactively change the probability of choosing the right one.

Imagine a slight alteration of the game: instead of getting to choose whether to switch or not, they just take away one cup and then make you wait a few minutes before they show you whether you got it right or wrong.

When they come back to you what is the probability that you have chosen the right cup? It is still one in three. The fact that one of them was removed doesn’t alter the fact that you only had a one in three chance of guessing right in the first place.

This game is the same as that: when you come to choose whether to pick the other one or stay with the one you have, it is still only a one in three chance that you picked the right one in the first round.

I knew this was going to cause a big fight…

cheers,
gemty

Gemty,

Let me explain this to you because you seem to believe what that irritating little bastard in ‘the curious incident of the dog in the night-time’ believed, which is wrong.

The odds do change, but they change for BOTH cups. In the first round there are 3 cups, 1 contains a coin. Regardless of which you pick, you’ve a 1 in 3 chance of getting the right cup.

In the second round, there are 2 cups and one contains a coin. Regardless of which you pick (i.e. whether you stick or switch), you’ve still got a 1 in 2 chance of getting the one with the coin.

Put another way: in the second round there are 2 possibilities and ONLY 2 possibilities (either picking the cup with the coin or the cup without the coin). If the chance of one of these possibilities is 1/2, then, so that the chances of all the possibilities add up to 1 (the first rule of probability, the way I was taught), the chance of the other possibility HAS TO BE 1/2.

"The removal of a cup that was not the one containing the coin does not alter the probability value of the cup you have chosen because it cannot retroactively change the probability of choosing the right one."

Right, so according to you the chance that the cup I have (in the second round) contains the coin is 1/3, and the chance that the coin is in the other cup is 1/2. Where’s the remaining 1/6 chance? There are no other possibilities; either it’s in one cup or the other (or the whole thing is a lie, in which case probability is null and void), but according to you the total chance of it being in one cup or the other is only 5/6.

Your claim that it’s a different kind of probability event is incorrect; the laws of probability still require that, since the coin cannot be in the cup taken away, the combined chances of the remaining two possibilities must add up to 1.

Like I say, the autistic little cretin-fiend in ‘the curious incident of the dog in the night-time’ (which I hated - how the fuck did it win the Whitbread Award?) makes the same mistake as you, as do many other people. You can be forgiven for this but you have to realise that the laws of probability don’t change just because you got confused.

"This game is the same as that, when you come to choose to pick the other one or stay with the one you have its only a one in three chance that you picked the right one in the first round."

Yes, in the first round. Not in the second round. When one is in the second round the circumstances have changed, and so there are no 1/3 chances of anything…

Preesh, Gems. That’s the one. But I still don’t get it.

That is the point. The time context is no different when you change your pick than it was when you left the first round. If you consider the process in terms of rounds, then each set of alternatives has its own distinct odds, but the odds don’t change simply because “rounds are being progressed.” Both rounds were determined, but in the experience of the second round the odds appear to “reset,” while in reality they haven’t changed a bit.

The one in three chance exists for the second round as well; one of the cups is removed only because the round has progressed. This creates the illusion of the odds resetting.

Detrop,

If that were true then where’s the other 1/6 chance?

All the chances have to add up to one. You and gemty are confusing ‘the probability that you picked the right cup in the first round’ with ‘the probability that you HAVE the right cup in the second round’.

Hello F(r)iends,

I don’t get this probabilities thing…

Round 1: 1/3 chances.
Round 2: 1/2 chances.

Collectively: 2/3 chances.

I think the odds of winning are strongly in my favor.
Is there anything wrong with my math?

-Thirst

We gamble and take bets all day long, we just don’t always notice it. At work you slack off, betting that the boss won’t catch you, or you turn in a sloppy report and hope no one notices, which is a bet of sorts; or at home you bet that the wife is too busy to notice you haven’t taken out the garbage, so you can watch the rest of the game before she notices. We gamble all the time, sometimes we just don’t realize it.

Kropotkin

You know what, I see what he is saying. I was typing up a long post to go against it, but now I see it!

In the first round you can only hit one of two things: a coin or not a coin. There is a 2/3 chance that you will carry a cup WITHOUT a coin into the next round. Even though your chances appear to increase to 1/2 at the final round, YOU ARE STILL PLAYING with that 2/3 chance that you picked a dud in the first round! This is a progressive game; it’s just like picking one of 3 and being told they’ll drop an empty one at the end: your chances are still 1/3.

So, odds are that in the first round you will pick one that has nothing in it, so ODDS ARE you will carry over a cup with NOTHING into the next round.
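This reasoning is easy to check by brute force. Below is a minimal sketch in Python (the function name and trial count are my own choices, not anything from the show): it plays the three-cup game many times with a host who knowingly removes an empty cup, once always sticking and once always switching.

```python
import random

def play(switch, trials=100_000):
    """Simulate the three-cup game: you pick a cup, the host removes
    an empty cup he knows is empty, then you stick or switch."""
    wins = 0
    for _ in range(trials):
        coin = random.randrange(3)   # cup hiding the coin
        pick = random.randrange(3)   # your first pick
        # The host removes a cup that is neither your pick nor the coin.
        removed = random.choice(
            [c for c in range(3) if c != pick and c != coin])
        if switch:
            # Exactly one cup is neither your pick nor the removed one.
            pick = next(c for c in range(3) if c != pick and c != removed)
        wins += pick == coin
    return wins / trials

print(play(switch=False))  # hovers near 1/3
print(play(switch=True))   # hovers near 2/3
```

Sticking really does win about a third of the time and switching about two-thirds, matching the "you probably carried a dud into round two" intuition.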

I don’t get it… the first choice has nothing to do with your probability. It’s just killing time or waiting for the demonstrator to remove a cup for you. The first cup is unimportant. What’s the big deal?

SIATD,
there aren’t just two possibilities at the beginning of the 2nd round. In fact, there are 4:

  1. You keep the cup you picked and it does contain the coin
  2. You keep the cup you picked and it does not contain the coin
  3. You switch and it contains the coin
  4. You switch and it does not contain the coin

If you do the math you’ll see that the probability of each of the four is as follows:

  1. 0.166666666
  2. 0.166666666
  3. 0.333333333
  4. 0.333333333

Add all those up and you get 1, which, as you said, the probabilities have to add up to. The reason this is a different kind of probability event in round two is that it’s a conditional probability: the probability of event two depends upon the outcome of event one.

cheers,
gemty

Hi,

Have a look at en.wikipedia.org/wiki/Monty_Hall_problem for a famous version of this and all its controversies.

It’s my understanding that the reason the probabilities are not 1/2 is that the person who took away the cup without a coin knew that that cup didn’t have a coin, i.e. they know where the coin is.

Consider, for example, this being done with 1000 cups. You choose one cup, and then someone who knows where the coin is takes away 998 cups, leaving you with just the cup you chose and one other cup. Would you then say that the odds of the coin being under each cup are equally 1/2?

The trick relies on the knowledge of the person taking the cups away. If it was done blind then the probabilities would be different.

Ben
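Ben’s two claims can both be checked with a quick sketch (function name, trial count, and the convention that a blind host revealing the coin voids the round are all my assumptions, not Ben’s): with 1000 cups and a knowing host, always switching wins almost every time, while with a blind host the surviving two cups really are even.

```python
import random

def switch_win_rate(n_cups, host_knows, trials=20_000):
    """You pick a cup, the host takes away n_cups - 2 of the others,
    and you always switch to the single cup he left behind."""
    wins = valid = 0
    for _ in range(trials):
        coin = random.randrange(n_cups)
        pick = random.randrange(n_cups)
        others = [c for c in range(n_cups) if c != pick]
        if host_knows:
            # A knowing host never throws away the coin.
            keep = coin if coin != pick else random.choice(others)
        else:
            # A blind host keeps a random other cup; if the coin ended
            # up among the cups he removed, the round is void.
            keep = random.choice(others)
            if coin != pick and coin != keep:
                continue
        valid += 1
        wins += keep == coin   # you switched, so you now hold `keep`
    return wins / valid

print(switch_win_rate(1000, host_knows=True))   # ~0.999 with 1000 cups
print(switch_win_rate(3, host_knows=False))     # ~0.5 when done blind
```

The host’s knowledge is doing all the work: his forced choice funnels the coin into the cup he leaves behind, which is exactly why the blind version collapses back to 50/50.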

Ben is right, so is wikipedia.

Forget my math from before… just look at wikipedia.

cheers,
gemty
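For the record, the corrected enumeration (the same one the Wikipedia article walks through) fits in a few lines; fixing the pick at cup 0 is just a labelling convenience, since the argument is symmetric in which cup you pick.

```python
from fractions import Fraction

# Fix your pick as cup 0; the coin is equally likely under cups 0, 1, 2.
# The host then removes an empty cup you didn't pick.
stay_wins = switch_wins = Fraction(0)
for coin in range(3):
    p = Fraction(1, 3)
    if coin == 0:
        stay_wins += p     # you had it all along; switching would lose
    else:
        switch_wins += p   # the host must leave the coin cup standing

print(stay_wins, switch_wins)  # 1/3 2/3
```

So sticking wins with probability 1/3 and switching with probability 2/3, not the 1/6-and-1/3 split from the earlier post.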

everything has a 50% chance of happening
either the thing happens
or it doesn’t

sun coming up tomorrow 50%
all of us dying in a car crash 50%
nothing happening 50%

everything either happens or it doesn’t

I hope you’re joking glider, if you are just skip my post. If not…

The sum of P(a) and P(not-a) must be one, but it is not always 0.5 + 0.5.

For example, the probability of pulling an ace from a well-shuffled deck of cards is 4/52; the probability of pulling a not-ace (any other card) is 48/52.

You can clearly see how P(ace) and P(not-ace) add up to one, but P(ace) is not 0.5.

cheers,
gemty
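The card arithmetic checks out in two lines (a throwaway sketch using exact fractions; the variable names are mine):

```python
from fractions import Fraction

p_ace = Fraction(4, 52)       # 4 aces in a 52-card deck
p_not_ace = Fraction(48, 52)  # the 48 other cards

print(p_ace + p_not_ace)  # 1: the two must sum to one...
print(p_ace)              # 1/13: ...but P(ace) is nowhere near 0.5
```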