Flannel Jesus wrote: ↑Mon Jul 18, 2022 2:07 pm
Would it still be different if I added another box with 2x100? So now there's 3 boxes, and 2 of them have 2 100s? Is it still 50/50 because there's only 2 options, or do you start counting the extra 100s then?
What the brain counts is the 'bills', the pieces of paper, themselves. But REALLY there is ONLY ONE $100.
There is REALLY ONLY ONE '100', although there are MANY 'bills' or 'pieces of paper'.
Think of it as being the EXACT SAME 100, which is just written on MANY DIFFERENT 'pieces of paper'.
So, no matter how many boxes and 'pieces of paper' you keep adding on, there will ALWAYS only ever be ONLY ONE '100'.
I think it might be the dollar sign '$', in front of the '100', which might be helping to cause the confusion, or deception, here.
Flannel Jesus wrote: ↑Mon Jul 18, 2022 2:23 pm
My brain thinks of them exactly the same. I don't treat the balls differently from the money
Okay, fair enough.
But what are the chances that you will pick a red piece of paper or a blue piece of paper out of the box?
Is it still 66%?
If yes, then WHY?
But if no, then WHY NOT? And, what is the percentage now?
If you take my original question, and replace the bills with coloured paper, then all I would do is take my Bayesian formula, and replace bills with coloured paper, and my answer would be the same.
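The Bayesian calculation referred to here can be sketched out directly. The snippet below is a minimal check of the standard box-and-bill arithmetic, assuming each box is equally likely to be chosen and each note equally likely to be drawn; the variable names are mine, not anyone's actual working:

```python
# Two boxes: one holds two $100 notes, the other a $1 and a $100.
# You pick a box at random, draw a note at random, and it is a $100.
# Bayes' theorem gives the chance you are holding the 100+100 box.

p_box = 0.5                   # prior: each box equally likely
p_100_given_double = 1.0      # drawing a $100 from the 100+100 box
p_100_given_mixed = 0.5       # drawing a $100 from the 1+100 box

p_drew_100 = p_box * p_100_given_double + p_box * p_100_given_mixed
posterior = p_box * p_100_given_double / p_drew_100

print(posterior)  # 0.666..., i.e. the "66%" answer
```

Since the other note in the 100+100 box is also a $100, this posterior is exactly the probability that the remaining note is a $100, and swapping bills for coloured paper changes nothing in the arithmetic.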
Flannel Jesus wrote: ↑Mon Jul 18, 2022 2:07 pm
Would it still be different if I added another box with 2x100? So now there's 3 boxes, and 2 of them have 2 100s? Is it still 50/50 because there's only 2 options, or do you start counting the extra 100s then?
What the brain counts is the 'bills', the pieces of paper, themselves. But REALLY there is ONLY ONE $100.
There is REALLY ONLY ONE '100', although there are MANY 'bills' or 'pieces of paper'.
Think of it as being the EXACT SAME 100, which is just written on MANY DIFFERENT 'pieces of paper'.
So, no matter how many boxes and 'pieces of paper' you keep adding on, there will ALWAYS only ever be ONLY ONE '100'.
I think it might be the dollar sign '$', in front of the '100', which might be helping to cause the confusion, or deception, here.
I think I didn't pay this response enough attention.
So you think that, if I add one more box of 100+100, it's still a 50/50 that the other bill in the box is a 1 or a 100?
And if I add another box of 100+100, it's still 50/50?
And no matter how many boxes of 100+100 I add, as long as one of them is 1+100, it's 50/50?
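These variants can be checked exactly. As a hedged sketch (my own formula, assuming the box is chosen uniformly at random and each added box holds 100+100), the conditional probability that the other note is a $100 grows as more 100+100 boxes are added, rather than staying at 50/50:

```python
from fractions import Fraction

def p_other_is_100(num_double_boxes):
    """P(other note is $100 | drew a $100), with `num_double_boxes`
    boxes of 100+100 plus one box of 1+100, box chosen uniformly."""
    n = num_double_boxes + 1                      # total boxes
    p_drew_100 = Fraction(num_double_boxes, n) + Fraction(1, n) * Fraction(1, 2)
    p_double_and_100 = Fraction(num_double_boxes, n)
    return p_double_and_100 / p_drew_100

print(p_other_is_100(1))  # 2/3 (the original two-box question)
print(p_other_is_100(2))  # 4/5
print(p_other_is_100(3))  # 6/7
```

The pattern is 2k/(2k+1) for k boxes of 100+100, which approaches 1 as boxes are added; it never sits at 1/2.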
Flannel Jesus wrote: ↑Mon Jul 18, 2022 4:01 pm
I think I didn't pay this response enough attention.
So you think that, if I add one more box of 100+100, it's still a 50/50 that the other bill in the box is a 1 or a 100?
And if I add another box of 100+100, it's still 50/50?
And no matter how many boxes of 100+100 I add, as long as one of them is 1+100, it's 50/50?
Someone doesn't know about Russell's inductivist turkey...
If they have been feeding you for 364 days, are you going to make it past Thanksgiving?
It's a Philosophy forum, guy! We don't attack the theory - it's just an elaborate tautological contraption.
We attack the premises/assumptions/axioms of the theory.
Flannel Jesus wrote: ↑Mon Jul 18, 2022 2:23 pm
My brain thinks of them exactly the same. I don't treat the balls differently from the money
Okay, fair enough.
But what are the chances that you will pick a red piece of paper or a blue piece of paper out of the box?
Is it still 66%?
If yes, then WHY?
But if no, then WHY NOT? And, what is the percentage now?
If you take my original question, and replace the bills with coloured paper, then all I would do is take my Bayesian formula, and replace bills with coloured paper, and my answer would be the same.
If that is what you would do, then that is what you would do. BUT, you would STILL be just AS wrong.
No matter what formula or theorem you would like to use, there will ALWAYS REMAIN ONLY TWO things from which to choose. Therefore, the answer IS, and could only ever correctly be, 50%.
Flannel Jesus wrote: ↑Mon Jul 18, 2022 2:07 pm
Would it still be different if I added another box with 2x100? So now there's 3 boxes, and 2 of them have 2 100s? Is it still 50/50 because there's only 2 options, or do you start counting the extra 100s then?
What the brain counts is the 'bills', the pieces of paper, themselves. But REALLY there is ONLY ONE $100.
There is REALLY ONLY ONE '100', although there are MANY 'bills' or 'pieces of paper'.
Think of it as being the EXACT SAME 100, which is just written on MANY DIFFERENT 'pieces of paper'.
So, no matter how many boxes and 'pieces of paper' you keep adding on, there will ALWAYS only ever be ONLY ONE '100'.
I think it might be the dollar sign '$', in front of the '100', which might be helping to cause the confusion, or deception, here.
I think I didn't pay this response enough attention.
So you think that, if I add one more box of 100+100, it's still a 50/50 that the other bill in the box is a 1 or a 100?
And if I add another box of 100+100, it's still 50/50?
And no matter how many boxes of 100+100 I add, as long as one of them is 1+100, it's 50/50?
I did before, but I do NOT now. So, in the case of adding more boxes then it would change the odds.
But in the original question, from my perspective, it would still be 50/50.
I think I have worked out WHERE the CONFUSION lies now.
I was taking your original question, which is:
What is the probability that the other bill remaining in the box you selected is also a $100?
as being asked in the moment AFTER the other box had been removed. Whereas maybe your intention of 'what is the probability ...' concerned the moment BEFORE the two boxes were even picked?
If this is the case, then I APOLOGIZE for MISUNDERSTANDING and MISINTERPRETING your intention.
Step 1: I arrange the notes into boxes, and present you the boxes (you don't know which box is which, you just know that one of them is 100+100 and the other is 1+100)
Step 2: you choose a box (I then put the other box away)
Step 3: without peeking, you pick out a note from your selected box. You look at the note you've chosen, it's $100
So, you've chosen a box, you don't know which one, you've chosen a note, it's 100, the box still has one note left in it.
What's the probability that the note remaining in the box you chose is a $100?
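The three steps above can also be simulated rather than argued over. Here is a rough Monte Carlo sketch (my own code, not either poster's), which plays the game many times and keeps only the rounds where the first note drawn was a $100:

```python
import random

def trial():
    """One play: pick a box at random, then a note at random.
    Returns (note_drawn, note_remaining)."""
    boxes = [(100, 100), (1, 100)]
    box = list(random.choice(boxes))
    random.shuffle(box)
    return box[0], box[1]

def estimate(n=100_000):
    """Estimate P(remaining note is $100 | drawn note is $100)."""
    hits = total = 0
    for _ in range(n):
        drawn, remaining = trial()
        if drawn == 100:            # condition on having seen a $100
            total += 1
            hits += remaining == 100
    return hits / total

print(estimate())  # roughly 0.667 on a typical run
```

The conditioning step is the whole argument: rounds where the $1 comes out first are discarded, and two thirds of the surviving rounds turn out to come from the 100+100 box.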
Flannel Jesus wrote: ↑Mon Jul 18, 2022 11:03 pm
No I think you understood it correctly before.
Step 1: I arrange the notes into boxes, and present you the boxes (you don't know which box is which, you just know that one of them is 100+100 and the other is 1+100)
Step 2: you choose a box (I then put the other box away)
Step 3: without peeking, you pick out a note from your selected box. You look at the note you've chosen, it's $100
So, you've chosen a box, you don't know which one, you've chosen a note, it's 100, the box still has one note left in it.
What's the probability that the note remaining in the box you chose is a $100?
Do I get to play this game only once; or infinitely many times?
What do I win every time I get the answer right?
What do I lose every time I get the answer wrong?
Without this additional information I cannot determine whether to use the ensemble probability (66%) or time probability (50%)!
I haven't really phrased it like a game that you can play and win, but I certainly can:
There's a table at a casino. You can play the game and lose or win money as often as you want. Here's the rules:
The dealer shows you 4 balls, 3 blue and 1 red. The dealer boxes the 4 balls, 2 balls into 2 boxes, and presents you the boxes - like usual, you don't know which box is which.
You choose a box, he takes the other box back and puts it under the table. The box has two compartments, a left and a right one, the dealer asks you to choose one, verbally. You do so, the dealer pulls the ball out for you and shows you.
If at this point, the dealer sees the red ball, the game is over and no bet can be made. He will reshuffle the balls and go to the next round.
If he sees a blue ball, on the other hand, you're given the option to place a bet, and one bet only: 6-4 odds on the other ball in your chosen box being RED. In other words, for every $4 you bet, if you win, you get your $4 back + $6.
The above rules are for TABLE#1
---
There's another table that's nearly identical to this one, exactly the same rules, but the last step is different. At this table, the betting rule is this:
If he sees a blue ball, you're given the option to place a bet, and one bet only: 4-6 odds on the other ball in your chosen box being BLUE. In other words, for every $6 you bet, if you win, you get your $6 back + $4.
That's TABLE#2
---
The casino has a strict rule that once you choose a table, you may not ever place a bet at the other table. Never in your life. They enforce this with extreme precision and efficiency.
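To make the two tables concrete, here is a rough simulation (my own sketch, not part of the casino rules as stated; I model the dealer's boxing as a uniform shuffle of the four balls, with the first two positions as the chosen box's compartments, which is equivalent under the rules above):

```python
import random

def play(table, rng):
    """One round at the given table. Returns the net result of the
    single allowed bet, or None if the red ball appeared first
    (no bet possible). Stake sizes fixed at the quoted odds."""
    balls = ['B', 'B', 'B', 'R']
    rng.shuffle(balls)
    drawn, other = balls[0], balls[1]   # your box's two compartments
    if drawn == 'R':
        return None                     # game over, balls reshuffled
    if table == 1:                      # $4 at 6-4 that `other` is RED
        return 6.0 if other == 'R' else -4.0
    else:                               # $6 at 4-6 that `other` is BLUE
        return 4.0 if other == 'B' else -6.0

rng = random.Random(42)
for table in (1, 2):
    bets = [r for r in (play(table, rng) for _ in range(200_000)) if r is not None]
    print(f"TABLE#{table}: average profit per bet = {sum(bets) / len(bets):.3f}")
```

If the 66% answer is right, TABLE#1 loses about $0.67 per bet on average and TABLE#2 wins about $0.67 per bet; if the 50% answer were right, TABLE#1 would be the profitable one. Running the simulation settles which it is.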
Flannel Jesus wrote: ↑Tue Jul 19, 2022 12:32 am
There's a table at a casino. You can play the game and lose or win money as often as you want.
Ludic fallacy.
How far into a negative bank balance can I go before it's game over? If you don't answer that question - you cannot possibly estimate the longest losing streak you can survive if the house gets lucky.
And I am sure you know that in theory infinitely-long losing streaks are possible, right?
Last edited by Skepdick on Tue Jul 19, 2022 12:45 am, edited 4 times in total.
Flannel Jesus wrote: ↑Mon Jul 18, 2022 11:03 pm
No I think you understood it correctly before.
Step 1: I arrange the notes into boxes, and present you the boxes (you don't know which box is which, you just know that one of them is 100+100 and the other is 1+100)
Step 2: you choose a box (I then put the other box away)
Step 3: without peeking, you pick out a note from your selected box. You look at the note you've chosen, it's $100
So, you've chosen a box, you don't know which one, you've chosen a note, it's 100, the box still has one note left in it.
What's the probability that the note remaining in the box you chose is a $100?
Now, because you want to KEEP the word 'chosen' in here, the answer STILL REMAINS 50%.
If, however, the question is asked BEFORE I 'choose', then the answer WILL BE DIFFERENT.
Flannel Jesus wrote: ↑Tue Jul 19, 2022 12:32 am
There's a table at a casino. You can play the game and lose or win money as often as you want.
Ludic fallacy.
How far into a negative bank balance can I go before it's game over? If you don't answer that question - you cannot possibly estimate the longest losing streak you can survive if the house gets lucky.
And I am sure you know that in theory infinitely-long losing streaks are possible, right?
They sure are possible. You willing to miss out on infinite money because of a concern like that?
You can make a bet of any size. Just imagine you have the same amount of money that you have now.
The amount of bad luck we're talking about here is astronomical, tbh. I'd wager you're far more likely to die in a car crash on your next car ride than you are to lose money making the smart bet over and over again here. But you seem so set against the bet, because of some aversion to luck, that you'd probably rather keep driving to work every day than take the smart bet. (I don't know if you drive to work; I'm just using it as an illustration. Replace "driving" with some other mundane but ultimately risky thing you do frequently.)
So if you're more worried about an infinite losing streak than about dying in a car crash, despite the latter being provably more likely, I think the fallacy lies with you. You are loss averse and risk averse, but you are blind to certain types of risk and overly concerned with other types of risk.
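For what it's worth, the losing-streak worry can be quantified. Assuming the 66% analysis, each bet at the favourable table (TABLE#2 above) loses with probability 1/3, so an unbroken streak of n losses has probability (1/3)^n; a rough sketch:

```python
# Probability of losing n bets in a row when each independent
# bet loses with probability 1/3 (the other ball is red one
# time in three, given a blue ball was drawn first).

p_lose = 1 / 3
for n in (5, 10, 20, 50):
    print(f"{n} straight losses: {p_lose ** n:.3e}")
```

Even a 20-loss streak is already around one chance in three and a half billion. Whether that answers the bankroll question is a separate bet-sizing problem (the kind Kelly-style staking addresses), but infinite streaks have vanishing probability.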