Math question

Status
Not open for further replies.
Imagine we do the following experiment: we flip a coin a certain amount of times.

Previous knowledge:

1-We all agree that the chances of getting heads or tails is 0,5 for both. However, if you just flip a coin 3 times, you may get a frecuency of 0,33-0,66 for each.

2-It is general knowledge that if you flip a coin many many times, say 1000000 times, the frecuency for heads and tails will get very very close to 0,5 (theorical chance). The more times you do the experiment, the closer you will get to the theorical results.

If you flip a coin 1000 times, let's say the difference between the amount of heads and tails is 8 (I chose this number randomly), 496 heads-504 tails. If I do it again I get a difference of 13 (again, random number). What I want to point out is that fliping the coin 1000 times may result in a difference around 10 (I insist that I've made up this number, so instead of 10, say it's 5 or 20 or whatever the actual number is).

Now the question is: If I flip the coin 1000000 times, the difference between the amount of heads and tails will still be 10, or will it be bigger (say 100) since I've done the experiment more times?

Note that if we agree that the difference raises, it can't raise in a lineal way, i.e. If 1000 times means 10 deviation, then 2000 times means 20 deviation, because this would violate fact 2. So that would suggest that it grows in a logarithmical way (it grows but in a slower way than the amount of times the experiment is done) and tends to estabilise and in the long run ends up behaving like it doesn't raise (it raises very very little).

However I believe that the difference is constant since differences tend to compenste (5 extra heads will get compensated with 5 extra tails sooner or later). What is you opinion? Please include logical reasoning to support your answer.
 
If 1 000 flips result in a 10-point (1%) difference, 10 000 flips would likely result in a 95-point difference (.95%)

All numbers approximated.
 
I did this before in my statistics class.

The Central Limit Theorem pretty much says that since they both have an equal chance of happening they will become more normal with each flip. In other words the more you flip it the closer it gets to .50/.50
 
The order of the growth rate of the amount it varies from 50% should be less than the order of the number of flips.
 
Yes, in theory it would move as close to 50/50 as possible, but in reality it never does that. In reality, one is flipped a good deal more than the other.
 
I think that isn't what he intends to address with his question. Rather than actually being specifically about coins and such, it's a question of statistics and stuff.
 
Right, and it is a great question. I am just playing off of what DM said with that it never actually happens in reality.
 
If you think about it this way, if you flip a coin once you have a 100% chance of having a .5 difference. But if you flip it 100 times then the only way you could have less than .5 difference is to be exactly 50/50. which would sorta mean it would have to be exactly at 50/50 at 110 as well, which requires that in 10 flips you would have to have exactly 50/50 as well. This is obviously extremely unlikely.

So yeah, the number difference will increase, but the % difference will decrease..

Have a nice day.
 
naa, each case is independent of the other cases. if i got 1000 heads and 2 tails, the probability of me getting tails on my next flip is still 50%.
 
That's not really what the question is asking. You're talking about individual coinflips, while this is talking about sequences of coin flips.

I basically tried to solve this math-ily, which didn't really work out too well, but I tried to write functions that would tell the average % heads/tails per coin flips, based on data which I made up, but yeah that didn't really work out too well.

I think the actual number difference between heads and tails gets smaller as you do more coin flips. but only over an incredible huge amount of coin flips, like infinity I guess, because while I understand the whole %difference gets smaller, actual number gets bigger thing, it seems to me like if that can keep going forever, then eventually you reach a point which is so close to 50% that it essentially is 50% (so I guess .5 is the asymptote of this graph), meaning at some point you end up with what is for all practical purposes a 0 difference between heads and tails, meaning unless the initial %difference is 0, which it can't be in real life, after enough coin flips I think #heads will always equal #tails.

But that seems kind of wrong too, because that seems to be assuming that the %difference between heads and tails is dropping at a constant rate, and I don't know enough about probability to know if that makes any sense at all. OK so basically to conclude, I don't know, but based on my understanding of it it seems like the actual coin difference between heads and tails is not constant

but then again, I'm in Algebra 2 so what do I know lol :p
 
I'm a bit tired so my explanation will be short, but this is how I see it.

The average variance will grow in terms of the raw number, but will decrease in terms of the % of the overall flip.
 
If the coin is fair then we would expect a random variable Q ~ χ[k]^2 to be distributed such that

y = χ^2 = [(h - t/2)^2 + (t/2 - h)^2] / t
= (h^2 - th + t^2/4 + t^2/4 - th + h^2) / t
= [2h^2 + (t^2)/2 - 2th] / t
= (2/t)h^2 + t/2 - 2h

where h is the number of heads observed and t is the total number of flips. Of course, the coin flip random variable is actually a binomial distribution, so this a bit inaccurate.

The chance that h heads out of t flips is a fair result is given by P(Q > y) = 1 - F(y) where F is the cumulative distribution function for Q. It turns out that

P(Q > y) = 1 - G(1, y/2) where G is the regularised gamma function
= exp(-y/2)
= exp(-(h^2)/t - t/4 + h)

The simplification in the second step applies because a = 1.

It turns out that 1000000 is too big for my graphing program, so I went with flipping 100 times for this example, which gives us
P(Q > y) = exp(-(1/100)h^2 - 25 + h)

Now we can plot this function. The graph also includes t = 120 and t = 10.

Legend
Green -- flipped 10 times
Red -- flipped 100 times
Blue -- flipped 120 times

coingraphid1.png
 
First of all, thanks to everyone that contributed by posting here.

Now, I've seen some people passing over a bit of facts:
We all now that the more you flip the coin the closest you get to a distribution of 0,5-0,5. We are not discussing this at all.

However, the question is:

1000 flips: 510 heads, 490 tails this means 0,51 frecuency for heads and 0,49 frecuency for tails. Notice how there are 20 more heads than tails.

1000000 flips: Which do you think is the actuall case?

1- 500100 heads, 499900 tails
2- 500010 heads, 499990 tails

In case 1 we got a frecuency of 0,5001 for heads and 0,4999 for tails with a difference of 200 extra heads, while in case two we get a frecuency of 0,50001 for heads and 0,49999 for tails with a difference of 20 extra heads.

Notice how in both cases, the frecency got closer to 0,5-0,5, so we are NOT ASKING if the more you flip the coin the closer to get to 0,5-0,5.

Anyway, in case 1, the difference between heads and tails came from 20 to 200 when compared to 1000 flips, while in case 2 the difference sticked to 20, so the amount of flips didn't affect the difference when compared to 1000 flips.

In my opinion I think that case 2 is what actually happens, however most people tend to say that case 1 is the answer. But the way I see it, even though you flip the coin one billion of billion times, the difference will always be a number around 5. Why do you think like that? You may ask. Well, I take it like this:

Let's flip the coin a billion times,

Flip 1- tails, ok 1 extra tail.
Flip 2- heads, we get back to 0 difference
Flip 3- heads, 1 extra heads
Flip 4- heads, 2 extra heads
Flip 5- heads, 3 extra heads
Flip 6- tails, 2 extra heads
Flip 7- heads, 3 extra heads
Flip 8- tails, 2 extra heads
Flip 9- tails, 1 extra heads

Try as you may, you can keep fliping the coin to the infinite. You will NEVER get a difference of 100 extra heads/tails.
 
Well, honestly, that is the problem with math. Infinity tends to throw off your results no matter what happens. Certain things like the divergence of 1/x and the convergence of 1/x^2 exist but they really don't make all that much sense. If you did flip to infinity, the amount that you flip the coin should theoretically be equal in amounts of heads and amounts of tails because there is a 50% rate for each to occur. However, the differences in the amount of heads you obtain and the amount of tails you obtain will continue to grow despite the fact that the precentage of the flips aproaches 50 just like an asymptote on a graph so to speak.

BTW, especially when considering probability and infinity, there is never a never when because there is always a chance in that the flipping of the coin never stops. So yes, there has to be a point at which the difference exceeds 100 extra heads/tails because you are going to infinity.

Honestly, infinity really tends to make math awkward at best. Just look at the fact that 1^infinity is an indeterminate form while 1^anything is supposedly 1.

I hope that helps.
 
Since the actual chance that you will be flipping a heads is the same as you flipping a tails it becomes very 'difficult' to rack up a big difference. You're right Spaniard. The probability that the difference will ever reach a large value is incredibly tiny, because it requires that a significant percent of the flips to go a certain direction. The fallacy in my earlier post is that I assumed that the difference can't decrease, assuming that with each flip it could either increase or remain constant. But the two choices with each flip is a decrease or an increase. This essentially means that the function of the probability of the difference being a certain amount in the limit is probably going at the order of x^-2 or some other power of x greater than 1.

And as a bit of a sidenote to travisurfer, while infinity may be counterintuitve at times imagine a system of mathematics without it. It's faaaar more awkward when things don't have infinite in there somewhere.
 
1000000 flips: Which do you think is the actuall case?

[...]

Try as you may, you can keep fliping the coin to the infinite. You will NEVER get a difference of 100 extra heads/tails.

I wrote my post above so that you could read it, not ignore it.

Anyway, using the result I derived above:

exp(-(1/1000000) * 500100 ^ 2 - 1000000 / 4 + 500100)
≈ 99%

You could indeed obtain 500100 heads out of 1000000 flips with a fair coin.
 
Well, I'm gonna say that you know a lot more about the subject than I do, given that you're the one who works with statistics with Shoddy. But I think my point was that it's possible to get any difference, just as the intended difference increases the probability that you'll reach it decreases.
 
I wasn't responding to anything you said (hence why I quoted spaniard), but you're right. Except that beyond a certain point (traditionally, if my result evaluated to < 95%) it is likely that the coin is just unfair. My result shows that the difference of 100 out of 1000000 flips is well within what you would expect for a fair coin.
 
Well, honestly, that is the problem with math. Infinity tends to throw off your results no matter what happens. Certain things like the divergence of 1/x and the convergence of 1/x^2 exist but they really don't make all that much sense. If you did flip to infinity, the amount that you flip the coin should theoretically be equal in amounts of heads and amounts of tails because there is a 50% rate for each to occur. However, the differences in the amount of heads you obtain and the amount of tails you obtain will continue to grow despite the fact that the precentage of the flips aproaches 50 just like an asymptote on a graph so to speak.

BTW, especially when considering probability and infinity, there is never a never when because there is always a chance in that the flipping of the coin never stops. So yes, there has to be a point at which the difference exceeds 100 extra heads/tails because you are going to infinity.

Honestly, infinity really tends to make math awkward at best. Just look at the fact that 1^infinity is an indeterminate form while 1^anything is supposedly 1.

I hope that helps.

When you take maths to the infinity, things may be awkward, but it is still reality. Infinity is an idea, and not a number. That is knowledge of extremely vital importance to understand how to work with infinity.

I respect that you think that the answer is case 1 (why would I be submiting this thread if I wasn't interested in hearing people's opinions?). However, saying the difference will grow because you take it to the infinity and infinity is weird, is not a valid reasoning.

About the calculus stuff you coment, I understand why all that happens, and it all has to do with reality and it has a meaning, since maths are not magic. However I don't want to go deep in that because this would make the thread change, and because I need you to be here with me and a blackboard to discuss it. Just a quick hint, about convergence - divergence, try it out with a calculator: 1/x doesn't aproach to a concrete number, while 1/x^2 does. About 1^infinity or 1^whatever number, we are not making an operation like 2 plus 3, we are looking at a function (I'm not sure if that's the proper scientifical word) that takes values that aproach to a certain limit, and the result depends on the function itself (for instance, how fast one grows compared to the other) and therefore will be different for each case. That's why we say 1^infinity= indeterminate, because we don't know if the function aproaches 1 in 0,99 way or 1,01 way.

You can't flip a coin infinite times, so you will never get a difference of 100 or 1000 or whatever, and even if we discuss it theorically, you can get a difference of 100 extra heads but as you proceed to infinite, you will get 100 extra tails sooner or later. So the difference doesn't have any TENDENCY to increase, it'll grow sometimes while it will decrease other times.

I've actually tried to flip the coin myself:

15 flips- 6 extra tails
90 flips- 3 extra tails
 
Infinity is an idea, and not a number.

Infinity and minus infinity are numbers in the extended real number system, which is often used in the field of asymptotics.

1/x doesn't aproach to a concrete number

It doesn't?

As x -> infinity, 1/x -> 0.

That's why we say 1^infinity= indeterminate

We don't say this.

why would I be submiting this thread if I wasn't interested in hearing people's opinions?

Who knows? But I've posted two solid replies already that address your question quite thoroughly and you seem interested in ignoring them.
 
I wrote my post above so that you could read it, not ignore it.

Anyway, using the result I derived above:

exp(-(1/1000000) * 500100 ^ 2 - 1000000 / 4 + 500100)
≈ 99%

You could indeed obtain 500100 heads out of 1000000 flips with a fair coin.

I didn't ignore your post, on the contrary I agree with you 100%. The thing is that it wasn't too acurate from my part to say that it is imposible to get 100 extra heads.

What I wanted to say is that that is so ulikely that it will "never" happen. There are many things that doesn't happen in the nature simply because they are extremely unlike. For instance, is it possible that there's anyone in the world that has exactly the same genetic code as I do? I can bet my life that the answer is no (let's leave twins apart). This kind of results will only happen when you work in theory, where chances get pushed to the infinity, to every single possible combination. Another example, can you score 1000 sheer cold in a row when playing pokemon? obviously not.

However, despite what's possible in theory and what not, I'm gonna ask it again: Does the difference between the amount of heads and tails have a TENDENCY to grow with the more flips you do?

If you ask me the answer is no.
 
dunno if it has been said, but the heads side of a coin weighs a tiny, tiny, tiny, bit more.
therefor it will land on tails more, maybe once more out of 1000~
im just saying this for quarters, it might be different for other coins.
 
Infinity and minus infinity are numbers in the extended real number system, which is often used in the field of asymptotics.

It doesn't?

As x -> infinity, 1/x -> 0.

Who knows? But I've posted two solid replies already that address your question quite thoroughly and you seem interested in ignoring them.

Ok, let's not get mad because of a stupid question, I didn't ignore you. I didn't mean to offend you and I apoligise if I did. I just had to many posts to reply.

I insist that I agree with what you've posted so far.

Oh, and about 1/x, hell I knew I was missing something. I study Industrial Egineering and this is such a common case that I forgot to explain it properly. What convergence/divergence means is: If you take the 1/x function and add all the possible natural values for the function, does it aproach a certain number? This is 1/1 + 1/2 + 1/3+ ... You can check that 1/x aproaches no number, while 1/x^2 does (1/1 + 1/4 + 1/9 + ...). Anyway let's leave this apart, since it has nothing to do with the thread.

About the other two quotes, I don't know, we say that the result of 1^infinity is indeterminate (it is a different result depending on the function, not that it doesn't have an answer). And about infinite, is it a number? I don't think so. Which is the natural number that goes right before infinite? I can answer this for every number but infinity (because it's an idea, not a number).

Anyway, I don't feel like discussing calculus and physics stuff, so let's focus on the damn coins, lol. I wanna thank you for posting Colin, I really apreciate your graphs and explanations, they describe the possible results. However they don't exactly answer the question: Does the amount of flips have anything to do with the tendency to grow/decrease of difference between heads and tails (if there's any tendency at all)?
 
Status
Not open for further replies.
Back
Top