Imagine we do the following experiment: we flip a coin a certain number of times.
Previous knowledge:
1- We all agree that the probability of getting heads or tails is 0.5 for each. However, if you flip a coin just 3 times, you may get frequencies of 0.33 and 0.66 instead.
2- It is general knowledge that if you flip a coin many, many times, say 1000000 times, the frequency of heads and of tails will get very close to 0.5 (the theoretical probability). The more times you repeat the experiment, the closer you get to the theoretical result.
If you flip a coin 1000 times, say the difference between the number of heads and tails is 8 (I chose this number arbitrarily), e.g. 496 heads and 504 tails. If I do it again I get a difference of 13 (again, an arbitrary number). What I want to point out is that flipping the coin 1000 times may result in a difference of around 10 (I insist that I've made this number up, so instead of 10 it could be 5 or 20 or whatever the actual number is).
Now the question is: if I flip the coin 1000000 times, will the difference between the number of heads and tails still be around 10, or will it be bigger (say 100) since I've flipped many more times?
Note that if we agree the difference grows, it can't grow linearly, i.e. if 1000 flips means a deviation of 10, then 2000 flips means a deviation of 20, because that would violate fact 2. So that would suggest it grows in a logarithmic-like way (it grows, but more slowly than the number of flips), tends to stabilise, and in the long run behaves as if it barely grows at all.
However, I believe the difference stays constant, since deviations tend to compensate each other (5 extra heads will be compensated by 5 extra tails sooner or later). What is your opinion? Please include logical reasoning to support your answer.
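To make the question concrete, here is a minimal simulation sketch I put together (plain Python with the standard random module; the function name mean_abs_difference and the trial counts are just my own choices, not anything standard) that estimates the typical |heads - tails| for several numbers of flips:

```python
import random

def mean_abs_difference(n_flips, n_trials=50):
    """Average of |heads - tails| over n_trials runs of n_flips fair-coin flips."""
    total = 0
    for _ in range(n_trials):
        # getrandbits(1) gives 0 or 1 with equal probability, so summing counts heads.
        heads = sum(random.getrandbits(1) for _ in range(n_flips))
        tails = n_flips - heads
        total += abs(heads - tails)
    return total / n_trials

# Typical |heads - tails| for increasingly long experiments (takes a little while to run).
for n in (1000, 10000, 100000, 1000000):
    print(n, round(mean_abs_difference(n), 1))
```

If the difference really stays constant, the printed values should stay near the same number from row to row; if it grew linearly, they would grow by a factor of 10 per row; any other growth pattern would show up in between.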