Random probability theory. Napalm continues!

 

And you forgot about the step: the smaller the step, the more likely the next state will be, say, indistinguishable from the previous one within the statistical error of the higher TF (as far as the market is concerned).

GameOver: Finally, at least you have a sense of humour, as far as I remember ))

"we don't want to shoot, we observe. I'm warning you, if you move, we'll kill you all!" (с)

 

What centre of the die? I don't understand. The probabilities of the next state are counted from the last face rolled, i.e. theoretically they are equal - in a perfect world, in a vacuum.

let me just summarise.
the sequence has:

1. a randomly distributed probability of either allowed state (1/0) occurring

2. a randomly distributed probability of the previous tendency changing or continuing

3. and, for a snack, a randomly distributed probability of the series itself being trending or random

))))) the first one is clear, but what about the rest? :-)))))) well, yes, pulled out of thin air, but why is that wrong? Justify it :-)

 

GameOver: take the example of a die. The probability of repeating the previous state is less than that of any other state, right?

Why less? A perfect die has no memory. The probability there is the same, the same 1/6.

Once again, as applied to the die problem: only "integrated" states, i.e. series, have memory, while you are applying the notion of memory to an elementary outcome. That is the mistake here, because individual elementary outcomes are independent of each other.

And now let's imagine that there are countless variants. Wouldn't the object's tendency to change state become obvious? After all, the probability of staying in the previous place would be 1/(number of variants).

It is not about this problem. A die has only 6 possible outcomes. Probability theory considers not only elementary outcomes but also all their possible combinations, and there are far more variations of series. Those are where things get interesting; there you could try to apply your "change of states". Take this problem, say: there were 1000 trials, and 600 heads and 400 tails came up. Another 1000 trials were run. Which result over the combined series of 2000 trials is more likely - 1000 heads/1000 tails, or 900 heads/1100 tails? It can be calculated.
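That exercise is worth checking numerically. A minimal sketch, assuming the second batch of 1000 tosses is fair and independent of the first: a total of 1000/1000 requires exactly 400 heads in the second batch, while 900/1100 requires exactly 300.

```python
from math import comb

def p_heads(k, n=1000, p=0.5):
    """P(exactly k heads in n independent fair tosses) - the binomial pmf."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# First 1000 trials gave 600 heads / 400 tails (already observed, fixed).
# Total 1000/1000 over 2000 trials -> exactly 400 heads in the second batch.
# Total  900/1100 over 2000 trials -> exactly 300 heads in the second batch.
print(f"P(total 1000/1000) = {p_heads(400):.2e}")
print(f"P(total  900/1100) = {p_heads(300):.2e}")
```

The 1000/1000 total comes out many orders of magnitude more likely - not because the coin "remembers" its deficit of tails, but simply because 400 lies closer to the binomial peak at 500 than 300 does.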

and also - if the state does not change, doesn't that undermine the very assumption that the sequence is random?

Not "states don't change", but the distribution of those states don't change. The premise is roughly that in a sufficiently long series of trials, all elementary outcomes will occur with roughly equal frequencies.

The questions after that are too fuzzy; that won't do.

 
Mathemat:

It is not "states do not change", but the distribution of those states do not change. The premise is roughly that in a sufficiently long series of trials, all elementary outcomes will occur with roughly equal frequencies.

In other words, the law of large numbers is stronger than the law of meanness.

 
paukas: In other words, the law of large numbers is stronger than the law of meanness.
Exactly!
 
GameOver:

I didn't say it was the same thing. Don't attribute to me things I didn't say.

when did i claim any laurels? ) lying again? :-)

)))) i.e. if the example is about spins, then it's roulette; and if the example is about a coin, then what is it?

You may have one but you cannot allow others to have one?

If you don't want to talk about it, fine, good luck.

kitty, are you offended? (c)

what was the point of all that long talk about probability theory, dice, roulette, coins, etc.?

If you want to discuss the indicator - go ahead, if you want to discuss the TS, show it to me, but don't bring the weird stuff in here.

 
HideYourRichess:

kitty, are you offended? (c)

what was the point of all those lengthy arguments about probability theory, dice, roulette, coins, etc.?

You want to discuss the indicator - go ahead, you want to discuss the TS - show it, but you don't need to bring the weird stuff in here.


i just don't like rude people. i may snap back similarly. is that what you're driving at?

the indicator, the TS and probability theory are kind of related.
 
Mathemat:

Why less? A perfect die has no memory. The probability there is the same, the same 1/6.

Once again, as applied to the die problem: only "integrated" states, i.e. series, have memory, while you are applying the notion of memory to an elementary outcome. That is the mistake here, because individual elementary outcomes are independent of each other.

It is not about this problem. A die has only 6 possible outcomes. Probability theory considers not only elementary outcomes but also all their possible combinations, and there are far more variations of series. Those are where things get interesting; there you could try to apply your "change of states". Take this problem, say: there were 1000 trials, and 600 heads and 400 tails came up. Another 1000 trials were run. Which result over the combined series of 2000 trials is more likely - 1000 heads/1000 tails, or 900 heads/1100 tails? It can be calculated.

It is not that "states don't change"; it is the distribution of those states that doesn't change. The premise is, roughly, that in a sufficiently long series of trials all elementary outcomes will occur with roughly equal frequencies.

The questions after that are too fuzzy; that won't do.


Great. That's what I want to talk about. I keep getting poked with the last spin and its 1/2 probability.

Why less? Say you rolled a 1 on the die.
The probability of rolling a 1 next is 1/6, and of rolling any other number is 5/6. Right? That is what implies that the probability of a repeat is less than that of any other outcome.
As a consequence, over infinitely many trials the recurrence of the state races towards zero.
The premise of all this is that an object tends to change its state - and only then can it be called random.
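Written out (under the standard assumption of independent rolls), the two quantities being mixed here are the chance of a single repeat and the chance of an unbroken run of repeats:

$$
P(\text{next roll repeats the last}) = \frac{1}{6},
\qquad
P(\text{same face } n \text{ times in a row}) = \left(\frac{1}{6}\right)^{n-1} \xrightarrow{\;n \to \infty\;} 0.
$$

Only the second expression "races towards zero"; the first stays at 1/6 no matter how long the run has already lasted.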

About series: precisely the fact that over large series the distribution tends to normal can be used.
The whole question is how we choose the series length and the probability of running into a tendency (i.e. the extreme case in which all outcomes are identical). Say, a series of 20 outcomes - are we satisfied with a one-in-a-million (0.0000009) risk? If yes, then why can't we trade on the premise that the series of 20 identical outcomes we would need will not occur?
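The 0.0000009 figure looks like 0.5^20, i.e. the chance of 20 identical outcomes in a row for a fair 50/50 trial. A quick check under that assumption:

```python
# Probability that a fair 50/50 outcome repeats 20 times in a row.
p_run = 0.5 ** 20
print(p_run)     # 9.5367431640625e-07 - roughly "one in a million"
print(2 ** 20)   # 1048576 - i.e. one chance in 2^20
```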

i asked a question - no one has answered. Why do casinos limit the bet? Because Martingale is in principle a losing game for the player?
Or maybe because the casino plans on a 5-year horizon? Because players betting against a series of 16 will keep winning, while the series of 20 (on which those players lose) may take twenty years to show up?
There is a reasonable trade-off between the length of the series and the risk [probability] of losing on that series.
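On the bet-limit point, the doubling chain itself shows why a table cap matters. A minimal Martingale sketch with made-up numbers (base_bet and table_limit are assumptions for illustration, not anyone's real figures):

```python
# Classic Martingale: double the stake after every loss, so that the first
# win recovers all previous losses plus one base unit.
base_bet = 1
table_limit = 10_000  # assumed house cap on a single bet

bet, total_lost = base_bet, 0
for losses in range(1, 21):
    total_lost += bet
    bet *= 2  # stake required after `losses` consecutive losses
    if bet > table_limit:
        print(f"after {losses} straight losses the next bet must be {bet:,},")
        print(f"over the table limit; the {total_lost:,} already staked "
              f"can no longer be won back")
        break
```

With these numbers the chain breaks after 14 losses: the cap turns a rare long series from a recoverable nuisance into a realized loss, which fits the casino's interest in imposing it.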

it is the same in the market. Probably everyone here has studied Martingale variants on forex, and everyone understands it is useless - the profit is not commensurate with the risk (drawdown).
BUT
i.e. the market may well run 5 or 7 figures in one direction, but nobody will sit through a run of 20 without a hitch.
so here, too, it comes down to choosing that reasonable limit between series length and risk carefully.

 
GameOver:


Great. That's what I want to talk about. I keep getting poked with the last spin and its 1/2 probability.

Why less? Say I rolled a 1 on the die.
The probability of rolling a 1 next is 1/6, and of rolling any other number is 5/6. Right? That is what implies that the probability of a repeat is less than that of any other outcome.

Why less? The odds are the same before and after the first roll. That is what probability means. Or how does it work: you flip a coin, it comes up heads, so the second time the probability of heads is smaller, or what? Not at all - it is the same 50/50. Run a test with even the most primitive RNG. It will show exactly that.
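A sketch of exactly that test with Python's built-in RNG: count how often a roll repeats its immediate predecessor. Independence predicts 1/6 ≈ 0.1667, regardless of what came before.

```python
import random

random.seed(42)  # fixed seed so the run is reproducible
rolls = [random.randint(1, 6) for _ in range(1_000_000)]

# Fraction of rolls that repeat the immediately preceding roll.
repeats = sum(a == b for a, b in zip(rolls, rolls[1:]))
print(repeats / (len(rolls) - 1))  # ~0.1667, i.e. 1/6 - not less
```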
 
And a follow-up question

let's say we are collecting statistics on series of 10 rolls.
we need statistics over 100 series.
So we roll the die 1,000 times?
or
we roll 10 times, then for each new series we discard the oldest outcome and add a fresh random one.
Then the total number of rolls is 10 + 100 = 110.
Question: will the statistics, the distribution, be normal in both cases?
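The question can be probed empirically. A sketch under the stated setup (100 series of length 10, using Python's RNG): every sliding window has the same marginal distribution as a fresh one, but consecutive windows share 9 of their 10 rolls, so the 100 "samples" are far from independent.

```python
import random
import statistics

random.seed(1)

def disjoint_sums(n=100, length=10):
    """n independent series of `length` rolls each (n * length fresh rolls)."""
    return [sum(random.randint(1, 6) for _ in range(length)) for _ in range(n)]

def sliding_sums(n=100, length=10):
    """n overlapping series taken from length + n - 1 rolls (~ the 110 above)."""
    rolls = [random.randint(1, 6) for _ in range(length + n - 1)]
    return [sum(rolls[i:i + length]) for i in range(n)]

def lag1_corr(xs):
    """Sample correlation between consecutive series sums (~0 if independent)."""
    a, b = xs[:-1], xs[1:]
    ma, mb = statistics.mean(a), statistics.mean(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)
    return cov / (statistics.stdev(a) * statistics.stdev(b))

for name, sums in (("disjoint", disjoint_sums()), ("sliding", sliding_sums())):
    print(f"{name:8s} mean = {statistics.mean(sums):5.2f}   "
          f"lag-1 corr = {lag1_corr(sums):+.2f}")
```

Both schemes give the same per-series distribution (mean near 35), so nothing breaks in that sense. But the sliding scheme's sums come out with a lag-1 correlation near +0.9, so 100 overlapping series carry far less independent information than 100 disjoint ones, and any statistic built from them converges correspondingly more slowly.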