From theory to practice - page 351

 
Alexander_K2:

But I decided to go further: to confirm or refute Kolmogorov's theory on the prediction of stationary sequences. I had barely started down that path when some obscurantists, interrupting each other, set out to prove that it is impossible. Are they smarter than Kolmogorov? What experience are they talking about? The experience of shame? Ugh...

Excuse me, but have you read the very first sentence of the article?

The spectral conditions for the ability to extrapolate and interpolate stationary random sequences...

 
Yuriy Asaulenko:

Excuse me, but have you read the very first sentence of the article?

The spectral conditions for the ability to extrapolate and interpolate stationary random sequences...

I did. So it needs to be checked: achieve independence of the increments, for a start. Check the expectation and variance for compliance with Kolmogorov's conditions, and so on. This is the only way to make neural networks work, IMHO. And then preschoolers run up to the front and shout that it's all nonsense. How does that feel?
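
For what it's worth, the checks described here are easy to sketch in a few lines of Python. This is my own illustration, not code from the thread: it uses a synthetic Gaussian random walk (so the increments are independent and stationary by construction) and checks the lag-1 autocorrelation of the increments plus the stability of their mean and variance across windows.

```python
import random
import statistics

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation; near 0 for independent increments."""
    n = len(x)
    m = statistics.fmean(x)
    num = sum((x[i] - m) * (x[i + 1] - m) for i in range(n - 1))
    den = sum((xi - m) ** 2 for xi in x)
    return num / den

def window_stats(x, k):
    """Mean and variance over consecutive windows of length k;
    for a stationary series they should stay roughly constant."""
    return [(statistics.fmean(x[i:i + k]), statistics.pvariance(x[i:i + k]))
            for i in range(0, len(x) - k + 1, k)]

# Synthetic "price": a Gaussian random walk, so its increments
# are i.i.d. (independent and stationary) by construction.
random.seed(1)
prices = [0.0]
for _ in range(5000):
    prices.append(prices[-1] + random.gauss(0, 1))
increments = [b - a for a, b in zip(prices, prices[1:])]

print("lag-1 autocorrelation:", round(lag1_autocorr(increments), 3))
for mean, var in window_stats(increments, 1000):
    print(f"window mean {mean:+.3f}, variance {var:.3f}")
```

On real price increments the same two checks would typically show autocorrelation structure and drifting variance, which is exactly what would have to be removed first.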

 
Alexander_K2:

But I decided to go further: to confirm or refute Kolmogorov's theory on the prediction of stationary sequences. I had barely started down that path when some obscurantists, interrupting each other, set out to prove that it is impossible. Are they smarter than Kolmogorov? What experience are they talking about? The experience of shame? Ugh...

If you don't like a coin, take a die - it is also a stationary sequence) - and show all of us dumb kids a master class on predicting its next value.)
For a die you can only forecast the mean and the variance, which is useless in trading, because price is non-stationary by definition.
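
The die example is easy to verify numerically. A minimal sketch (mine, not from the thread): the mean and variance of a fair die converge nicely, while any attempt to predict the next roll from the previous one stays at the chance level of 1/6.

```python
import random
import statistics

# Theoretical values for a fair six-sided die.
faces = range(1, 7)
mean_th = statistics.fmean(faces)     # 3.5
var_th = statistics.pvariance(faces)  # 35/12 ~ 2.917

random.seed(0)
rolls = [random.randint(1, 6) for _ in range(100_000)]
print("sample mean:", round(statistics.fmean(rolls), 3), "(theory 3.5)")
print("sample variance:", round(statistics.pvariance(rolls), 3), "(theory 2.917)")

# Trying to predict the NEXT roll from the previous one buys nothing:
# the naive "repeat last roll" predictor is right about 1/6 of the time.
hits = sum(rolls[i] == rolls[i + 1] for i in range(len(rolls) - 1))
print("P(next == previous):", round(hits / (len(rolls) - 1), 3), "(theory 1/6)")
```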
 
Alexander_K2:

I did. So it needs to be checked: achieve independence of the increments, for a start. Check the expectation and variance for compliance with Kolmogorov's conditions, and so on. This is the only way to make neural networks work, IMHO. And then preschoolers run up to the front and shout that it's all nonsense. How does that feel?

So, how do we stand with... the spectral conditions? How can you be sure that by getting to ...... you will obtain the spectral conditions you are looking for? Have you even seen the spectrum of this stuff?
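
As an aside, "looking at the spectrum" does not require any special tools; a naive periodogram fits in a few lines. This sketch is my own illustration, using white noise as input (independent increments give a flat spectrum on average), not an analysis of the quote data discussed in the thread.

```python
import cmath
import math
import random

def periodogram(x):
    """Naive DFT periodogram: power per frequency bin (O(n^2), fine for a sketch)."""
    n = len(x)
    m = sum(x) / n
    xc = [xi - m for xi in x]  # remove the mean so bin 0 does not dominate
    return [abs(sum(xc[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) ** 2 / n
            for k in range(n // 2)]

random.seed(4)
white = [random.gauss(0, 1) for _ in range(256)]
power = periodogram(white)

# White noise has a flat spectrum: average power per bin ~ variance of the series.
print("mean power per bin:", round(sum(power) / len(power), 2))
```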

 
Yuriy Asaulenko:

So, how do we stand with... the spectral conditions? How can you be sure that by getting to ...... you will obtain the spectral conditions you are looking for? Have you even seen the spectrum of this stuff?

No, I haven't. What's the point of looking at it now if the process is non-stationary?

 
Alexander_K2:

No, I haven't. What's the point of looking at it now if the process is non-stationary?

And if it is stationary, will the spectral conditions be right? I'm sure they won't. Besides, stationarity or non-stationarity says nothing about the spectrum at all and does not affect it in any way. If the spectrum is shit now, it will stay shit after any transformation. It's elementary, Watson.)

- Only you, dear comrade from Paris, should spit on all that.

- How - spit?!

- With spit, as they spat before the era of historical materialism. (c)

 
Yuriy Asaulenko:

And if it is stationary, will the spectral conditions be right? I'm sure they won't. Besides, stationarity or non-stationarity says nothing about the spectrum at all and does not affect it in any way. If the spectrum is shit now, it will stay shit after any transformation. It's elementary, Watson.)

- Only you, dear comrade from Paris, should spit on all that.

- How - spit?!

- With spit, as they spat before the era of historical materialism. (c)

I believe you. I don't know why. Or rather, I do know: because you are one of the very few on this forum who knows the diffusion equations :))) But then it turns out that neural networks will NEVER predict. Right? Why, then, do you participate in the neighbouring thread?

 
Alexander_K2:

I believe you. I don't know why. Or rather, I do know: because you are one of the very few on this forum who knows the diffusion equations :))) But then it turns out that neural networks will NEVER predict. Right?

In the general case - never. However, prediction is possible in some limited regions. If you manage to make the neural network work only in those regions, then prediction will be possible.

 
Yuriy Asaulenko:

In the general case - never. However, prediction is possible in some limited regions. If you manage to make the neural network work only in those regions, then prediction will be possible.

Mm-hmm. Thank you. Now that's a conversation to the point.

 

I would like to hear constructive feedback and a continuation of the research, because the topic is interesting and people are reading and writing. For example, @cemal responded with interesting information about the Weibull distribution: https://www.mql5.com/ru/forum/221552/page350#comment_7358875. I am now working on the latest data provided by Alexander; as soon as it is ready, I will post it here. I ask everyone who is interested to respond and help to the best of their ability - be it information, graphs, or studies, any possible help.

From myself: a transformation that approximates the normal distribution may possibly be reached through Erlang flows. The whole question is the right sifting algorithm; there are heaps of options, and it is not done yet.
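
To make the Erlang idea concrete: one possible sifting algorithm (just one of the "heaps of options", sketched here under my own assumptions, not the thread's final choice) is to sum every k consecutive exponential inter-tick intervals. The sums are Erlang(k)-distributed, and their skewness 2/sqrt(k) shrinks toward the Gaussian value of 0 as k grows.

```python
import math
import random
import statistics

def sift_erlang(intervals, k):
    """Sum every k consecutive intervals: for exponential inputs,
    each sum is Erlang(k)-distributed."""
    return [sum(intervals[i:i + k]) for i in range(0, len(intervals) - k + 1, k)]

def skewness(x):
    """Sample skewness; 0 for a Gaussian, 2/sqrt(k) for Erlang(k)."""
    m = statistics.fmean(x)
    s = statistics.pstdev(x)
    return statistics.fmean([((xi - m) / s) ** 3 for xi in x])

# Synthetic tick flow: exponential intervals with rate 1 (a Poisson stream).
random.seed(2)
ticks = [random.expovariate(1.0) for _ in range(300_000)]

# As k grows, the sifted flow approaches normality (the CLT at work).
for k in (1, 4, 16, 64):
    flow = sift_erlang(ticks, k)
    print(f"k={k:3d}: sample skewness {skewness(flow):+.3f}, "
          f"theory {2 / math.sqrt(k):.3f}")
```

Real tick intervals are of course not exactly exponential, so on live data the convergence would only be approximate.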

Next: after the exponential readout of tick intervals (Alexander's latest data: https://www.mql5.com/ru/forum/221552/page349#comment_7348507), if we take the logarithm of these increments and build a histogram of the distribution, will it approximate a Gaussian? That is also an option.
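
This question can be tested empirically. A sketch under my own assumptions: I use synthetic exponential intervals as a stand-in for Alexander's data (which I don't have at hand) and measure the skewness and excess kurtosis of the logged values; for a true Gaussian both would be near 0.

```python
import math
import random
import statistics

def skew_kurt(x):
    """Sample skewness and excess kurtosis; both are 0 for a Gaussian."""
    m = statistics.fmean(x)
    s = statistics.pstdev(x)
    z3 = statistics.fmean([((xi - m) / s) ** 3 for xi in x])
    z4 = statistics.fmean([((xi - m) / s) ** 4 for xi in x])
    return z3, z4 - 3.0

# Stand-in data: exponential intervals with rate 1.
random.seed(3)
intervals = [random.expovariate(1.0) for _ in range(200_000)]
logged = [math.log(x) for x in intervals]

skew, ex_kurt = skew_kurt(logged)
print(f"log-intervals: skewness {skew:+.3f}, excess kurtosis {ex_kurt:+.3f}")
# For exponential intervals the log follows a (mirrored) Gumbel distribution,
# so a clear negative skew remains; the same check applied to the real data
# would show how close it actually gets to a Gaussian.
```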

Next: custom symbols - Maxim promised them; there is so much work and so few people willing to help. I appeal to those who read the forum: please respond, the work will get going, there are many more options for development, you can try things, and the result will simply come faster.)