Testing real-time forecasting systems - page 59

 

The picture at the moment is ambiguous; I wouldn't open any deals.

 

to marketeer

There's some inconsistency here: the forecast is for 3 days, but the holiday is for 3 weeks ;-).

On the contrary, everything fits, in the sense that it hasn't diverged either: a three-day forecast, a three-week holiday. By the way, I must say that when a two-metre moray eel snaps its jaws 30 cm from the causal spot, a special understanding of the market comes.... :о)


to marketeer, Yurixx

Why were the winners chosen by maximum entropy and not by minimum entropy?

The concept behind identifying realizations is the same: maximum likelihood. There are several specific implementations of it. Here I am demonstrating the "modernized" principle of maximum entropy, taken from information theory for NS; in the original it reads:

If conclusions are drawn from incomplete information, they must be based on the probability distribution that maximizes entropy subject to the given constraints on the distribution.

It's simple: a deterministic signal carries no new information; only stochasticity carries new information. There is no other way. The question is how much "new" to expect, and the basic characteristics of the trajectory depend on that. The statistically plausible signal will be the one with maximum entropy, i.e. the most random or, more precisely, the most "saturated with new information". But so far my theory (and my complete understanding of it :o) is at the stage of creative construction. The screenshots show "a snapshot", but roughly speaking it is like that permanently.

:о)
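The maximum-entropy principle quoted above has a one-line illustration: among all distributions over a fixed set of outcomes (with no further constraints), the uniform one maximizes Shannon entropy, while a deterministic signal has zero entropy. A minimal sketch (my own, not from the thread):

```python
import math

def shannon_entropy(p):
    """Shannon entropy in nats; zero-probability terms contribute 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Uniform distribution over 4 outcomes vs two more "deterministic" alternatives.
uniform = [0.25, 0.25, 0.25, 0.25]
skewed  = [0.70, 0.10, 0.10, 0.10]
certain = [1.00, 0.00, 0.00, 0.00]

print(shannon_entropy(uniform))  # ln(4) ~ 1.386, the maximum for 4 outcomes
print(shannon_entropy(skewed))   # strictly less than the uniform case
print(shannon_entropy(certain))  # 0.0: a deterministic signal carries no new information
```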


Respect to the author !

Thank you, it's always nice to get some respect, and it's twice as enjoyable to get it from colleagues whose opinion I value and respect. Yes, the system was showing the 1.5 level very persistently, and there were a few moments when the market reached this level.

 
Welcome back Sergei.
grasn >> :

By the way, I must say that when a two-metre moray eel snaps its jaws 30 cm from the causal spot, a special understanding of the market comes .... :о)

I don't see dead people after that ^_^

 
grasn >> :

It's simple: a deterministic signal carries no new information; only stochasticity carries new information. There is no other way. The question is how much "new" to expect, and the basic characteristics of the trajectory depend on that. The statistically plausible signal will be the one with maximum entropy, i.e. the most random or, more precisely, the most "saturated with new information".

Maybe I'm wrong, but the quality of a prediction should not be equated with the quantity of information. Information can be false. According to a well-known formula (Potapov has it too), the forecast horizon is T = (1/K)·ln(1/d0), i.e. high entropy makes the forecast short-term: an information overflow occurs. Perhaps we mean different types of entropy?

 
By the way, there is a question about interpreting this formula in the case of discrete time in particular, and about the unit of measure of T in general. Suppose d0 = 0.001; the log in the formula should be natural, so we get 6.9, and with an entropy of around 13 that gives T ≈ 0.53. What are these "parrots"? The answer needs to come out in bars. ;-)
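The arithmetic in the question can be checked directly. Note that T inherits the time unit of K: if the entropy K is estimated per bar, the horizon comes out in bars. A small sketch, treating K and d0 simply as the numbers given above:

```python
import math

def forecast_horizon(K, d0):
    """Predictability horizon T = (1/K) * ln(1/d0).

    K is the entropy (e.g. Kolmogorov-Sinai) *per time step*, so T comes out
    in those same time steps: bars, if K was estimated per bar.
    """
    return math.log(1.0 / d0) / K

d0 = 0.001   # initial uncertainty from the question
K  = 13.0    # entropy value from the question

print(math.log(1.0 / d0))       # ln(1000) ~ 6.91
print(forecast_horizon(K, d0))  # ~ 0.53: about half a bar, if K is per bar
```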
 
NEKSUS_ >> :
Welcome back Sergei.

Thank you! I'm still on holiday, I won't be appearing that often :o)

I don't see dead people after that ^_^

You what!!! The moray is beautiful in its own way. I didn't see any sharks on the dive, but I saw a manta ray (https://ru.wikipedia.org/wiki/%D0%9C%D0%B0%D0%BD%D1%82%D0%B0), unbelievable beauty.

 
marketeer >> :

Maybe I'm wrong, but the quality of a prediction should not be equated with the quantity of information. Information can be false. According to a well-known formula (Potapov has it too), the forecast horizon is T = (1/K)·ln(1/d0), i.e. high entropy makes the forecast short-term: an information overflow occurs. Perhaps we mean different types of entropy?

The quality of a prediction is determined only by the adequacy of the process model and the identification method, nothing else. What I wrote above is correct; maybe you did not understand me because of my clumsy explanation. By the way, what is the information entropy of false information? :о)

 
marketeer >> :
By the way, there is a question about interpreting this formula in the case of discrete time in particular, and about the unit of measure of T in general. Suppose d0 = 0.001; the log in the formula should be natural, so we get 6.9, and with an entropy of around 13 that gives T ≈ 0.53. What are these "parrots"? The answer needs to come out in bars. ;-)

You haven't answered my question!!! :о) I'm still in holiday mode :o). I didn't quite get this part; I'll try to reread it later. But as for this:

According to a well-known formula (Potapov has it too), the forecast horizon is T = (1/K)·ln(1/d0), i.e. high entropy makes the forecast short-term: an information overflow occurs. Perhaps we mean different types of entropy?

The wording is a bit off. Entropy as such has nothing to do with it. The horizon is determined entirely by the embedding of the system (its dimensionality), and by nothing else. The higher the dimensionality, the harder the system is to predict, that's all. And each dimension brings its own "slice" of entropy. There can be "a lot" of entropy while the system remains quite "understandable".
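As a concrete illustration of the horizon formula's units (my own example, not from the thread): for the logistic map at r = 4 the Kolmogorov-Sinai entropy equals the Lyapunov exponent, λ = ln 2 ≈ 0.693 per iteration, so with d0 = 0.001 the formula gives T = ln(1000)/ln(2) ≈ 10 iterations; the answer naturally comes out in the map's own time steps, that system's "bars":

```python
import math

def logistic_lyapunov(r=4.0, x0=0.3, n=100_000, burn=1_000):
    """Estimate the Lyapunov exponent of x -> r*x*(1-x) by averaging
    ln|f'(x)| = ln|r*(1 - 2x)| along the orbit (nats per iteration)."""
    x = x0
    for _ in range(burn):                  # discard the transient
        x = r * x * (1.0 - x)
    acc = 0.0
    for _ in range(n):
        # guard against the rare x ~ 0.5 where the derivative vanishes
        acc += math.log(max(abs(r * (1.0 - 2.0 * x)), 1e-300))
        x = r * x * (1.0 - x)
    return acc / n

lam = logistic_lyapunov()          # close to ln 2 ~ 0.693 nats/iteration
d0 = 0.001
T = math.log(1.0 / d0) / lam       # horizon ~ 10 iterations of the map
print(lam, T)
```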

 

A picture please !

Hi Sergey. Could you, as the hero of this thread, repeat your picture with the forecast you made before the holiday, together with what actually happened?

That is, all umpteen forecast trajectories + the real price movement in one picture.

 
Yurixx >> :

A picture please !

Hi Sergey. Could you, as the hero of this thread, repeat your picture with the forecast you made before the holiday, together with what actually happened?

I.e. all umpteen predicted trajectories + the real price movement in one picture.

Hi Yuri, nice to see you! Unfortunately I'm nowhere near my lab; I'll only be able to do that in a week at the earliest, or even later. I guess it will be too late :o( But that's OK!!! I'll make more forecasts, a new umpteen trajectories, one of them will probably turn out successful, and we'll vindicate the entropy! :о))))))))))