Dependency statistics in quotes (information theory, correlation and other feature selection methods) - page 71

 
faa1947:
What do the numbers on the left mean?

These are bits. For example, 0.05 bits.

Ask further and we will continue the dialogue.

 
alexeymosc:

These are bits. For example, 0.05 bits.

Ask further and we will continue the dialogue.

What is the physical quantity being measured? What are the limits of variation? Up to 1?

 
IgorM:

We take an arbitrary alphabet length (in the screenshot it is 24 bits) and encode:

Red means the price has updated the low = 1, blue means the price has updated the high = 0.

And so on for each TF. I checked the claim that the trend on the higher TF is "more important"; it is partly true, but I have not seen any clear rules yet.


Thanks, I got it.

The subordination of the lower TF to the higher one is an undeniable fact.

http://www.onix-trade.net/forum/index.php?s=2b118a5435ec895351a317ca24d55206&showforum=74

http://forum.fxtde.com/index.php?showtopic=2635&st=1820

http://forum.alpari.ru/showthread.php?p=2984861#post2984861

These threads discuss Vadim's models, and he answers questions there. All the information, all the evidence and the clear rules are there.
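A minimal sketch of the binary encoding IgorM describes, under my own assumptions (the rule for inside bars and all names are hypothetical; his screenshot is not reproduced here):

```python
import numpy as np

def encode_bars(high: np.ndarray, low: np.ndarray) -> np.ndarray:
    """1 where a bar updates the previous low (red), 0 where it updates
    the previous high (blue); inside bars carry the previous state."""
    code = np.zeros(len(high), dtype=int)
    for i in range(1, len(high)):
        if low[i] < low[i - 1]:
            code[i] = 1               # red: price updated the low
        elif high[i] > high[i - 1]:
            code[i] = 0               # blue: price updated the high
        else:
            code[i] = code[i - 1]     # inside bar: no new extreme (assumed rule)
    return code
```

Reading 24 consecutive codes as one word would then give the 24-bit alphabet length mentioned for the screenshot.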

 
VNG:


Thanks, I got it.

The subordination of the lower TF to the higher one is an undeniable fact.

http://www.onix-trade.net/forum/index.php?s=2b118a5435ec895351a317ca24d55206&showforum=74

http://forum.fxtde.com/index.php?showtopic=2635&st=1820

http://forum.alpari.ru/showthread.php?p=2984861#post2984861

These threads discuss Vadim's models, and he answers questions there. All the information, all the evidence and the clear rules are there.

Thanks, I'll look into it.

P.S.: I didn't even expect the links to appear; I have been asking for days and kept being sent to Google to look for them... )))

 
faa1947:

Econometrics is the application of mathematical statistics to economics.

That's what I wanted to hear.

Thank you and everyone else for the informative discussion. Good luck!;)

 
faa1947:

What is the physical quantity being measured? What are the limits of variation? Up to 1?

The maximum possible value is 2.098 bits. This is the average information (the entropy) of this particular data series. If, for example, the bar at lag 1 completely determined the zero bar, their mutual information would become 2.098 bits.

What is this number? It is a measure of information ) You need to read articles on information theory. In short, the bits reflect a measure of the randomness of the data source's values via the self-information formula for one particular value,

I(x) = -log2 P(x),

whose average over all values gives the entropy H(X) = -Σ P(x) log2 P(x).

Another example: we flip a coin and count the mutual information between two consecutive tosses. By the formulas I translated in my article, we get that the mutual information I(X;Y) = 0. And if a tails toss reliably indicated the subsequent toss (tails or heads), then I(X;Y) would be 1 bit, which is the average information of the "fair coin" data source.
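To make the coin example concrete, here is a small numerical check (my own sketch, not alexeymosc's code), using the plug-in identity I(X;Y) = H(X) + H(Y) - H(X,Y):

```python
import numpy as np
from collections import Counter

def entropy(values):
    """Plug-in Shannon entropy in bits: H = -sum p * log2(p)."""
    n = len(values)
    return -sum(c / n * np.log2(c / n) for c in Counter(values).values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), in bits."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

rng = np.random.default_rng(0)
flips = list(rng.integers(0, 2, 100_000))      # fair coin tosses

# Independent consecutive tosses: the estimate is close to 0 bits.
print(mutual_information(flips[:-1], flips[1:]))    # ~0.00

# One toss fully determining the next: I(X;Y) = H(X) = 1 bit.
print(mutual_information(flips[:-1], flips[:-1]))   # ~1.00
```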

 
alexeymosc:

The maximum possible value is 2.098 bits. This is the average information (the entropy) of this particular data series. If, for example, the bar at lag 1 completely determined the zero bar, their mutual information would become 2.098 bits.

What is this number? It is a measure of information ) You need to read articles on information theory. In short, the bits reflect a measure of the randomness of the data source's values via the self-information formula for one particular value,

I(x) = -log2 P(x),

whose average over all values gives the entropy H(X) = -Σ P(x) log2 P(x).

Another example: we flip a coin and count the mutual information between two consecutive tosses. By the formulas I translated in my article, we get that the mutual information I(X;Y) = 0. And if a tails toss reliably indicated the subsequent toss (tails or heads), then I(X;Y) would be 1 bit, which is the average information of the "fair coin" data source.

In statistics, the concept of significance is very important. The values of 0.05 and 0.01 bits on the graph are, in terms of significance, one and the same value and cannot serve as a basis for any conclusions. Although I may be wrong.
 
faa1947:
In statistics, the concept of significance is very important. The values of 0.05 and 0.01 bits on the graph are, in terms of significance, one and the same value and cannot serve as a basis for any conclusions. Although I may be wrong.

You are wrong in this case.

I deliberately compared it with the mutual information statistics of a random data set with the same distribution. The difference is substantial, and it is confirmed by tests.

This comparison plays roughly the same role as the confidence interval for the ACF.
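A sketch of how such a comparison can be set up (my reconstruction under stated assumptions, not alexeymosc's actual test): the null hypothesis is that the bar at lag k is independent of the zero bar, and shuffling the series keeps the marginal distribution while destroying any dependence:

```python
import numpy as np

def mi_at_lag(x, lag):
    """Plug-in mutual information (bits) between x[t] and x[t - lag]."""
    a, b = x[lag:], x[:-lag]
    def H(v):
        _, counts = np.unique(v, return_counts=True, axis=0)
        p = counts / counts.sum()
        return -(p * np.log2(p)).sum()
    return H(a) + H(b) - H(np.column_stack([a, b]))

def mi_null_quantile(x, lag, n_shuffles=200, q=0.95, seed=0):
    """q-quantile of MI at the given lag under the independence null,
    estimated from shuffled copies with the same distribution."""
    rng = np.random.default_rng(seed)
    sims = [mi_at_lag(rng.permutation(x), lag) for _ in range(n_shuffles)]
    return float(np.quantile(sims, q))
```

A sample MI above this quantile at a given lag is significant in the same sense as an ACF value outside its confidence band.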

 
alexeymosc:

You are wrong in this case.

I deliberately compared it with the mutual information statistics of a random data set with the same distribution. The difference is substantial, and it is confirmed by tests.

This comparison plays roughly the same role as the confidence interval for the ACF.

Maybe.

Any confidence interval statement sounds like this: at the 5% level (for example), the null hypothesis is accepted (or not accepted).

How is your null hypothesis stated? Where is the confidence interval? And so on. The ACF is something I understand; your graph is not. If the maximum is 2.098 bits, then 0.05 out of 2.098 is not even worth discussing. And the questions from the beginning of the thread remain unanswered.

By the way, what did you calculate the ACF on?

 
faa1947:

I'll take the usual increments of the open price.

Much more interesting:

[figure: statistics]

[figure: ACF]

Probability says there is no correlation. There is some correlation at the start, but it is not significant.

Exactly, that's exactly right. The ACF is of no use here at all.

But judging by the mutual information there should be a dependence: it does not even come close to zero, even at a distance of hundreds of bars.
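For reference, a sketch of the ACF check faa1947 describes (the data file name is hypothetical; the 95% white-noise band |r_k| < 1.96/sqrt(N) is the standard one):

```python
import numpy as np

def acf(x, max_lag):
    """Sample autocorrelations r_1 .. r_max_lag of a 1-D series."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = np.dot(x, x)
    return np.array([np.dot(x[:-k], x[k:]) / denom
                     for k in range(1, max_lag + 1)])

opens = np.loadtxt("eurusd_h1_open.csv")    # hypothetical data file
increments = np.diff(opens)                 # the usual open-to-open increments
r = acf(increments, max_lag=50)

# 95% band for white noise; correlations inside it are "not meaningful".
band = 1.96 / np.sqrt(len(increments))
print((np.abs(r) > band).sum(), "of 50 lags outside the band")
```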