Machine learning in trading: theory, models, practice and algo-trading - page 370

 

I don't believe that analyzing the correlation between a predictor and the target will give you anything.
There are plenty of examples where closely correlated variables do not actually depend on each other, even though it seems you could predict one from the other, such as this - http://pikabu.ru/story/lozhnyie_korrelyatsii_2287154 ; articles from Habr on the same topic were posted on this forum even earlier.

There is a more interesting term: cross-entropy. It comes from statistics - a way of analyzing whether a predictor fits a variable, including nonlinear relationships.
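A closely related, easy-to-compute entropy-based measure is mutual information. A rough numpy sketch (my own illustration, not from the thread) of how it catches a nonlinear dependence that correlation misses:

```python
# Sketch: Pearson correlation misses a nonlinear dependence that
# mutual information (an entropy-based measure) picks up.
# Bin count is chosen arbitrarily for illustration.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 10_000)
y = x ** 2  # deterministic function of x, but nonlinear

corr = np.corrcoef(x, y)[0, 1]  # near zero

def mutual_info(a, b, bins=20):
    """Estimate mutual information (in nats) from a 2-D histogram."""
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    p = joint / joint.sum()
    px = p.sum(axis=1, keepdims=True)   # marginal of a
    py = p.sum(axis=0, keepdims=True)   # marginal of b
    mask = p > 0
    return float((p[mask] * np.log(p[mask] / (px @ py)[mask])).sum())

mi = mutual_info(x, y)
print(f"corr={corr:.3f}  MI={mi:.3f}")  # |corr| ~ 0, MI clearly > 0
```

Correlation sees nothing because the relation is symmetric around zero, while the histogram-based mutual information is far from zero because y is fully determined by x.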

 
Dmitry:


Do you have an example?

Show the rows of input data and the rows of output data - post them.

For XOR, a dataset can consist of just 4 samples; the essence does not change. {x,y,z}: x,y are features, z is the target

{-1,1,-1},{1,1,1},{1,-1,-1},{-1,-1,1}

Let's calculate the covariance of the first feature and the target. Since the mean of each variable is 0, we have: ((-1·-1) + (1·1) + (1·-1) + (-1·1))/4 = (1+1-1-1)/4 = 0. Obviously the correlation is also zero, and the same holds for the second feature - you can check - yet both features are more than valid for a nonlinear classifier.
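The arithmetic above can be checked in a few lines of numpy (my own sketch, not part of the original post). Note that z = x·y, so the target is perfectly determined by the features even though each feature's correlation with it is zero:

```python
# XOR-style dataset from the post: each feature has zero covariance
# (and correlation) with the target, yet together they determine it.
import numpy as np

data = np.array([[-1,  1, -1],
                 [ 1,  1,  1],
                 [ 1, -1, -1],
                 [-1, -1,  1]])
x, y, z = data[:, 0], data[:, 1], data[:, 2]

# all means are 0, so covariance reduces to the mean of the products
print(np.mean(x * z))        # 0.0
print(np.mean(y * z))        # 0.0
print(np.all(x * y == z))    # True: z = x*y, a perfect nonlinear rule
```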

 
Dr. Trader:

I don't believe that analyzing the correlation between a predictor and the target will give you anything.
There are plenty of examples where closely correlated variables do not actually depend on each other, even though it seems you could predict one from the other, such as this - http://pikabu.ru/story/lozhnyie_korrelyatsii_2287154 ; articles from Habr on the same topic were posted on this forum even earlier.

There is a more interesting term: cross-entropy. It comes from statistics - a way of analyzing whether a predictor fits a variable, including nonlinear relationships.

Same opinion - what does it matter what these curves show, if we are looking for nonlinear dependencies between a set of features and a target? And removing highly correlated predictors is not so clear-cut either... They may be correlated yet not dependent :) For example, if you feed in a set of indicators with a shift, the correlation between them will be high, but so will the information content.
 
Dr. Trader:

I don't believe that analyzing the correlation between a predictor and the target will give you anything.
There are plenty of examples where closely correlated variables do not actually depend on each other, even though it seems you could predict one from the other, such as this - http://pikabu.ru/story/lozhnyie_korrelyatsii_2287154 ; articles from Habr on the same topic were posted on this forum even earlier.

There is a more interesting term: cross-entropy. It comes from statistics - a way of analyzing whether a predictor fits a variable, including nonlinear relationships.


1. No one is analyzing correlation here - it's about the choice of predictors.

2. You repeated my point from three pages earlier: "Dependence is a special case of correlation. If two variables are dependent, then there is definitely a correlation. If there is correlation, then there is not necessarily dependence."

3. Cross-entropy, just like correlation, will not answer whether a functional dependence is present.

 
Aliosha:

For XOR, a dataset can consist of just 4 samples; the essence does not change. {x,y,z}: x,y are features, z is the target

{-1,1,-1},{1,1,1},{1,-1,-1},{-1,-1,1}

Let's calculate the covariance of the first feature and the target. Since the mean of each variable is 0, we have: ((-1·-1) + (1·1) + (1·-1) + (-1·1))/4 = (1+1-1-1)/4 = 0. Obviously the correlation is also zero, and the same holds for the second feature - you can check - yet both features are more than valid for a nonlinear classifier.


Here are two predictors equally correlated with the target - which one do we throw out on the basis of lower correlation? Which one is "less correlated"?
 
Dimitri:


1. No one is analyzing correlation here - it's about the choice of predictors.

2. You repeated my point from three pages earlier: "Dependence is a special case of correlation. If two variables are dependent, then there is definitely a correlation. If there is correlation, then there is not necessarily dependence."

3. Cross-entropy, just like correlation, will not answer whether a functional dependence is present.

Inverse correlation is not a dependence? How can you even talk about dependence based on correlation curves, I don't get it... What kind of dependence can there be between the popcorn yield curve in the fields and the number of chicks hatched by diligent traders? Why would it be better for a neural network if the random correlation between unrelated phenomena is high?
 
Maxim Dmitrievsky:
Inverse correlation is not a dependence? How can you even talk about dependence based on correlation curves, I don't get it... What kind of dependence can there be between the popcorn yield curve in the fields and the number of chicks hatched by diligent traders? Why would it be better for a neural network if the random correlation between unrelated phenomena is high?


I don't get it.

What does inverse correlation have to do with it?

There are correlated quantities. Between some of them there may be a functional dependence, and between others the correlation may be spurious.

Once again: "Dependence is a special case of correlation. If two variables are dependent, then there is definitely a correlation. If there is correlation, then there is not necessarily dependence."

 

And again - as of today, there are no statistical methods for distinguishing a functional dependence from a spurious correlation.

Only analytical ones.

 
Dimitri:


I don't get it.

What does inverse correlation have to do with it?

There are correlated quantities. Between some of them there may be a functional dependence, and between others the correlation may be spurious.

Once again: "Dependence is a special case of correlation. If two variables are dependent, then there is definitely a correlation. If there is correlation, then there is not necessarily dependence."


Well, if two variables have an inverse correlation, then what? Like franc quotes with the euro. There is a correlation, but there is no dependence.
 
Maxim Dmitrievsky:

Well, if two variables have an inverse correlation, then what? Like franc quotes with the euro. There is a correlation, but there is no dependence.


I still do not understand - inverse correlation or no correlation?

Or, do you think that if two random series have a correlation coefficient of -1, then they "have no correlation"?

Sheesh.....
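For what it's worth, a quick check (my own sketch, not from the thread) that a correlation coefficient of -1 is a perfect inverse linear dependence, not an absence of correlation:

```python
# Sketch: corr = -1 means one series fully determines the other
# through an exact inverse linear relation.
import numpy as np

a = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
b = 10.0 - 2.0 * a  # exact inverse linear relation

corr = np.corrcoef(a, b)[0, 1]
print(corr)  # -1.0: maximal (inverse) correlation, full dependence
```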
