Absolute rates - page 25

 
Dr.F.:

It's hard to make a more moronic assumption. Of course they can drop somewhat, even as low as 0.8. But over intervals of price change within a day or two, in absolute numbers, the agreement of the chart segments will be as good as the one already shown. Don't get hung up on the value of the correlation: by itself it means nothing, and it depends to some extent on the length of the averaging interval over which you calculate it. That's not what we're talking about.

Oh, very curious to know what meaning the correlation coefficient carries, then, if not its value.


As low as 0.8? Prove it. I'd put it at 0.2, and cautiously at that.
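
A minimal sketch of the disputed dependence, assuming two correlated random walks stand in for the price series (the walks, the seed and the window lengths are illustrative, not data from the thread):

import numpy as np

rng = np.random.default_rng(0)
n = 10_000
common = rng.normal(size=n)                       # shared driver of both "prices"
a = np.cumsum(common + 0.5 * rng.normal(size=n))
b = np.cumsum(common + 0.5 * rng.normal(size=n))

# The same pair of series gives noticeably different Pearson
# coefficients depending on the length of the interval used.
for window in (100, 1_000, 10_000):
    r = np.corrcoef(a[:window], b[:window])[0, 1]
    print(f"window {window:>6}: r = {r:+.3f}")

Whatever the figure, it moves with the window, so a bet on 0.8 versus 0.2 is only meaningful once the interval is fixed.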

 
Dr.F.:
Well then, produce the THIRD EQUATION right here and show HOW I got the curves: https://forum.mql4.com/ru/54199/page23

You keep making a mockery of yourself, my friend. We could have had a serious discussion.
 
So from 25 July '12 to 6 February '13 the euro and the yen both move in the same direction, i.e. either both get cheaper or both get more expensive, right?
 
alsu:

Oh, very curious to know what meaning the correlation coefficient carries, then, if not its value.

As low as 0.8? Prove it. I'd put it at 0.2, and cautiously at that.


Once again, colleague: if your chart consists of 10 thousand chunks, each chunk's shape is known to a correlation accuracy of 0.9999, and the scale is renormalized at the start of every chunk, i.e. the chunks are stitched together by scale, isn't it obvious that the error accumulates only through the mismatch building up over time? At the start of the chart the agreement is perfect, then it gets worse, and the error accumulates in ever larger steps. That is NOT the point; it is a purely technical problem, because the correlation on the chunks is 0.9999, and if it were 0.9999999999999999999999999 there would be no problem at all. You don't need to know the values of E, D, Y exactly relative to a benchmark ten years in the past in order to trade. It's enough to take any benchmark, even one on the very last bar (invert the rates in time, do the same thing that has already been done, and invert back). You are clinging to a question of no relevance whatsoever and refuse to understand the trick I'm pointing out here.
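
A minimal sketch of the stitching being described (an illustration under assumed numbers, not Dr.F.'s actual code): each chunk's shape is recovered almost perfectly, the scale is renormalized at every boundary, and the tiny per-chunk scale errors compound into a slow drift:

import numpy as np

rng = np.random.default_rng(1)
n_chunks, chunk_len = 10_000, 10
true = np.exp(np.cumsum(rng.normal(0, 1e-3, n_chunks * chunk_len)))

pieces, level = [], 1.0
for k in range(n_chunks):
    seg = true[k * chunk_len:(k + 1) * chunk_len]
    shape = seg / seg[0]                          # chunk shape, known almost exactly
    noisy = shape * (1 + rng.normal(0, 1e-4, chunk_len))
    noisy /= noisy[0]                             # anchor the chunk at the running level
    pieces.append(level * noisy)
    level = pieces[-1][-1]                        # renormalize the scale at the boundary

stitched = np.concatenate(pieces)
print("corr on one chunk :", np.corrcoef(pieces[0], true[:chunk_len])[0, 1])
print("drift at chart end:", stitched[-1] / true[-1])   # wanders away from 1

Per-chunk agreement stays near 1 while the end of the stitched chart can sit a few percent off the true level: exactly the purely technical accumulation conceded above.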
 
grell:
So from 25 July '12 to 6 February '13 the euro and the yen both move in the same direction, i.e. either both get cheaper or both get more expensive, right?

The author claims that this happens at all times and for all currencies.
 
grell:
So from 25 July '12 to 6 February '13 the euro and the yen both move in the same direction, i.e. either both get cheaper or both get more expensive, right?

0_o How did you draw such a sweeping conclusion from pictures covering an arbitrary 12 hours??? 0_o Are there any sane people here at all?
 
alsu:

The author claims that this happens at all times and for all currencies.

Colleague, you are inadequate.
 
Joperniiteatr:
I think I've figured it out: there are the two pair rates (EURUSD and USDJPY), and from them we need to build the three series E, D, Y so that the CC between the reconstructed crosses and the real ones is (almost) 1, with the additional condition that the dollar equals 1 at 144 bars back, so the euro starts from 1.31 and the yen from roughly 0.010815. Right?

Yep. It's got through to one of you.
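
A minimal sketch of the setup as just stated (a reconstruction of the arithmetic, not Dr.F.'s code; the closing condition below is a placeholder, since the actual "third equation" is exactly what the thread is arguing about): any choice of the dollar series D reproduces the quoted pairs identically via E = D*EURUSD and Y = D/USDJPY, and the benchmark D = 1 at 144 bars back pins the starting values:

import numpy as np

def decompose(eurusd, usdjpy, anchor=-144):
    # Placeholder closing condition (E*D*Y held constant); NOT
    # necessarily the thread's third equation.
    d = (usdjpy / eurusd) ** (1.0 / 3.0)
    d = d / d[anchor]               # benchmark: dollar = 1 at 144 bars back
    e = d * eurusd                  # then E/D reproduces EURUSD exactly
    y = d / usdjpy                  # and D/Y reproduces USDJPY exactly
    return e, d, y

rng = np.random.default_rng(2)
eurusd = 1.31 * np.exp(np.cumsum(rng.normal(0, 0.003, 500)))
usdjpy = 92.46 * np.exp(np.cumsum(rng.normal(0, 0.003, 500)))
e, d, y = decompose(eurusd, usdjpy)

print(d[-144])                      # 1.0 by construction
print(e[-144], eurusd[-144])        # equal: the euro starts at the pair's value
print(y[-144], 1 / usdjpy[-144])    # equal: the yen starts near 0.0108

Since the identities E/D = EURUSD and D/Y = USDJPY hold for any D, the near-unit CC against the real crosses comes for free; everything of substance sits in the closing condition that picks D.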
 
Dr.F.:

Once again, colleague: if your chart consists of 10 thousand chunks, each chunk's shape is known to a correlation accuracy of 0.9999, and the scale is renormalized at the start of every chunk, i.e. the chunks are stitched together by scale, isn't it obvious that the error accumulates only through the mismatch building up over time?

No, what is obvious is that I can take those chunks even when each consists of a single sample, and the correlation on each will be 100%, but that gives me no right whatsoever to claim it is 100% on the entire series. It gives you no information at all, precisely because you renormalize at every step, which means the neighbouring chunks are in no way tied to each other.


You are clinging to a question of no relevance whatsoever and refuse to understand the trick I'm pointing out here.

I figured that out long ago; it's you who can't see what you are doing. Yes, I can take chunks of history and renormalize them however I please and obtain any CC I want, even 0.99, even 0. But since I can do this in a thousand ways (take an interval length of 100 candles, 101, ..., 1100, ...) and get a DIFFERENT "true" series each time, the procedure itself has no practical value until it is proved that the parameters of the transformation do NOT depend on the sample length, or depend on it only within an acceptable range. Once you understand this, stop playing with nonsense and see your research through to the end.
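
A minimal sketch of that objection, assuming two deliberately unrelated random walks (an illustration, not alsu's code): pinning each chunk of one walk to the other's level manufactures a near-unit CC out of nothing, and the result changes with the chunk length:

import numpy as np

rng = np.random.default_rng(3)
n = 10_000
a = np.exp(np.cumsum(rng.normal(0, 1e-3, n)))   # two unrelated walks
b = np.exp(np.cumsum(rng.normal(0, 1e-3, n)))

def renormalized(b, a, chunk_len):
    out = b.copy()
    for i in range(0, len(b), chunk_len):
        seg = slice(i, i + chunk_len)
        out[seg] *= a[i] / b[i]      # pin each chunk to a's level
    return out

# Unrelated series, yet the chunk-renormalized CC sits close to 1,
# and every chunk length yields a different series and a different CC.
for chunk_len in (100, 101, 1_000):
    r = np.corrcoef(a, renormalized(b, a, chunk_len))[0, 1]
    print(f"chunk length {chunk_len:>5}: overall r = {r:.4f}")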
 
Dr.F.:

Colleague, you are inadequate.

You posted the pictures, that's one. Whichever of the 144 samples you take, all three currencies step in the same direction, that's two. You predict a correlation coefficient of at least 0.8 on large samples, and after that you consider yourself adequate? Nuh-uh. :)))