double iRSIOnArray(double array[], int total, int period, int shift)

 

Hi,

When I use this function, the value returned differs for different values of total.

In most examples, the value for total is the same as that returned by Bars. However, Bars depends on the number of bars per chart, which the user can customize via the Options menu.

So my question is: what is the minimal reasonable value for total, should one not want to use Bars?

The reason I am asking is that I want to use this function in an EA, not an indicator.

Thanks,

cook

 
Just use the size of the array: ArraySize(array)
 

Sorry, I wasn't asking about the size of the array. It is that when I call iRSIOnArray, the value returned depends on the value of "total", which I set to ArraySize(array). Different array sizes result in different values from iRSIOnArray. So my question is: why do the values depend on the size of the array?

If I use this in an EA, I certainly don't want a big array, because it takes more CPU time to build. In an indicator this is OK because the array is built incrementally.

So, can anyone explain why iRSIOnArray values depend on the size of the array?

 
It needs to know how big the array is so it doesn't read past the end of the valid data.
 

What do you mean? If used in an indicator with Bars as the count, it can be as big as whatever the user chooses for the chart, for example 50000.

In an EA, I want the array to be small, around 50, meaning I use 50 bars for the computation fed into iRSIOnArray.

So if I use 50000 bars in one instance and 50 in another, the output from iRSIOnArray differs. As an example:

int noBars = 50000;

double sourceBuffer[];
double destBuffer[];

// ....... some work here ....

for (int i = noBars - 1; i >= 0; i--)
{
   destBuffer[i] = iRSIOnArray(sourceBuffer, noBars, PERIOD_M15, i);   // PERIOD_M15 == 15, used here as the RSI period
}

After doing the above, accessing destBuffer[0] returns a value, say XXXXX.

If I do the same again but with noBars = 50 instead of 50000, accessing destBuffer[0] returns a slightly different value, say YYYYY.

So, my question is really: why is XXXXX different from YYYYY? From my tests, this clearly has something to do with the chosen size of sourceBuffer.

 
cook:

So, anyone can explain why iRSIOnArray values are dependent on the size of the array?

MT4's RSI calculation uses a smoothed average, and this type of average is dependent on the size and entire contents of the data set. Even if you ask for the RSI of the last 10 bars, the value you get is still affected by the 10,000th bar - albeit by a tiny amount under normal circumstances.

For another discussion of this, see https://www.mql5.com/en/forum/106620 and the Wikipedia description of Cutler's RSI versus the Wilder RSI which MT4 uses (https://en.wikipedia.org/wiki/Relative_Strength_Index#Cutler.27s_RSI)
 
jjc:
MT4's RSI calculation uses a smoothed average, and this type of average is dependent on the size and entire contents of the data set. Even if you ask for the RSI of the last 10 bars, the value you get is still affected by the 10,000th bar - albeit by a tiny amount under normal circumstances.

For another discussion of this, see https://www.mql5.com/en/forum/106620 and the Wikipedia description of Cutler's RSI versus the Wilder RSI which MT4 uses (https://en.wikipedia.org/wiki/Relative_Strength_Index#Cutler.27s_RSI)
Thank you jjc. Though I have long since returned all my knowledge of mathematics, I think I understand what you have pointed out. The RSI value depends on the number of bars used in the computation (Wilder RSI). So the approach to my problem would be to choose a number that is reasonable yet not compute-intensive.
 
cook:
Thank you jjc. Though I have long since returned all my knowledge of mathematics, I think I understand what you have pointed out. The RSI value depends on the number of bars used in the computation (Wilder RSI). So the approach to my problem would be to choose a number that is reasonable yet not compute-intensive.

Basically, yes. Any calculation involving a smoothed/exponential average will depend on the totality of the data available to it. In such a context, a parameter such as "10 bars" doesn't really mean 10 bars; the 10 is an expression of how much weight to give the most recent values.