Can anyone explain this, or is it a bug?


Running the same test with all the same settings produces an identical result except for the Sharpe ratio.

The only difference between the runs is the timeframe set for the test (the indicator timeframes are set independently in the code, so they are not affected by the tester timeframe).

How is this possible? Is this a known bug?

Which Sharpe ratio would be the correct one?

(See the attached 1D and 1H results; the 1M run would give half of this, and for test periods of a year the differences can be much larger.)

Files:
D1.png  46 kb
H1.png  46 kb

I confirm the issue.

Not sure if it's a bug, but it's strange for sure. I suspect the tester timeframe influences the Sharpe ratio calculation.

I will bring it to MetaQuotes' attention.
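If that is what is happening, the pattern is easy to reproduce outside the tester: a Sharpe ratio computed from per-bar returns without annualization scales with the square root of the number of bars, so H1 and D1 runs of the same strategy report different values. Here is a minimal Python sketch of that effect (the i.i.d. simulated returns, 24 hourly bars per trading day, and 252 trading days are all assumptions for illustration; this is the textbook per-bar formula, not MetaQuotes' actual code):

```python
import numpy as np

# Simulate one year of hourly strategy returns (i.i.d. normal is an
# assumption for illustration; real equity curves are not i.i.d.).
rng = np.random.default_rng(42)
hourly = rng.normal(loc=0.0001, scale=0.002, size=24 * 252)

# The same P&L stream aggregated to daily bars (returns treated as additive).
daily = hourly.reshape(252, 24).sum(axis=1)

def sharpe(returns):
    # Per-bar Sharpe: mean return over its standard deviation, no annualization.
    return returns.mean() / returns.std(ddof=1)

sr_h1, sr_d1 = sharpe(hourly), sharpe(daily)
print(f"per-bar Sharpe on H1 bars: {sr_h1:.4f}")
print(f"per-bar Sharpe on D1 bars: {sr_d1:.4f}")
print(f"D1/H1 ratio: {sr_d1 / sr_h1:.2f}  (sqrt(24) = {np.sqrt(24):.2f})")

# Annualizing removes the timeframe dependence:
print(f"annualized from H1: {sr_h1 * np.sqrt(24 * 252):.2f}")
print(f"annualized from D1: {sr_d1 * np.sqrt(252):.2f}")
```

Both annualized numbers agree up to sampling noise, which would explain how the two reports can each be internally consistent and still differ by a factor of roughly sqrt(24) ≈ 4.9.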

Alain Verleyen #:

I confirm the issue.

Not sure if it's a bug, but it's strange for sure. I suspect the tester timeframe influences the Sharpe ratio calculation.

I will bring it to MetaQuotes' attention.

Thanks a lot, Alain!

Did you get any explanation from them? What would be the best timeframe to use to get an accurate Sharpe ratio, in your opinion?
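(For anyone who lands here later: if the difference really is just sampling frequency, then the standard conversion SR_annual = SR_per_bar * sqrt(bars per year) should reconcile the reports, e.g. roughly *sqrt(252) for D1 and *sqrt(24 * 252) for H1, assuming a 24-hour market and 252 trading days, so no single timeframe is privileged once you annualize.)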