A quick and free library for MT4, much to the delight of neural network fans - page 8

 
newerty >> :

Please advise. How do I make my EA trade several symbols at once?

For example gold, gbpusd, nzdusd, audusd etc...

On each pair, obviously different S/L.

Clone MT4... clone the EA... and run them all at the same time?

Just put the EA on different charts.

StopLoss on each pair is different.


The only thing you cannot do is attach another EA to a pair already occupied by this EA, or trade that pair manually, because magic numbers are not used. In other words: no more than one EA per pair.

 

Yuri, could you please give me an example of the settings (timeframe, period, etc.) that give you a large number of trades?

 
Solver.it >> :

Yuri, could you please give me an example of the settings (timeframe, period, etc.) that give you a large number of trades?

Page 6 of this thread has an excerpt from a backtest; all the settings are stated there.

 

If the line

if (IsOptimization() || IsTesting()) {


is replaced by

if (IsOptimization()){


then the results of a single run become more stable.

I ran into a different problem: the network fits the data very quickly (the equity curve in the tester is straight as a ruler), but forward tests and backtests show a curve of a very different character.

 
Kharin >> :

If the line

if (IsOptimization() || IsTesting()) {


is replaced by

if (IsOptimization()){


then the results of a single run become more stable.

I ran into a different problem: the network fits the data very quickly (the equity curve in the tester is straight as a ruler), but forward tests and backtests show a curve of a very different character.

This is self-explanatory: removing that condition disables adaptation in test mode. But I left this feature in the EA on purpose. The more unstable the results across different runs of the adapted test, the more likely it is that the network has not learned anything in particular: one more epoch, and everything already looks quite different. That is, if unstable results show that the network is unsure even on the test sample, there is no point even talking about forward tests, since the networks have never seen those quotes at all.

 
It's a matter of taste, of course, but I would put this choice into the external variables. As it happens, that is what I have done)))
 
Reshetov >> :

I repeat: this line carries no information load. The sign of ret does not change, and trades are opened depending on whether ret is positive or negative.

The sign part is obvious. It is also obvious that without the factor of two this function returns a meaningful value, the averaged response of the committee of networks, whereas with the factor of two it returns nonsense. Given that the same function is also called during normal operation of the networks after training, and that it is better to reject small values of ret (both positive and negative) without opening trades on them, this line really does carry important information.

You still haven't answered: why do you train the networks only on negative examples?

 
Kharin >> :

I ran into a different problem: the network fits the data very quickly (the equity curve in the tester is straight as a ruler), but forward tests and backtests show a curve of a very different character.

Yes, in its current form the EA does not assess training quality at all. If the data-collection logic were changed, a couple of calls could be inserted: f2M_test (with validation data, not training data) and f2M_get_MSE, stopping training as soon as the error starts to grow.

 

Yuri, I want to ask an off-topic question: is it possible to set up a separate network for the SL (say, predict volatility and adjust the SL to it)?

Maybe that would help the other networks learn more consistently?

 

Yuri, I think I've found another inaccuracy in the code... I was digging through my code because of odd training results and found this:

double ann_pnn() {
...
    ret = 2 * ret / AnnsNumber;

It should be:

ret = ret / AnnsNumber;

The point is that the author of the library, for reasons I do not understand, split the networks in his EA into two halves for short and long positions, even-indexed and odd-indexed respectively, with the corresponding loops:

for (i = 0; i < AnnsNumber; i += 2) - for the even ones, WITH INCREMENT "2" !!!
for (i = 1; i < AnnsNumber; i += 2) - for the odd ones

Hence the two in the denominator. In our case it is not needed, although clearly it will not have much effect on the training results...

The meaning of this loop (in the ann_pnn and run_anns functions) escapes me completely...

for (i = 0; i < AnnsNumber; i++) { ret += AnnOutputs[i]; }

If each network has a single output neuron, where do the 16 outputs come from?!... Or is it a committee of 16 networks? That is what I'm leaning towards... Then the question is: what is it for? I've left this piece unchanged for now as well, until I finally figure out its meaning... Does anybody have any ideas about it? Please share...