"New Neural" is an Open Source neural network engine project for the MetaTrader 5 platform. - page 75

 
joo:

Well, you can estimate it roughly. Run the FF once over the history, measure the time, and multiply by 10000. That gives a fairly realistic figure for what a full training run will cost you.


And another thing... what is there to lighten here? :)

Not quite accurate: you need to measure the total FF time separately and subtract it, to get the net execution time of the algorithm itself.

Then you can multiply by the number of parameters.

Cutlets separately, flies separately — keep the two measurements apart.

 
Urain:
I once meant to write a tester EA to train a small network with the Tester's GA, like the one I drew above — 6 weights, 3 neurons, the XOR problem — but I still haven't got around to it :)
I'm talking about:
a 100x1000x1 network, fully connected
 
her.human:

1) What to teach,

2) How many examples,

3) error, etc.?

1) For the sake of experiment - try to approximate some function, e.g. the one in the article.

2) Well, not less than 1000, I think.

3) ZFF — the smallest error relative to the surface of the test function.

 
joo:

And another thing... what is there to lighten here? :)

UGA is universal, suited to many tasks. It can be tailored and made lighter specifically for network training.
 
joo:

3) ZFF — the smallest error relative to the surface of the test function.

ZFF — I don't get it?
 
her.human:
ZFF — I don't get it?
The value of the FF, or VFF, to follow the terminology of the article.
 
joo:
The value of the FF, or VFF, to follow the terminology of the article.

"Smallest error" is a rather loose concept...

That's it, I'm off; I've been here too long already. If I have any more questions, I'll ask in private so as not to clutter the thread. I will post the results.

Hopefully Urain & yu-sha will decide on the architecture and network description.

 

The XOR problem solved by the standard Tester GA: on 100 examples, 14 discrete errors.

The code contains two networks: one with two neurons in cascade, and one with three neurons as in a classic MLP.

The step is given at the top, in the comments: for 7 parameters the step is 0.005; for 9 parameters it is 0.2-0.3 — input weights 0.2, free (bias) terms 0.3.

So it's just a toy, but a pretty one.

PPS: I'm an idiot — in the examples I labelled the corners 1 and the middle 0. And no matter how the network twists, it cannot produce a 1 from two zeros at the input.

To verify the network outputs: the error should equal zero in the discrete form, and tend to zero in the real-valued one.

PS: Oddly enough, I inverted the outputs and the error did not go away; it even increased slightly, to 16. Fine, I'll write that off to the GA :) Maybe I'm just sleepy.

Files:
NN_GA.mq5  9 kb
 

Today is Elder Day :)

Complete silence; everyone is poring over the interview.

 

Probably a silly question.

Is it possible to classify vectors whose dimensionality is not equal to N with a Kohonen map tuned to vectors of dimension N? After all, a person will put a sphere and a circle, a cube and a square, a pyramid and a triangle into the same class. I hope the idea is clear.