"New Neural" is an Open Source neural network engine project for the MetaTrader 5 platform. - page 60

 
gpwr:

Guys, I won't be coming here very often. If you have questions or an interest in joint development, write to my yahoo email (listed in my profile).

Good luck with the EngiNeuro project!

Thanks, Vladimir! Come by more often :)

Urain:

All at once

I also used to think that example by example was best. But no. Gradient algorithms are designed for the total error, so if you feed the examples one at a time, it is not exactly gradient descent.

So, only all at once, at least for feed-forward.

 
TheXpert:

Thanks, Vladimir! Come by more often :)

I also used to think that example by example was best. But no. Gradient algorithms are designed for the total error, so if you feed the examples one at a time, it is not exactly gradient descent.

So, only all at once, at least for feed-forward.


Yes, it depends on the implementation of the algorithm. I'm going over it in my head right now, checking whether I have forgotten anything.
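
To make the point above concrete, here is a minimal sketch of the two training modes for a single linear neuron (all names are illustrative, not part of the project's code): in per-example mode the weight moves after every pattern, while in batch mode the gradient is accumulated over the whole sample and applied in one step.

// Sketch only: one weight, MSE-style updates, eta = learning rate.
void TrainPerExample(double &w[], const double &x[], const double &t[], const double eta)
  {
   for(int i=0; i<ArraySize(x); i++)
     {
      double y = w[0]*x[i];          // network answer for one example
      w[0]   += eta*(t[i]-y)*x[i];   // weight moves at once - not the gradient of the total error
     }
  }

void TrainBatch(double &w[], const double &x[], const double &t[], const double eta)
  {
   double grad = 0.0;
   for(int i=0; i<ArraySize(x); i++)
     {
      double y = w[0]*x[i];
      grad += (t[i]-y)*x[i];         // accumulate over the whole training sample
     }
   w[0] += eta*grad;                 // one step along the gradient of the total error
  }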
 

Notes on NS training

1) Fitness function (learning functional)

Any NS training process comes down to finding the maximum (minimum) of a functional in the space of adjustable arguments (in our case, the space of weights).

2) MSE

The classical version of the functional is MSE (mean squared error).

For it, the partial derivatives with respect to each adjustable parameter are easily found, which allows gradient-based adjustment of the weights.

For each example fed to the input of the NS we get the network's answer y, and we have a "correct answer" y'.

For the stepwise training method the weights are adjusted after each example. In this case MSE = |y - y'|.

For the batch method the weights are adjusted after a run through the entire training sample. Here MSE = SQRT( SUM (y - y')^2 ).

The main disadvantage of MSE is that you need to have the "correct" answer for each example.

This functional is useful, for example, for reconstructing an "unknown" indicator algorithm when its "correct" values on each bar are known.
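
As a sketch of the two error measures above (the function names and the y[]/t[] arrays of network answers and correct answers are made up for the example):

// Stepwise version: error of a single example.
double StepError(const double y, const double t)
  {
   return(MathAbs(y - t));                 // |y - y'|
  }

// Batch version: error over the whole training sample.
double BatchMSE(const double &y[], const double &t[])
  {
   double sum = 0.0;
   for(int i=0; i<ArraySize(y); i++)
      sum += (y[i]-t[i])*(y[i]-t[i]);      // SUM (y - y')^2
   return(MathSqrt(sum));
  }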

3) Non-standard functionals

A wider range of tasks can be solved using non-standard functionals.

For example, to search for synthetic trading strategies you can use neural nets that are trained to maximize a functional such as F = Profit / MaxDrawdown.

Here the evaluation is performed after a full run over the training period, because the performance of a strategy can be estimated only at the end of the period.

A very nice property: we do not need to have a correct answer for every bar.

There is an unpleasant "but": there is no way to find the partial derivative of the functional with respect to each weight, so gradient methods of weight adjustment are not applicable; we have to use stochastic methods such as GA.

This is where GPUs come in - there is no budget alternative to them yet
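
A sketch of how such a functional could be computed from the equity curve of a test run (the Fitness name and the equity[] array are illustrative; this value would then be maximized by a GA):

// F = Profit / MaxDrawdown, evaluated only after the full run over the period.
double Fitness(const double &equity[])
  {
   int n = ArraySize(equity);
   if(n < 2) return(0.0);
   double profit = equity[n-1] - equity[0];
   double peak   = equity[0];
   double maxdd  = 0.0;
   for(int i=1; i<n; i++)
     {
      if(equity[i] > peak) peak = equity[i];
      double dd = peak - equity[i];          // current drawdown from the last peak
      if(dd > maxdd) maxdd = dd;
     }
   if(maxdd <= 0.0) return(0.0);             // guard against division by zero
   return(profit/maxdd);
  }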

P.S.

Unlike an optimizer of the parameters of one particular trading strategy, neural networks make it possible to "grow" fully synthetic trading strategy algorithms )))

And the potential of such an approach is very great

 

I am preparing a function to save the grid to a bin file. All the information is easily encoded into an array of ulong values, but I can't figure out what to do with the weights.

They are doubles, and I would like to use the standard array save function.

Does anyone have an algorithm for reversibly encoding a double into a ulong?


I think the grid should be saved in a simple format convenient for MQL5 (taking that as the base format), and the converters to the different formats of the various neural network packages can then be written from it.

 
Urain:

I am preparing a function to save the grid to a bin file. All the information is easily encoded into an array of ulong values, but I can't figure out what to do with the weights.

They are doubles, and I would like to use the standard array save function.

Does anyone have an algorithm for reversibly encoding a double into a ulong?


I think the grid should be saved in a simple format convenient for MQL5 (taking that as the base format), and the converters to the different formats of the various neural network packages can then be written from it.


Why doesn't FileWriteArray suit you? I don't understand the problem. Can you give an example?

In any case, put the grid configuration in one file and the weights in another.

I don't see what could be more convenient for MQL. And I don't see why a reversible encoding is needed; it is just unnecessary slowdown.

 
her.human:

Why doesn't FileWriteArray suit you? I don't understand the problem. Can you give an example?

In any case, put the grid configuration in one file and the weights in another.

I don't see what could be more convenient for MQL. And I don't see why a reversible encoding is needed; it is just unnecessary slowdown.

Yes, it will be slow (by my estimate, encoding 1,000 weights will take about 5 seconds), but I just want to store both the weights and the grid structure in one file; otherwise the files get scattered all over the place and even the devil couldn't sort them out. That is where I see the convenience.

I would be happy to gear it to FileWriteArray, but there is a ulong array forming the network description (number of layers, number of neurons, neuron types, the connections between them) and, attached to it, an array of weights, which are doubles,

so how do I cram all of that into one bin file (given that there is no explicit partitioning: the partitioning itself is encoded in the first numbers of the grid description)?
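
For what it's worth, one possible layout with the standard file functions (only a sketch, the function name and parameters are made up): write the two array sizes first, then the description, then the weights; on reading, those first numbers tell you how to split the file back.

// Sketch: ulong description and double weights in a single .bin file.
bool SaveNet(const string fname, const ulong &descr[], const double &weights[])
  {
   int h = FileOpen(fname, FILE_WRITE|FILE_BIN);
   if(h == INVALID_HANDLE) return(false);
   FileWriteLong(h, ArraySize(descr));      // sizes first, so the file can be split when read back
   FileWriteLong(h, ArraySize(weights));
   FileWriteArray(h, descr);                // network description
   FileWriteArray(h, weights);              // weights stay as double - no conversion to ulong needed
   FileClose(h);
   return(true);
  }

Reading is the mirror image: two FileReadLong calls for the sizes, ArrayResize, then FileReadArray for each array.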

 
Urain:

0. I am preparing a function to save the grid to a bin file. All the information is easily encoded into an array of ulong values, but I can't figure out what to do with the weights.

They are doubles, and I would like to use the standard array save function.

1. Does anyone have an algorithm for reversibly encoding a double into a ulong?


I think the grid should be saved in a simple format convenient for MQL5 (taking that as the base format), and the converters to the different formats of the various neural network packages can then be written from it.

0. It's very premature. First, at the logical level, you absolutely need to clarify and agree on the full set of data that unambiguously maps to the structure and configuration of the grid. Physically saving it is not a problem at all.

1. That's easy. In mql5 there is a special loophole for such conversions: structures of different types can be assigned to each other without restriction, as long as they have the same size.

// ulong and double are exactly the same size.

Check out a perverse example here: https://www.mql5.com/ru/forum/3775/75737#comment_75743
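
A sketch of that trick, assuming (as the post above says) that same-size simple structures can be cast to each other; the structure and function names here are invented:

struct SDouble { double value; };   // 8 bytes
struct SUlong  { ulong  value; };   // 8 bytes

// Reversible "encoding" of a double into a ulong: copy the raw bits via a same-size struct cast.
ulong DoubleToUlong(const double d)
  {
   SDouble sd; sd.value = d;
   SUlong  su = (SUlong)sd;         // bitwise copy, nothing is lost
   return(su.value);
  }

double UlongToDouble(const ulong u)
  {
   SUlong  su; su.value = u;
   SDouble sd = (SDouble)su;
   return(sd.value);
  }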

2. Mmmm... On the one hand I agree: you need a convenient and simple format; on the other hand it should be as universal as possible, like xml. Maybe plan for two variants (mapped to each other as unambiguously as possible), one text and one binary. In general, see point 0.

---

I have been circling around this thread, hesitating whether or not to join in... Well, I couldn't resist.

A thought has been gnawing at me. One. // I mean, I have plenty of thoughts, but only one of them gnaws. :)

The idea is this: the grid code should be generated after preliminary configuration in a "grid editor" (configurator). // This idea has been suggested many times before, and I don't remember it being rejected.

Hence the scheme: a mandatory intermediate representation (for example, an xml file) containing complete information about the structure of the neural network.

The intermediate representation is carefully thought through, analyzed for completeness and other subtleties, then approved and frozen.

Only after that can you code (separately): (1) all sorts of grid configurators, and (2) code generators that translate the intermediate representation into mql5 code.

And there can be several implementations of both, which is good and proper.
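
Purely as an illustration, such an intermediate representation might look something like this (the tags and attributes are invented here, not a proposal for the final schema):

<network name="demo" inputs="5" outputs="1">
  <layer index="0" neurons="5" type="input"/>
  <layer index="1" neurons="3" type="sigmoid"/>
  <layer index="2" neurons="1" type="linear"/>
  <links from="0" to="1" topology="full"/>
  <links from="1" to="2" topology="full"/>
</network>

A code generator would read something like this and emit the corresponding mql5 code; a grid configurator would only ever write it.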

 
Urain:

Yes, it will be slow (by my estimate, encoding 1,000 weights will take about 5 seconds), but I just want to store both the weights and the grid structure in one file; otherwise the files get scattered all over the place and even the devil couldn't sort them out. That is where I see the convenience.

I would be happy to gear it to FileWriteArray, but there is a ulong array forming the network description (number of layers, number of neurons, neuron types, the connections between them) and, attached to it, an array of weights, which are doubles,

so how do I cram all of that into one bin file (given that there is no explicit partitioning: the partitioning itself is encoded in the first numbers of the grid description)?

I'm categorically against shoving everything into one file. The network description separately, the weights separately. Otherwise there will be further unnecessary problems.

 
Why bin? Wouldn't a plain text file be better, so you could look at it with your eyes?
 
joo:
Why bin? Wouldn't a plain text file be better, so you could look at it with your eyes?

Sure. I was talking about xml from the beginning. You could also use json.

And storage is simple: every class that is part of the network inherits from a serialization interface.
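
A minimal sketch of what such an interface could look like in MQL5 (the class and method names are invented; each network part writes and reads its own data against an already opened file handle):

// Hypothetical serialization base class: every part of the network overrides Save/Load.
class CSerializable
  {
public:
   virtual bool      Save(const int h) { return(true); }   // write own data to an open bin file
   virtual bool      Load(const int h) { return(true); }   // read it back in the same order
  };

// Example: a layer stores the count of its weights and then the weights themselves.
class CLayer : public CSerializable
  {
protected:
   double            m_weights[];
public:
   virtual bool      Save(const int h)
     {
      FileWriteLong(h, ArraySize(m_weights));
      return(FileWriteArray(h, m_weights) > 0);
     }
   virtual bool      Load(const int h)
     {
      int n = (int)FileReadLong(h);
      ArrayResize(m_weights, n);
      return(FileReadArray(h, m_weights, 0, n) > 0);
     }
  };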