"New Neural" is an Open Source neural network engine project for the MetaTrader 5 platform. - page 20

 
Urain:
Wait, you have listed optimization methods here, but not all optimization methods are suitable for training. For example, Newton's method is defined only for a function of known form; if you don't know the form of the function, you cannot compute its Hessian directly, which is exactly why quasi-Newton methods are used instead (I can't say at a glance about the others, but I suppose they have their own limitations too).
Of course, that is why I put the word universal in quotes. And these three are universal without quotation marks.
 
TheXpert:
The fundamental point here is that all these methods require additional memory for learning.
By the way, now that we're talking, tell the people what training methods you're sculpting?
 
Urain:

I personally can only remember Leov. Maybe someone knows somebody else on this forum whose brains could be picked on the programming side?

gpwr

AlexMosc

Neutron, but he has a somewhat unusual approach :)

 
Urain:
By the way, while we're on the subject, tell the people what training methods you're sculpting?

Each network has its own. I, by the way, am only in favor of expanding the list. But special methods for each network.

I'm not making any. I'm not making anything yet. The code above is C++ with an MLP.

And I'm for broadening the list of target functions! Who has something to offer?

 
TheXpert:

gpwr

AlexMosc

Neutron, but he has a somewhat unusual approach :)

gpwr is already in the loop; he has appeared in this thread.
 
TheXpert:

Each network has its own. By the way, I'm only in favor of expanding the list. But with special methods for each network.

I'm not making any. I'm not making anything yet. The code above is C++ with an MLP.

And I'm for broadening the list of target functions! Who's got something to offer?

And the target function should probably be kept detachable and written last, so that it is easy to unhook and swap: everyone's may be different, and not every trader will share his hard-earned FF.

P.S. For now, train on pictures: triangles and circles.

 
Urain:

And the target function should probably be kept detachable and written last, so that it is easy to unhook and swap: everyone's may be different, and not every trader will share his hard-earned FF.

That's unlikely too. The training algorithms depend directly on the target function. We will have to think about it. But to begin with, the target functions have to be proposed :)

________________

Who has dealt with LSTM?

Long short term memory - Wikipedia, the free encyclopedia
Long short term memory (LSTM) is a recurrent neural network (RNN) architecture (an artificial neural network) published in 1997 by Sepp Hochreiter and Jürgen Schmidhuber. Like most RNNs, an LSTM network is universal in the sense that given enough network units it can compute anything a conventional computer can compute, provided it has the...
 
TheXpert:
That's unlikely too. The training algorithms depend directly on the target function. We will have to think about it. But first we have to propose them :)
Are we talking about the same thing? The target function is the one that calculates the network's output error.
 

Something about the text of the last pages: joo turns out to be right, having noted the survey option "give everything to one person for execution".
For a project of our ambitions, there are no experts on this forum who could decompose everything and determine the scope of the programming.
Renat is also right, who advised looking for these very experts on other thematic forums.

So far I see the following solution: we need a network expert, and he will have to be found somewhere. Renat said that the project has a budget, so we need to look for a network expert and pay him for his advice.
We won't be able to do this before someone from the MetaQuotes management looks into this thread and we get an admin from them.

On the other hand, before looking for a specialist, we need to know our goals clearly and know what to ask of him. I have described my vision, and joo's is exactly the same.
Let it be the goal of this neural network engine:

Yes, the library should be universal, like lego set - you can assemble anything you want.

Yes, the library should be easy to use, complete with a template expert. Easy to use so much that it can be used by a non-programmer.

Yes, the library should have a universal input and output interface so that everything can be connected to it, starting from indicator values to...

In general, it should be a tool like a Swiss Army knife, but not a magic wand (however much we would like one, that's not possible).


Yes, this is what we need. That's why we need a specialist who can fit two things to this engine - a variety of topologies and training methods.

Andrew's (TheXpert's) doubts about this being utopian should be postponed until the verdict of the hired specialist, the project administrator, and the final council of participants is announced. And as a consequence, we will have to adjust our goals to achieve at least something similar.

In the meantime, the theme remains valid.
 
sergeev:

Something about the text of the last pages: joo turns out to be right, having noted the survey option "give everything to one person for execution".
For a project of our ambitions, there are no experts on this forum who could decompose everything and determine the scope of the programming.
Renat is also right, who advised looking for these very experts on other thematic forums.

So far I see the following way out. We need a network specialist, and he must be found somewhere. Renat said that the project has a budget, so we should look for a specialist and pay him for his advice.

On the other hand, before looking for an expert, we need to know our goals clearly and know what to ask of him. I have described my vision, and joo repeated it almost word for word.
Let this be the goal of this neural network engine:


Yes, that's what we need. That is why we need a specialist, who can fit two things to this engine - variety of topologies and training methods.

Andrew's (TheXpert's) doubts about this being utopian should be postponed until the verdict of the hired specialist, the project administrator, and the final council of participants is announced. In the meantime, the topic remains valid.

And as a consequence, we will have to adjust our goals to achieve at least something similar.

Let's build the graphical engine, create a universal mesh (a couple of variants), and then ask an expert whether such mesh algorithms can lead to a unification of the training algorithms. I write "unification" because it is clear that a single universal learning algorithm won't work, but I am sure it is possible to make a learning algorithm as easily transformable as the network itself (to be honest, my confidence here rests more on faith than on facts).