"New Neural" is an Open Source neural network engine project for the MetaTrader 5 platform. - page 3

 
TheXpert:

You probably won't be able to fully interface all the networks, although you should try.

If you put all the functions in a base class and make them virtual, you can build a flexible abstraction.
 
TheXpert:

6. Models for fuzzy logic (not to be confused with probabilistic networks). I have not implemented them, but they may be useful. If anyone can find information, please send it to me. Almost all models have Japanese authorship. Almost all of them are built by hand, but if it were possible to automate the topology from a logical expression (if I remember correctly), that would be really cool.

Are we talking about the Self-Organizing Incremental Neural Network?
 
sergeev:
If you put all the functions in a base class and make them virtual, you can build a flexible abstraction.

You can't take such a clumsy approach. Why would a Kohonen network need the virtual network-topology functions an MLP uses?

Only basic functions can be abstracted, like:

- distribute the signal (run the inputs)

- train

- add training patterns

- report the error

- save to / load from a file
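As a sketch of this idea (the class and method names below are hypothetical, not from the project), the shared operations listed above could form an abstract base class, with each network type overriding only that interface:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical sketch: only the operations shared by every network type
// are declared virtual in the base class; topology stays in the heirs.
class NeuralNetBase {
public:
    virtual ~NeuralNetBase() {}
    // distribute the signal: run the inputs through the network
    virtual std::vector<double> Run(const std::vector<double>& inputs) = 0;
    // add a training pattern (input/target pair)
    virtual void AddPattern(const std::vector<double>& in,
                            const std::vector<double>& target) = 0;
    // train on the stored patterns
    virtual void Train(int epochs) = 0;
    // report the current error
    virtual double Error() const = 0;
    // save/load the network state
    virtual bool Save(const char* filename) const = 0;
    virtual bool Load(const char* filename) = 0;
};

// Minimal illustrative heir: a single linear neuron, just to show that a
// concrete network only has to override the shared interface.
class LinearNeuron : public NeuralNetBase {
    std::vector<double> w_;
public:
    explicit LinearNeuron(const std::vector<double>& w) : w_(w) {}
    std::vector<double> Run(const std::vector<double>& in) override {
        double s = 0.0;  // weighted sum of the inputs
        for (std::size_t i = 0; i < w_.size(); ++i) s += w_[i] * in[i];
        return std::vector<double>(1, s);
    }
    void AddPattern(const std::vector<double>&,
                    const std::vector<double>&) override {}
    void Train(int) override {}                  // stub: no learning rule here
    double Error() const override { return 0.0; }
    bool Save(const char*) const override { return false; }  // persistence omitted
    bool Load(const char*) override { return false; }
};
```

Callers would then hold a `NeuralNetBase*` and never care which topology sits behind it.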

 
progma137:
You're not talking about the Self-Organizing Incremental Neural Network, are you?
No.
 
TheXpert:

You can't take such a clumsy approach. Why would a Kohonen network need the virtual network-topology functions an MLP uses?

Only basic functions can be abstracted, like

Of course, that's what we're talking about.

But functions like "CreateNet" should also be in the base class. How they are implemented in the derived classes, and what the topology will be, is up to the derived classes themselves.

 
sergeev:

But functions like "CreateNet" should also be in the base class. How they are implemented in the derived classes, and what the topology will be, is up to the derived classes themselves.

No, that won't work. In practice, interfaces are only needed for merging networks into committees, and since you can pass the committee a pointer to an already-created network, you don't even need "CreateNet" there.
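A rough illustration of that point (all names here are invented for the sketch): the committee only needs a minimal "run" interface and receives pointers to networks constructed elsewhere, so creation never appears in the interface:

```cpp
#include <cstddef>
#include <vector>

// Invented names for illustration: the committee never constructs networks
// itself; it only aggregates pointers to networks that were built elsewhere.
struct INet {
    virtual ~INet() {}
    virtual double Run(const std::vector<double>& in) = 0;
};

class Committee {
    std::vector<INet*> members_;
public:
    // Pass a pointer to an already-created network.
    void Add(INet* net) { members_.push_back(net); }

    // Average the members' answers on the same input.
    double Run(const std::vector<double>& in) const {
        if (members_.empty()) return 0.0;
        double sum = 0.0;
        for (std::size_t i = 0; i < members_.size(); ++i)
            sum += members_[i]->Run(in);
        return sum / members_.size();
    }
};
```

The committee does not own or create its members, which is exactly why a "CreateNet" in the interface buys nothing here.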
 
TheXpert:

5. PNN -- I haven't used it and don't know it. But I think there are people who can do it.

Suggest other models.

PNN is easy. For example, you can take the ready-made "nearest neighbor" (kNN) code from the Code Base. GRNN belongs to the same family.
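For reference, a PNN boils down to summing a Gaussian kernel over each class's training samples and picking the class with the largest response, which is why it sits so close to kNN and GRNN. A minimal sketch (the function and parameter names are mine, not from the Code Base):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Minimal PNN-style classifier sketch (illustrative, not the Code Base
// version): every training sample contributes a Gaussian kernel centered
// on itself; the class with the largest summed response wins.
// sigma is the kernel smoothing width, the single tunable parameter.
struct Sample { std::vector<double> x; int label; };

int ClassifyPNN(const std::vector<Sample>& train,
                const std::vector<double>& q, double sigma, int num_classes) {
    std::vector<double> score(num_classes, 0.0);
    for (std::size_t i = 0; i < train.size(); ++i) {
        double d2 = 0.0;  // squared distance from query to this sample
        for (std::size_t j = 0; j < q.size(); ++j) {
            double diff = train[i].x[j] - q[j];
            d2 += diff * diff;
        }
        score[train[i].label] += std::exp(-d2 / (2.0 * sigma * sigma));
    }
    int best = 0;
    for (int c = 1; c < num_classes; ++c)
        if (score[c] > score[best]) best = c;
    return best;
}
```

With sigma shrunk toward zero this degenerates into 1-nearest-neighbor, which is the connection to the kNN code mentioned above.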

This project is pretty huge. One can spend years writing code for all the networks and still not please everyone. Well-known neuro-specialists tell me that if a network has not been introduced in the last 10-15 years, it is already outdated. The latest trends in this field are self-learning biological networks using ICA and sparse coding. Google "sparse coding" and "compressed sensing" as well as the work of Olshausen and Field on sparse nets and their followers. It's a treasure trove. Restricted Boltzmann Machines (RBM), which are the basis of Deep Belief Nets (DBN), and convolutional networks have also gained a lot of popularity because of their versatility. Read the work of Geoffrey Hinton and Yann LeCun:

http://www.cs.toronto.edu/~hinton/

http://yann.lecun.com/

These lectures by Olshausen and Hinton (in English) are very interesting:

https://www.youtube.com/watch?v=_G1RsAZXovE

https://www.youtube.com/watch?v=AyzOUbkUf3M

If anyone decides to code Sparse Net for MQL5, I will be very interested in cooperation. Although, for those who know me, my patience is very short and I often lose interest :)

 
I would like to see documentation for all of this: a separate section in the Help, written with an emphasis on beginners. I, for example, already want to get acquainted with these neural tricks.
 

For testing, I propose a trading system on the intersection of two MAs

What is the advantage: The system is elementary, clear and as old as the world.

What is required: The system requires constant re-optimization of only 2 parameters, but once you add the number of currency pairs, the timeframes, and the search for the length of the optimization periods, the task grows exponentially. The growth accelerates further if you also vary the averaging methods and price-calculation methods.

In fact, there are only two parameters, so you will not be distracted by the trading system itself and can concentrate on the neuro-project.
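For clarity, the two-MA system can be sketched in a few lines (the function names and the crossover convention are illustrative; a real MQL5 Expert Advisor would use the platform's own indicator functions). The two periods are the only parameters a neuro-layer would have to re-optimize:

```cpp
#include <vector>

// Simple moving average of `period` prices ending at index `end`.
// Assumes end - period + 1 >= 0 and end < price.size().
double SMA(const std::vector<double>& price, int end, int period) {
    double s = 0.0;
    for (int i = end - period + 1; i <= end; ++i) s += price[i];
    return s / period;
}

// Two-MA crossover signal at bar `bar` (illustrative convention):
// +1 when the fast MA crosses above the slow MA (buy),
// -1 when it crosses below (sell), 0 otherwise.
int CrossSignal(const std::vector<double>& price, int bar, int fast, int slow) {
    if (bar < slow) return 0;  // not enough history for the slow MA
    double fPrev = SMA(price, bar - 1, fast), sPrev = SMA(price, bar - 1, slow);
    double fNow  = SMA(price, bar, fast),     sNow  = SMA(price, bar, slow);
    if (fPrev <= sPrev && fNow > sNow) return +1;  // fast crossed up: buy
    if (fPrev >= sPrev && fNow < sNow) return -1;  // fast crossed down: sell
    return 0;
}
```

The neuro-project's job would then be to predict, per symbol and timeframe, which (fast, slow) pair to use next.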


PS At the moment my Expert Advisor on two MAs is participating in the contest. I have no illusions: it was developed in a hurry, the risks and parameters were set at random and will no longer be relevant in three months. It is enough that my dream of participating in the contest has come true :-), although I would like to be on the first page of the rating, but that's a lyrical digression...

 
IvanIvanov:

For testing, I propose a trading system on the intersection of two MAs

What is the advantage: The system is elementary, understandable and as old as the world.

(Of course everything is clear here, except one thing: what does it have to do with neural networks?)

gpwr:

You can spend years writing code for all networks and still not please everyone. Well-known neuro-specialists tell me that if a network has not been introduced in the last 10-15 years, it is already outdated. The latest trends in this field are self-learning biological networks using ICA and sparse coding. Google "sparse coding" and "compressed sensing" as well as the work of Olshausen and Field on sparse nets and their followers. It's a treasure trove. Restricted Boltzmann Machines (RBM), which are the basis of Deep Belief Nets (DBN), and convolutional networks have also gained a lot of popularity because of their versatility. Read the works of Geoffrey Hinton and Yann LeCun:

If you start there, you can bury the whole project; you could get stuck in it for years. Thanks for the references, by the way. I have started reading them, and it's a dark forest. :) Still, it is better to start with the simple (the classic networks already listed will be enough, in my opinion) and move gradually to the complex and the new, supplementing and improving along the way. The faster the project produces some tangible "output", the better its chances of reaching a logical conclusion.