"New Neural" is an Open Source neural network engine project for the MetaTrader 5 platform. - page 90

 
sergeev:
Then remove the broken link from Wikipedia.
Already replaced it.
 
The NeuroPlus trading utility
  • 2015.10.19
  • Roma Ivanov
  • www.mql5.com
The utility lets you train a neural network (a multilayer perceptron) without resorting to a programming language. Any indicators can be fed into the network. The input data is normalized on the fly, in a way chosen by the user. The network is likewise trained on...
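For readers unfamiliar with the term, "normalizing the input data on the fly" usually just means rescaling each indicator's values into a common range before they reach the perceptron's input layer. A minimal Python sketch of the idea (the indicator values and the min-max scheme here are purely illustrative; the utility itself is closed, so this is not its actual code):

```python
import numpy as np

# Hypothetical raw indicator readings per bar: e.g. RSI, momentum, MA distance.
raw = np.array([
    [71.3, -0.0042, 12.5],
    [28.9,  0.0017, -3.1],
    [55.0,  0.0003,  4.8],
])

# Min-max normalization per column: each input is rescaled to [0, 1]
# before being fed to the network's input layer.
lo, hi = raw.min(axis=0), raw.max(axis=0)
normalized = (raw - lo) / (hi - lo)
print(normalized)
```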
 
Andrey Dik:

Open Source neural network engine project for MetaTrader 5 platform

So what's the result? Is there a neural network engine for MT5 anywhere? Where is it?
 
kapelmann:
So what's the result? Is there a neural network engine for MT5 anywhere? Where is it?
There is. In Python. It's lying around here somewhere. :)
 
kapelmann:
So what's the result? Is there a neural network engine for MT5 anywhere? Where is it?

The project has, unfortunately, stalled.

But Alglib is now part of the terminal; have a look at it.

 
Andrey Dik:

The project has, unfortunately, stalled.

But Alglib is now part of the terminal; have a look at it.

There is a limit on the number of layers in the MLP, the same as in the original Alglib.
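For context, the limit Dmitriy mentions comes from how the original Alglib builds perceptrons: as far as I recall, it only offers the mlpcreate0, mlpcreate1 and mlpcreate2 constructors, i.e. at most two hidden layers, and the port shipped with the terminal mirrors that. A plain numpy sketch of that maximal shape, just to make the limit concrete (illustrative sizes and random weights, not the Alglib API):

```python
import numpy as np

def mlp2_forward(x, W1, b1, W2, b2, W3, b3):
    """Forward pass of the largest topology the Alglib constructors allow:
    input -> hidden layer 1 -> hidden layer 2 -> linear output."""
    h1 = np.tanh(x @ W1 + b1)
    h2 = np.tanh(h1 @ W2 + b2)
    return h2 @ W3 + b3

rng = np.random.default_rng(1)
n_in, n_h1, n_h2, n_out = 5, 10, 10, 1          # illustrative layer sizes
params = (rng.normal(size=(n_in, n_h1)), np.zeros(n_h1),
          rng.normal(size=(n_h1, n_h2)), np.zeros(n_h2),
          rng.normal(size=(n_h2, n_out)), np.zeros(n_out))
print(mlp2_forward(rng.normal(size=(3, n_in)), *params).shape)  # (3, 1)
```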
 
Andrey Dik:

The project has, unfortunately, stalled.

But Alglib is now part of the terminal; have a look at it.

Dmitriy Skub:
There is a limit on the number of layers in the MLP, the same as in the original Alglib.

Hmm... it's all rather strange. You read posts like this:

TheXpert:

Ahem (modestly), as for the response -- there are already three libraries for neural networks.

One of them has more than 10 networks in it. I have worked with Kohonen networks, MLP, recirculation networks, Hopfield networks, ...;

the second is an implementation of the general case of an MLP plus a Jordan-Elman network -- i.e. any topology (a directed graph) with the possibility of feeding any layers back;

the third is an implementation of an echo network, my favourite :).

It was all quite a while ago, really (apart from the echo network), but it can be recalled. I haven't worked with probabilistic models, and I'm not familiar with the recent improvements to gradient descent or with hybrid methods.


To me, four of the implemented networks are interesting:

1. Kohonen networks, including SOM. They are good for cluster partitioning when it is not clear what you are looking for. The topology, I think, is well known: a vector as input, and a vector (or otherwise grouped outputs) as output. Learning can be either supervised or unsupervised.

2. MLP, in its most general form, i.e. with an arbitrary set of layers organized as a graph with feedback connections. Very widely used.

3. Recirculation network. Frankly, I have never seen a properly working non-linear implementation. It is used for information compression and for extracting principal components (PCA). In its simplest linear form it is represented as a linear two-layer network through which the signal can be propagated from either side (or a three-layer network in its unfolded form).

4. Echo network. In principle it is similar to an MLP and is applied in the same areas, but it is organized completely differently and has a well-defined training time (and, unlike an MLP, it always finds a global minimum); a minimal sketch of why that is follows this list.

5. PNN -- I haven't used it and don't know how to. But I think there are people around who can.

6. Models for fuzzy logic (not to be confused with probabilistic networks). Not implemented, but they may be useful. If anyone finds information, please post it. Almost all the models are of Japanese authorship. Almost all of them are built by hand, but if it were possible to automate the construction of the topology from a logical expression (if I remember it all correctly), that would be unbelievably cool.
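On point 4: the fixed training time and the guaranteed global minimum follow from the fact that in an echo network only the linear readout is trained, by ordinary (ridge) least squares, while the recurrent reservoir stays random and untouched. A minimal numpy sketch under those assumptions (toy signal, reservoir size and regularization chosen arbitrarily for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: predict the next value of a sine signal.
u = np.sin(np.arange(500) * 0.1).reshape(-1, 1)
y = np.roll(u, -1, axis=0)

n_res = 100                                    # reservoir size (illustrative)
W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))      # keep spectral radius below 1

# Run the fixed, untrained reservoir over the input once.
states = np.zeros((len(u), n_res))
x = np.zeros(n_res)
for t in range(len(u)):
    x = np.tanh(W_in @ u[t] + W @ x)
    states[t] = x

# Only the readout is trained: a ridge-regression problem with a closed-form
# solution, hence the fixed training time and the global minimum.
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                        states.T @ y)
pred = states @ W_out
print("train MSE:", float(np.mean((pred - y) ** 2)))
```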

And you think, wow, what geniuses have gathered here, but the end result is "the project has stalled", this is all that is actually ready, and it seems most of them are just petty liars.

 
kapelmann:

Hmm... it's all rather strange. You read posts like this:

And you think, wow, what geniuses have gathered here, but the end result is "the project has stalled", this is all that is actually ready, and it seems most of them are just petty liars.

That stuff is a dime a dozen in their Market.

 
kapelmann:

Hmm... it's all rather strange. You read posts like this:

And you think, wow, what geniuses have gathered here, but the end result is "the project has stalled", this is all that is actually ready, and it seems most of them are just petty liars.

It's an open-source project; no one owes anyone anything... it ran purely on enthusiasm.
It's not easy to gather geniuses into one group; moreover, some people simply get bored, and others think they are working harder than the rest...
It was interesting, it was fun -- what more do you need? Those who were interested learned something new for themselves, leafed through wikis and books, and that was a workout for the brain.
And those who just sat and waited for ready-made solutions -- no need to feel sorry for them.
Thanks to all the active participants in the project, and don't think ill of the rest.

 
Evgeny Belyaev:

That stuff is a dime a dozen in their Market.

Open source? I haven't seen anything of the kind yet...

And the black boxes from that bazaar -- no, sorry, those are for people who are completely out of their minds. I wouldn't be surprised if 90% of them contain no neural networks at all, just ordinary indicators with a GUI visualizing the "learning process". They are hucksters; everything is marketing to them, nothing is sacred, and hell awaits them after death.

Andrey Dik:
It's an open-source project; no one owes anyone anything... it ran purely on enthusiasm.
It's not easy to gather geniuses into one group; moreover, some people simply get bored, and others think they are working harder than the rest...
It was interesting, it was fun -- what more do you need? Those who were interested learned something new for themselves, leafed through wikis and books, and that was a workout for the brain.
And those who just sat and waited for ready-made solutions -- no need to feel sorry for them.
Thanks to all the active participants in the project, and don't think ill of the rest.

I agree, the main thing is the process, especially when you already have at least a couple of million dollars sitting offshore.