"New Neural" is an Open Source neural network engine project for the MetaTrader 5 platform. - page 31

 
TheXpert:
Already solved and applied. And even posted in this thread.

I was writing about the materials posted by tol64; I didn't see another proposal like that (either through my inattention or because there wasn't one).

If I duplicated something, I don't mind - let NeuroOpenSource delete it.

TheXpert:

you can't use DLLs

I'm just saying, for now, just do it by hand, and then MQ will think of something and make special methods for special cases :o)
 
Urain:

If I duplicated something, I don't mind - let NeuroOpenSource delete it.

He's not an admin :)

I'm just saying: do it by hand for now, and then maybe MQ will come around and add special methods for the special cases :o)

You shouldn't try to embrace the unembraceable - it is better to move forward steadily. For now, the following fundamental entities are taking shape:

__________________________

Network (consisting of layers, synapses and buffers).

A trainer (an external, universal learning algorithm), which needs the network to be able to enumerate and apply all of its tunable parameters - for example, a genetic trainer. By default, training is built into the layers and into the net itself.

Initializer. This is probably the simplest entity :) Initializes the network's adjustable parameters.

Pattern Manager. An entity that allows you to create (generate) patterns and to load and save them in a form compatible with the net.

Visual manager. An entity which allows you to visually design a network.

Data processor. Entity for pattern normalization and analysis.

_________________________

Have I forgotten anything?

All entities are related in some way (i.e. some entities are supposed to support other entities through interfaces) but they are essentially independent.

_________________________

Right now, the Pattern Manager and the Data Processor can be developed in parallel without any problems, with only minimal agreements on how they interface (a rough sketch of such interfaces follows below).
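As a rough illustration of "related through interfaces, but essentially independent": below is one way (MQL5-style) the network could expose its tunable parameters so that an external trainer - a genetic one, say - can work with any network without knowing its internals. All names here (INetwork, ITrainer, ParamsTotal, GetParam, SetParam, Calculate, Train) are assumptions made for the sketch, not an agreed design.

// What the network exposes to the outside world (assumption-level sketch).
class INetwork
{
public:
   virtual int    ParamsTotal(void)=0;                             // number of tunable parameters
   virtual double GetParam(const int index)=0;                     // read one parameter
   virtual void   SetParam(const int index,const double value)=0;  // apply one parameter
   virtual void   Calculate(const double &in[],double &out[])=0;   // forward pass on one pattern
};

// An external trainer only needs this interface, so it stays independent
// of any particular network implementation (layers, synapses, buffers).
class ITrainer
{
public:
   virtual void   Train(INetwork &net)=0;   // e.g. a genetic algorithm tuning the net's parameters
};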

 
TheXpert:

Have I forgotten anything?

All entities are connected in some way (i.e., some entities are supposed to support other entities through interfaces), but they are essentially independent.

_________________________

Right now, the Pattern Manager and the Data Processor can be developed in parallel without any problems, with only minimal agreements on how they interface.

No, it seems you haven't forgotten anything - it's all there.

One addition: it would be desirable for the Pattern Manager to have some compatibility with the standard Expert Advisor generator wizard (we may have to tweak the wizard itself), i.e. the ability to generate patterns for the neural engine with the standard wizard.

 

I suggest that we consider the following architectural points:

1) prepare all data external to the neural network in the form of indicators, which will allow us to:

- be independent of the entire system

- Visually assess the "correctness" of the idea

- choose the normalization method

2) actively look towards OpenCL (CUDA, unfortunately, is not available to the lucky owners of AMD GPUs)

- an HD6970 has 1536 stream processors, which is a different league from 6 CPU cores

- in most cases, training a neural network is a SIMD task, which maps perfectly onto a GPU (see the kernel sketch at the end of this post)

- The architecture of the whole complex should be designed with these requirements in mind from the very beginning

3) all file exchange between subsystems (configurations, networks, queries, ...) should be kept in XML:

- open standard

- countless visual editors

- a ready-made parser: https://www.mql5.com/ru/code/97

XmlParser (yu-sha, 2010.04.12, www.mql5.com): a simple XML parser that uses the standard msxml library.
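To illustrate point 2 (the kernel sketch mentioned above): a layer's forward pass is a natural data-parallel job - one work item per neuron. Below is a rough sketch, with the kernel source kept in an MQL5 string so it could later be handed to an OpenCL host API; the function name ForwardKernel, the argument names and the row-major weight layout are all assumptions made for the illustration.

// Returns the OpenCL C source of a toy forward-pass kernel:
// each work item computes one neuron's weighted sum independently.
string ForwardKernel()
{
   return("__kernel void forward(__global const float *in,"     // layer inputs
         +"                      __global const float *w,"      // weights, row-major [neurons x n_inputs]
         +"                      __global       float *out,"    // layer outputs
         +"                      const int n_inputs)"
         +"{"
         +"   int n=get_global_id(0);"                          // this work item's neuron index
         +"   float s=0.0f;"
         +"   for(int i=0;i<n_inputs;i++)"
         +"      s+=w[n*n_inputs+i]*in[i];"                     // weighted sum of the inputs
         +"   out[n]=tanh(s);"                                  // activation
         +"}");
}

Thousands of such work items running at once is exactly the kind of SIMD load that the 1536 stream processors of an HD6970 are built for.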
 
TheXpert:

Network (consisting of layers, synapses, and buffers).

A trainer (an external, universal learning algorithm) that needs the network to be able to enumerate and apply all of its tunable parameters - for example, a genetic trainer. By default, training is built into the layers and into the net itself.

Initializer. This is probably the simplest entity :) It initializes the network's tunable parameters.

Pattern manager. An entity that allows you to create (generate) patterns and to load and save them in a form compatible with the net.

Visual manager. An entity that allows you to visually design a network.

Data processor. An entity for pattern normalization and analysis.

As I understand it, the pattern manager is a set of ready-made templates of various networks in the Initializer's format?

The Visual Manager is also dependent on the Initializer, since the VM saves what is created in it in the Initializer's format.

The Initializer is dependent on the Network.

The trainer must be built into the network itself, unless of course it is external like a GA; so the internal trainer is dependent on the Network.

The Data processor is independent even of itself :o) - the pre-processor does not depend on the post-processor (the main thing is not to lose synchronization between them).

In general, we have only two independent elements so far: the Network and the Data Processor.

 
Urain:

The pattern manager, as I understand it, is a set of ready-made templates of various networks in the Initializer's format?

No, the pattern manager is the thing that can read and write patterns to a file and work with a time filter (see the sketch after this post).

The Visual Manager is also dependent on the Initializer.

The initializer just initializes whatever data is handed to it - what dependencies?

The initializer is dependent on the Network.

See above.

The trainer should be built into the network itself, unless of course it is external like a GA.

It is external. I put it in brackets deliberately.
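The sketch referred to above - a minimal, hypothetical outline of such a pattern manager (MQL5-style). The SPattern structure, its fields, and the Load/Save/Select/Get method names are assumptions made for illustration, not part of the agreed design; the point is only that file I/O and the time filter need no knowledge of the network's internals.

// One training pattern: normalized inputs, desired outputs and the bar time it came from.
struct SPattern
{
   datetime time;         // used by the time filter
   double   inputs[];     // network inputs
   double   targets[];    // desired outputs
};

// Pattern manager: creation, file I/O and time filtering only.
class IPatternManager
{
public:
   virtual bool Load(const string file_name)=0;            // read patterns from a file
   virtual bool Save(const string file_name)=0;            // write patterns to a file
   virtual int  Select(datetime from,datetime to)=0;       // time filter: how many patterns fall in the range
   virtual bool Get(const int index,SPattern &pattern)=0;  // fetch one of the selected patterns
};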

 

Two interesting SVM algorithms: SVM with dynamic time warping as the kernel function: http://notendur.hi.is/steinng/ijcnn08.pdf and incremental SVM learning: http://www.isn.ucsd.edu/svm/incremental/

 
TheXpert:

The initializer will translate tabular data about the network structure into an initialization format, i.e. into the form "now this function needs to be called" or "this loop will have so many iterations", so it depends on the capabilities of the network. If the network cannot be created in a given way, that refusal imposes restrictions on the initializer.

The manager saves/loads - it saves what the visualizer creates and loads what has been saved - but if the initializer is dependent, then the dependency is passed on to both the manager and the visualizer through it. For example, the user introduces a requirement that the in*wg products be summed in pairs and then multiplied, but the network's functionality does not support that; then restrictions have to be introduced in the visual designer, and that is a dependency.

(in0*wg0 + in1*wg1) * (in2*wg2 + in3*wg3)
 
Urain:

The initializer will translate tabular data about the network structure into the initialization format.

Damn. Where did these fantasies come from? I have a feeling that you don't know what you're talking about at all.

// The initializer only receives values to fill in; it knows nothing about the network's structure.
class IInitializer
{
public:
   virtual void Init(double& value)  {ASSERT(false);}   // a single parameter
   virtual void Init(array& values)  {ASSERT(false);}   // a vector of parameters
   virtual void Init(matrix& values) {ASSERT(false);}   // a matrix of parameters (e.g. a layer's weights)
};

Where are the dependencies?
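For illustration only, here is what one concrete implementation of that interface could look like - a uniform random initializer. The class name, the [-1;1] range, and the assumption that the placeholder 'array' type above exposes Size() and a Set(index,value) method are mine, not part of the proposed design; the point is simply that such a class fills whatever values the net hands it and never touches the network's structure.

// Hypothetical sketch: uniform random initialization in [-1;1].
// MathRand() returns an integer in [0;32767].
class CRandomInitializer : public IInitializer
{
public:
   virtual void Init(double& value) { value=2.0*MathRand()/32767.0-1.0; }
   virtual void Init(array& values)
   {
      for(int i=0;i<values.Size();i++)
      {
         double v;
         Init(v);             // reuse the scalar rule
         values.Set(i,v);     // assumed setter on the placeholder 'array' type
      }
   }
};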

 
TheXpert:

Damn. Where did these fantasies come from? I have a feeling that you don't know what you're talking about at all.

Where are the dependencies?

You misunderstood me slightly: the dependency is not direct but inverse, passing along the chain. If the network cannot do something (create some configuration), then that has to be taken into account when writing the upper blocks - that is the dependency. Until there is an approved network configuration, it is too early to talk about the upper blocks, because they are floating in a field of uncertainty.

PS: You can also go another way - write all the blocks as is, patch them as inconsistencies are found, then patch again, and when the volume of patches reaches a critical mass, analyze everything and rewrite it from scratch. A bit painstaking, but it lets you start working immediately and gradually uncover all the inconsistencies.