"New Neural" is an Open Source neural network engine project for the MetaTrader 5 platform. - page 54

 
Urain:
Come on, which is closer to you?
 
TheXpert:
Echo is closer to me, SOM is simpler. SOM is probably better, since it can be trained either with or without a teacher.

SOM it is, then.

1. initialize the grid
2. run the grid
3. train the grid

(a rough sketch of these three steps follows below)

Who does what?

Or should we first work out the class hierarchy?
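For reference, here is a minimal sketch of what those three steps could look like for a toy one-dimensional map, written in the same MQL5-like style as the interface draft later in the thread. Everything here (CSomSketch and its methods) is illustrative only, not part of the engine, and neighborhood handling is omitted:

class CSomSketch
{
private:
   int    m_nodes;      // number of map nodes
   int    m_dims;       // input dimension
   double m_weights[];  // flat weight array: index = node*m_dims + d
public:
   // 1. initialize the grid with small random weights
   void Init(int nodes, int dims)
   {
      m_nodes = nodes;
      m_dims  = dims;
      ArrayResize(m_weights, nodes*dims);
      for(int i = 0; i < nodes*dims; i++)
         m_weights[i] = (MathRand()/32767.0 - 0.5)*0.1;
   }
   // 2. run the grid: return the index of the best matching unit
   int Propagate(const double& input[])
   {
      int    best     = 0;
      double bestDist = DBL_MAX;
      for(int n = 0; n < m_nodes; n++)
      {
         double dist = 0.0;
         for(int d = 0; d < m_dims; d++)
         {
            double diff = input[d] - m_weights[n*m_dims + d];
            dist += diff*diff;
         }
         if(dist < bestDist) { bestDist = dist; best = n; }
      }
      return best;
   }
   // 3. train the grid: pull the winner towards the presented input
   void Learn(const double& input[], double rate)
   {
      int winner = Propagate(input);
      for(int d = 0; d < m_dims; d++)
         m_weights[winner*m_dims + d] += rate*(input[d] - m_weights[winner*m_dims + d]);
   }
};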

 
Urain:

I was thinking about how to wire this in so that new neurons would be created in the right places during training, but after studying the algorithms I came to the conclusion that they have no common formalization and almost no points of intersection. That is why I dropped the idea and concentrated on the vertical construction of the network. Something like:

data --> neuron --> encapsulated neural network --> container neural network
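
As a purely illustrative sketch of that chain (class names are placeholders, not engine API): an encapsulated network exposes the same processing interface as a single neuron, so a container network can hold either.

class CProcessor                          // anything that maps inputs to outputs
{
public:
   virtual void Process(const double& inData[], double& outData[]) { }
};

class CNeuron : public CProcessor         // single neuron
{
private:
   double m_weights[];
public:
   void SetWeights(const double& w[]) { ArrayCopy(m_weights, w); }
   virtual void Process(const double& inData[], double& outData[])
   {
      double sum = 0.0;
      int    n   = MathMin(ArraySize(inData), ArraySize(m_weights));
      for(int i = 0; i < n; i++)
         sum += inData[i]*m_weights[i];
      ArrayResize(outData, 1);
      outData[0] = 1.0/(1.0 + MathExp(-sum));   // sigmoid activation
   }
};

class CContainerNet : public CProcessor   // network holding neurons or whole sub-networks
{
private:
   CProcessor* m_units[];
public:
   void Add(CProcessor* unit)
   {
      int n = ArraySize(m_units);
      ArrayResize(m_units, n + 1);
      m_units[n] = unit;
   }
   virtual void Process(const double& inData[], double& outData[])
   {
      // pass the signal through the contained units in sequence;
      // a real container would define richer routing between its units
      double buf[];
      ArrayCopy(buf, inData);
      for(int i = 0; i < ArraySize(m_units); i++)
      {
         m_units[i].Process(buf, outData);
         ArrayCopy(buf, outData);
      }
   }
};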

The GA could be made amorphous; after all, the human genome didn't always consist of 28,000 genes.

Ouch...

 
joo:
Oh, you're online. Answer about the topology.
 

First estimates

class IEvolvable // interface for plugging in evolutionary algorithms
{
public:
   virtual void GetWeightsAsVector(double& weights[]) const; // get all adjustable weights gathered into a single vector for the genetic algorithm
   virtual void ApplyWeightsVector(double weights[]); // apply the weights selected by the genetic algorithm to the network
   
   virtual void FeedInput(double inData[]); // feed the input
   virtual void PropagateSignal(); // propagate the input signal
   virtual void GetOutput(double& outData[]) const; // take the output
};

class ISerializable // saving and loading
{
public:
   virtual bool LoadFromFile(string filePath);
   virtual bool SaveToFile(string filePath) const;
};

class IBasicNet
   : public IEvolvable
   , public ISerializable
{
public:
   virtual void FeedInput(double inData[]); // the input can be taken from the collection or passed separately. The collection is assumed to handle input preprocessing, so separate inputs have to be transformed by the collection
   virtual void FeedInput(int index);
   virtual void PropagateSignal();
   virtual void GetOutput(double& outData[]) const;
   virtual void Init(); // initialization. The initializer can be passed in the constructor or to this function; we'll see which is more convenient
};

class ISupervised // supervised network
   : public IBasicNet
{
public:
   virtual void SetPatternCollection(PatternCollection* collection); // in supervised networks every input pattern has a corresponding output pattern, so it is better to organize them in pairs from the start
   virtual void CountError(); // error calculation. For an MLP, for example, this is where backpropagation goes
   virtual void Learn(); // weight update. No iterations inside, so the state can be inspected at every step
};

class IUnsupervised // unsupervised network
   : public IBasicNet
{
public:
   virtual void SetInputCollection(InputCollection* collection); // unsupervised networks have only inputs
   virtual void Learn();
};

class IInitializer // initializer
{
public:
   virtual void Init(double& value);
   virtual void Init(double& value[]);
   virtual void Init(Matrix& value);
};
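
A quick illustration of how these interfaces might be driven together, assuming some concrete class implements ISupervised; the function name and the count parameter are placeholders, not part of the draft:

// Hypothetical driver for one training epoch over a supervised network.
void TrainOneEpoch(ISupervised* net, PatternCollection* patterns, int count)
{
   net.SetPatternCollection(patterns);      // pair up inputs with target outputs
   for(int i = 0; i < count; i++)
   {
      net.FeedInput(i);        // take the i-th pattern from the collection
      net.PropagateSignal();   // forward pass
      net.CountError();        // e.g. backpropagation of the error for an MLP
      net.Learn();             // one weight update; state can be inspected here
   }
}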

 
TheXpert:
Oh, you're online. Answer about topology.
That's an "oops". As I see it, topology management can and should be viewed as genetic programming. I'm not very strong here - it's a separate broad field of knowledge. But, if necessary, I will take up this question closely. This is the kind of GA in which the chromosome length can change dynamically. In my GA, the chromosome length is fixed. Although, it is possible to get around this by introducing additional flags to freeze individual genes, and you can take the chromosome length with a reserve.
 
TheXpert:

First estimates

Commentary.
 
joo:
This is an "oops".
Okay, there is a way around the problem for now; we'll think it over properly when we get around to wiring in the genetics.
 
Urain:
This is the next step and is not directly related to the engine; it would be implemented through an external GA that creates different topologies, and those topologies initialize objects of different engines.

And yes, you should probably use several GAs: one for the topology and one for adjusting all the weights of all the networks. The second GA would then simply freeze some of the genes depending on the current topology.

 
Urain:
Throw in some comments.
Threw some in.