Machine learning in trading: theory, models, practice and algo-trading - page 591

 
Yuriy Asaulenko:
Actually, everything should be written in C++/C#, and then there are no problems interacting with anything at all. The problem is that the main data-mining (DM) libraries are in Python and R, and you need to master at least one of them. Interaction is not an issue - there are APIs everywhere (except MQL). At worst you can exchange files via a RAM disk.

Well, yes, that's right, there's no problem with that.

The problem is how deep you need to go into ML, and at what stage to decide that the models you already have are enough...

I've settled on the classics for now, and they are enough for me... I'm not chasing a 1-10% gain in accuracy :) Now I'm focusing on strategies; I have a lot of ideas written down, and I need to test them all, which is hard.

I'm studying PNNs in detail - they work with probabilities and hardly overfit.

 
Evgeny Belyaev:

Yura, you're really something! )) And you say that on an MQL forum! Now your comrades will come running and throw stones at you.

I also use MQL only to open/close orders. All the calculations are done in VisSim. However, this is the only forum with more or less professional physicists and mathematicians; that's why I came here. The rest of the sites are just village idiots.
 
Alexander_K2:
I will defend Yuri - I also use MQL only for opening/closing orders. All the calculations are done in VisSim. However, this is the only forum with more or less professional physicists and mathematicians; that's why I came here. The rest of the sites are just village idiots.

Alexander, you may be interested in this topic :)

The method of approximating probability density functions with kernel functions is closely related to the method of radial basis functions, and thus we naturally arrive at the notions of the probabilistic neural network (PNN) and the generalized regression neural network (GRNN) (Specht 1990, 1991). PNNs are designed for classification tasks, while GRNNs are designed for regression tasks. Networks of these two types are an implementation of kernel approximation methods, framed as a neural network.
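To make the kernel idea concrete, here is a minimal PNN-style classifier sketch in Python (NumPy only), following the scheme attributed to Specht: every training sample contributes a Gaussian kernel, the per-class kernel responses are averaged into class density estimates, and the predicted class is the one with the highest density. The function name, the toy data, and the single smoothing parameter `sigma` are illustrative assumptions, not code from the thread.

```python
# Minimal PNN-style classifier: kernel density estimation per class,
# prediction by the class with the largest estimated density.
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=0.5):
    """Classify each row of X_test via Gaussian-kernel class densities."""
    classes = np.unique(y_train)
    preds = []
    for x in X_test:
        # Squared Euclidean distance from x to every training sample
        d2 = np.sum((X_train - x) ** 2, axis=1)
        kern = np.exp(-d2 / (2.0 * sigma ** 2))
        # Mean kernel response within each class = density estimate for that class
        scores = [kern[y_train == c].mean() for c in classes]
        preds.append(classes[int(np.argmax(scores))])
    return np.array(preds)

# Toy usage: two well-separated 2-D clusters, labels 0 and 1
X = np.array([[0.0, 0.0], [0.1, 0.2], [3.0, 3.0], [3.1, 2.9]])
y = np.array([0, 0, 1, 1])
print(pnn_predict(X, y, np.array([[0.05, 0.1], [3.05, 3.0]])))  # → [0 1]
```

Note that `sigma` is essentially the only setting, which matches the later remark in the thread that these networks need minimal tuning.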

 
Maxim Dmitrievsky:

Alexander, you may be interested in this topic :)

The method of approximating probability density functions with kernel functions is closely related to the method of radial basis functions, and thus we naturally arrive at the notions of the probabilistic neural network (PNN) and the generalized regression neural network (GRNN) (Specht 1990, 1991). PNNs are designed for classification tasks, while GRNNs are designed for regression tasks. Networks of these two types are an implementation of kernel approximation methods, framed as a neural network.

Aha. Thank you, Maxim!
 
Alexander_K2:
Yep. Thank you, Maxim!
Maxim is actually very good. Sometimes I am amazed at his outlook. But we should remember: people who read a lot lose the habit of thinking for themselves. (c) And it's not me who said that. Guess who?)
 
Yuriy Asaulenko:
But we should remember: people who read a lot lose the habit of thinking for themselves. (c) And it's not me who said that. Guess who?)

There is also this opinion:

People stop thinking when they stop reading. (Diderot)

 
Yuriy Asaulenko:
Maxim is actually very good. Sometimes I am amazed at his outlook. But it is necessary to remember: people who read a lot lose the habit of thinking independently. Guess who?))
I agree. But the link is interesting - I'll read it when I have time. I'm busy now - I see the Holy Grail on the horizon and, pushed by the powerful hands of my father-in-law, I move towards it.
 
Evgeny Belyaev:

There is also this opinion:

People stop thinking when they stop reading. (Diderot)

One does not exclude the other.) No one claimed that it is not necessary to read at all.)
 
Alexander_K2:
I agree. But the link is interesting - I will read it when I have time. I'm busy now - I see the Grail on the horizon and, pushed by my father-in-law's powerful hands, I move towards it.

Seek and ye shall find. (c)

But not this time.

 
Yuriy Asaulenko:
Maxim is actually very good. Sometimes I am amazed by his outlook. But you should remember: people who read a lot lose the habit of thinking for themselves. Guess who?)

Yes, right - I just went through all sorts of articles to see what is interesting in this topic :) The main advantages over an MLP, as I understand it, are speed and minimal settings (here there are none at all), and the fact that these networks hardly overfit.

Also, a Gaussian function is used, not Student's t. A peaked density function is created for each input, and then the results at the output are linearly summed.
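The mechanism described above - one Gaussian bump per training input, combined by a linear sum at the output - can be sketched for the regression (GRNN) case as a normalized weighted sum of training targets. This is a hedged illustration, not the code of any library mentioned in the thread; the function name and the toy sine data are invented for the example, and again `sigma` is the only tunable setting.

```python
# GRNN-style regression sketch: a Gaussian kernel is centered on each
# training input, and the output is the kernel-weighted (normalized)
# linear sum of the training targets.
import numpy as np

def grnn_predict(X_train, y_train, X_test, sigma=0.2):
    """Kernel-weighted average of targets for each row of X_test."""
    preds = []
    for x in X_test:
        d2 = np.sum((X_train - x) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * sigma ** 2))
        preds.append(np.dot(w, y_train) / w.sum())  # normalized linear sum
    return np.array(preds)

# Usage: recover a noiseless sine from 20 samples on [0, pi]
X = np.linspace(0, np.pi, 20).reshape(-1, 1)
y = np.sin(X).ravel()
print(grnn_predict(X, y, np.array([[np.pi / 2]])))  # close to sin(pi/2) = 1
```

There is no iterative training step at all: "fitting" is just storing the samples, which is one way to read the claim that these networks need almost no settings and hardly overfit (at the cost of slower prediction on large training sets).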

By the way, PNN and GRNN are available in MQL form, but I haven't tried them yet and haven't compared them with an MLP.

https://www.mql5.com/ru/code/1323

PNN neural network class
  • votes: 41
  • 2012.11.30
  • Yury Kulikov
  • www.mql5.com
The CNetPNN class implements a probabilistic neural network (PNN). The network is created by the parametric constructor of the class. Class numbering (the classification targets) starts from zero and must be contiguous: for example, if 3 classes are specified, the class numbers must be 0, 1, 2. The network is trained by calling...