Hybrid neural networks.

 
IlyaA >> :

Where did I write that I have them all cross-connected...?


IlyaA wrote >>

Oh yes, in the early stages the network is fully connected, or rather like a convolutional network, but with many layers. And all this happiness is multiplied by 10 and starts mating. So we have 10x.


IlyaA wrote >>

Didn't you read about XOR?

Reveal the structure of the net (the one that comes in 200 instances).

And about the number of weights: while I was writing my post, you answered. I didn't correct my own post.

So it turns out the number of weights = 50*60+60 + 60*39+39 + 39*2+2 = 5519. Is that right?

And what do the 200 instances have to do with it? You didn't write about that anywhere.

 

To IlyaA and gumgum

Why are you using 2 hidden layers? One hidden layer is enough for any problem. It is proved mathematically.
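
Since XOR came up above: it is the textbook case that a single-layer perceptron cannot separate, yet one hidden layer suffices. A minimal sketch (the threshold weights below are picked by hand for illustration, not trained):

```python
def step(x):
    return 1 if x > 0 else 0

def xor_net(x1, x2):
    # one hidden layer of two threshold neurons
    h1 = step(x1 + x2 - 0.5)    # fires if at least one input is 1
    h2 = step(x1 + x2 - 1.5)    # fires only if both inputs are 1
    return step(h1 - h2 - 0.5)  # output: h1 AND NOT h2

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_net(a, b))  # 0 0->0, 0 1->1, 1 0->1, 1 1->0
```

The general claim is the universal approximation theorem: a single hidden layer with enough neurons can approximate any continuous function to arbitrary accuracy.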

 
joo >> :



And about the number of weights: while I was writing my post, you answered. I didn't correct my own post.

So it turns out the number of weights = 50*60+60 + 60*39+39 + 39*2+2 = 5519. Is that right?

And what do the 200 instances have to do with it? You didn't write about that anywhere.


Yes, there are that many weights.

Please disclose the structure of your perceptron, the one with 200 specimens in the population (to estimate its complexity).
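
As a quick check of such counts (assuming a plain fully connected net with one bias per neuron in every non-input layer), a short Python snippet:

```python
def param_count(layers):
    # weights between consecutive layers, plus one bias per neuron
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layers, layers[1:]))

print(param_count([50, 60, 39, 2]))   # 5519
print(param_count([400, 600, 200]))   # 360800 (the net that comes up below)
```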

 
joo >> :

To IlyaA and gumgum

Why are you using 2 hidden layers? One hidden layer is enough for any problem. It is proved mathematically.



What do you know about convolutional nets? They have at least four layers. Four layers.
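
For reference (this is the published LeNet-5 design, not anything from this thread): a classic convolutional net alternates convolution and subsampling layers before the fully connected ones, which is where the "at least four" comes from. A toy calculation of the feature-map sizes:

```python
def conv(size, kernel):   # 'valid' convolution, stride 1
    return size - kernel + 1

def pool(size, window):   # non-overlapping subsampling
    return size // window

s = 32                    # LeNet-5 takes a 32x32 input image
s = conv(s, 5); print("conv1      ->", s)  # 28  (28x28x6 feature maps)
s = pool(s, 2); print("subsample1 ->", s)  # 14
s = conv(s, 5); print("conv2      ->", s)  # 10  (10x10x16 feature maps)
s = pool(s, 2); print("subsample2 ->", s)  # 5
# ...followed by fully connected layers (120 -> 84 -> outputs)
```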
 
IlyaA >> :

Reveal the structure of the net (the one with 200 specimens in the population).

Do you recommend increasing the population? If you don't mind, set up a little experiment: how long does it take to train on a simple task (time, number of epochs) with 200 individuals versus 25? Let's leave everything else unchanged. So far I haven't experimented with this at all.

Ah, so that was a question about my 200 specimens? There was just no question mark there, so I didn't get it.

My net: 400-600-200. That makes 360,800 weights in total.

Yes, I recommend increasing the population.

About the experiment: I have already experimented quite a lot with the number of individuals in the population, and I don't want to spend time on more experiments. The answer is not clear-cut; much depends on the GA itself and on which stopping criterion is used. What is quite obvious is that almost all of the time goes into the fitness function itself, while the running time of the pure GA is negligible. It is therefore reasonable to try to reduce the number of ff runs. You can achieve this in different ways, and the simplest is to tune the number of individuals in the population.

If you take very many individuals, around 1000, the best individual is found very quickly in terms of the number of epochs, but the fitness function is run 1000*n times, where n is the number of epochs. That is not good: it takes a very long time.

If you take too few individuals in the population, say 10-25, the gene pool is too small for the search, and the search time grows, again because the number of ff runs increases.

The optimum, I think, is 200 individuals in the population.
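
The trade-off is easy to see in numbers: the total cost is roughly population size times epochs, and both extremes lose. The epoch counts below are invented purely for illustration, not measured:

```python
# epoch counts are made-up illustrations of the trade-off, not measurements
for pop, epochs in [(1000, 50), (200, 120), (25, 2000)]:
    print(f"pop {pop:4d} x {epochs:4d} epochs = {pop * epochs:6d} ff calls")
# pop 1000 x   50 epochs =  50000 ff calls
# pop  200 x  120 epochs =  24000 ff calls
# pop   25 x 2000 epochs =  50000 ff calls
```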

I would also like to advise this: keep an additional population into which you put the best individuals from each epoch (I call it the "Epoch's Gene Pool", or GE). When mating, take individuals from both the current population and the GE. This drastically reduces the number of ff runs. Not to be confused with elitist selection.
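
The GE idea is described here only in outline, so the sketch below fills the gaps with common defaults: one-point crossover, Gaussian mutation, truncation selection, and a pool of 20. All of those are assumptions for illustration, not the actual implementation:

```python
import random

def evolve(fitness, genome_len, pop_size=200, epochs=50,
           pool_size=20, mut_rate=0.02, mut_step=0.5):
    # Individuals are (score, genome) pairs: caching the score ensures the
    # expensive fitness function runs exactly once per new genome.
    def make(genome):
        return (fitness(genome), genome)

    pop = [make([random.uniform(-1.0, 1.0) for _ in range(genome_len)])
           for _ in range(pop_size)]
    gene_pool = []  # "Epoch's Gene Pool": the best individual of every epoch

    for _ in range(epochs):
        pop.sort(key=lambda ind: ind[0], reverse=True)
        gene_pool.append(pop[0])                   # this epoch's champion
        gene_pool.sort(key=lambda ind: ind[0], reverse=True)
        del gene_pool[pool_size:]                  # keep the pool bounded

        # parents are drawn from the current population AND the gene pool
        parents = pop[:pop_size // 2] + gene_pool
        children = []
        while len(children) < pop_size:
            (_, a), (_, b) = random.sample(parents, 2)
            cut = random.randrange(1, genome_len)         # one-point crossover
            child = [w + random.gauss(0.0, mut_step)      # Gaussian mutation
                     if random.random() < mut_rate else w
                     for w in a[:cut] + b[cut:]]
            children.append(make(child))
        pop = children

    return gene_pool[0]  # (best_score, best_genome)
```

For example, evolve(lambda g: -sum(x * x for x in g), genome_len=10) drives all ten genes toward zero.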

 
IlyaA >> :


What do you know about convolutional nets? They have at least four layers. Four layers.

So we've smoothly switched to a first-name basis? Okay.

I don't know what a convolutional net is. Why four layers? Can you explain, to me and to those who think that one internal (hidden) layer is enough? There is no need to complicate the algorithm; it is hard enough to compute as it is.

 
joo >> :

And your GA is implemented in what? MQL?

 
joo >> :

So we've smoothly switched to a first-name basis? Okay.

I don't know what a convolutional net is. Why four layers? Can you explain, to me and to those who think that one internal (hidden) layer is enough? There is no need to complicate the algorithm; it is hard enough to compute as it is.


There's a book by Haykin, "Neural Networks". Do you have it?
 

to dentraf

MQL4

to IlyaA

Yes, I do. And also about 200-300 books by various authors. But I figured I would master NN and GA on my own faster than I could read that whole library. And so it turned out. Faster.

By mastering, I mean practical application, not the mastery of terminology.

 
joo >> :

If you take very many individuals, around 1000, the best individual is found very quickly in terms of the number of epochs, but the fitness function is run 1000*n times, where n is the number of epochs. That is not good: it takes a very long time.

If you take too few individuals in the population, say 10-25, the gene pool is too small for the search, and the search time grows, again because the number of ff runs increases.

The optimum, I think, is 200 individuals in the population.

Thank you. Very detailed. Basically yes: since you've already run the algorithm several times with different parameters, we'll use your results. So 200 it is; let's keep it that way.

Now the next point. We should look for the profitable "trick" (a combination of candlesticks and indicators), searching for it not with our eyes but with the perceptron. Let it build the linearly separable groups for us. Search criterion: Profit => max. We stop whenever we choose. Then we analyze the weights and identify the "trick". Then a normal indicator and a trading system. Quite complicated, but only at first glance. Digging through the weights is very interesting (at least to me).

A question :) I have to run 5 years of history on candlesticks + indicators (optional) through each individual, and there are now 200 of them in every population. That is a HUGE resource consumption, and besides, we don't know when we will stop. Let's try to reformulate the problem, or otherwise preserve the most important property of this design: detection of the "trick" by the machine.
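
A back-of-the-envelope estimate of what that plan costs. Every number below is an assumption (bar count for H1 data, epoch count guessed), but it shows the scale:

```python
bars_per_year = 6000   # roughly one year of hourly (H1) bars; an assumption
years = 5
pop_size = 200
epochs = 500           # unknown in advance; guessed for illustration

evaluations = bars_per_year * years * pop_size * epochs
print(f"{evaluations:,} network forward passes")  # 3,000,000,000
```

Three billion forward passes through a 360,800-weight net is why cutting the number of ff runs, as discussed above, matters so much.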