Hybrid neural networks.

 
joo wrote >>

Try writing an Expert Advisor on this indicator. I think the result will surprise you. Unpleasantly...

It is silly to write an Expert Advisor on a single indicator like this.

 
gumgum >> :

I train it at the indicator's initialisation stage. After that it thinks for itself...


In init(), genetics or backpropagation?
 
IlyaA wrote >>

In init(), genetics or backpropagation?

Backpropagation. The terminal hangs for about 7 seconds.

 
gumgum >> :

It is silly to write an Expert Advisor on a single indicator like this.

I didn't say it was the only one.

 
gumgum >> :

It is silly to write an Expert Advisor on a single indicator like this.


Look, be sure to check for overtraining. When you are done, you need to work out what the network has actually learned. To do that, dump the weights and analyse the neuron activations. I don't know how to do that in MT4, but it is very important. You should not trade on a real account before you have found out what the signal actually is.
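There is no built-in tool for this in MT4, but one low-tech option is simply to write the hidden-layer activations to a CSV on every bar and inspect them in a spreadsheet. The sketch below only illustrates that idea; the function name, file name and array layout are my own assumptions, not anything from this thread.

```
// Hypothetical helper: append one CSV row of hidden-layer activations per bar,
// so they can be inspected outside MT4 (e.g. to spot neurons that are always
// saturated or always silent).
void DumpActivations(double &hidden[], datetime barTime)
  {
   int h = FileOpen("nn_activations.csv", FILE_CSV|FILE_READ|FILE_WRITE, ';');
   if(h < 0) return;
   FileSeek(h, 0, SEEK_END);                      // append, don't overwrite
   string row = TimeToStr(barTime);
   for(int i = 0; i < ArraySize(hidden); i++)
      row = row + ";" + DoubleToStr(hidden[i], 6);
   FileWrite(h, row);                             // one line per bar
   FileClose(h);
  }
```

Neurons that sit at the same extreme value over the whole history are a typical hint that the inputs or the training procedure need another look.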
 
gumgum >> :

Backpropagation. The terminal hangs for about 7 seconds.

Here's what I do: I run a training script that writes all of the network's parameters to a file, and the indicator reads them from that file. I'm now fine-tuning the automatic retraining, but the principle stays the same: the indicator runs continuously and checks the file for changes on every bar.
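As a rough illustration of that weights-in-a-file arrangement, here is a minimal MQL4 sketch of the indicator side. Everything concrete in it (the file name nn_weights.csv, the flat weight layout, re-reading once per new bar) is an assumption made for the example, not the poster's actual code.

```
#property indicator_separate_window
#property indicator_buffers 1

double Out[];             // indicator output buffer
double weights[];         // network parameters loaded from the file
datetime lastBar = 0;

// Load a flat list of numbers from the CSV written by the training script.
bool LoadWeights(string fname)
  {
   int h = FileOpen(fname, FILE_CSV|FILE_READ, ';');
   if(h < 0) return(false);
   int n = 0;
   ArrayResize(weights, 0);
   while(!FileIsEnding(h))
     {
      double w = FileReadNumber(h);
      ArrayResize(weights, n + 1);
      weights[n] = w;
      n++;
     }
   FileClose(h);
   return(n > 0);
  }

int init()
  {
   SetIndexBuffer(0, Out);
   LoadWeights("nn_weights.csv");
   return(0);
  }

int start()
  {
   // Re-read the file once per new bar so changes made by the training
   // script are picked up without reloading the indicator.
   if(Time[0] != lastBar)
     {
      lastBar = Time[0];
      LoadWeights("nn_weights.csv");
     }
   // ... feed the inputs through the network using weights[] and fill Out[] ...
   return(0);
  }
```

The training script writes the same file in whatever layout the indicator expects; re-reading it only when Time[0] changes keeps the per-tick cost negligible.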

 

I was just trying to attach a network to this indicator, the bottom one (Waytoway).

 
gumgum >> :

I want to attach a network to this indicator,

Waytoway.

Is the indicator worth it? In one of my experiments I did the opposite: I built indicators from the network's outputs.

 
IlyaA >> :
Report back on the noise.

Answered. A couple of posts above on the previous page.

 
joo >> :

If you meant suppressing the noise by averaging, that is not a good idea.

Extrapolating a moving average of the time series is a dead end, and not the best way to use a NN.

You don't have to lobotomise the brain to make it smarter; it needs to be trained properly, without averaging filters in the process. What counts as lobotomising depends on the case, though: I have no idea what you are feeding into the input. Maybe 20 neurons is a lot, or maybe 10,000 is not enough. You shouldn't try to make the NN memorise one or two "tricks". A properly trained network can infer things it was never shown from the scarce information available to it.

"Don't read too many books "C - I don't remember who said....


Didn't the task I described grab you? Apparently not. But it comes up everywhere: most radio-electronic equipment is built on this principle.

We don't extrapolate or average, we isolate the signal. Integrating out the noise is what works here.

How do people learn? :) They study one subject, then another; each topic is learned individually, and then comes generalisation. The other way, your network will just memorise the picture by rote and won't generalise anything. It will become narrow-minded.

"Don't read too many books", so what do you suggest instead? Watching TV and banging your head against the wall?