Machine learning in trading: theory, models, practice and algo-trading - page 631

 
forexman77:

Do you have three neurons at the input of the second layer, processed by a sigmoid? How do you select the weights on the second layer - is the range, for example, from -1 to 1 in steps of 0.1?

In my network the number of deals dropped after I added the second layer, and the result did not improve much. Unlike when I simply fitted a perceptron with 9 inputs and one output neuron, then took another independent perceptron and fitted it with the saved settings of the first one, and so on. A sketch of one reading of that cascade follows below.
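For what it's worth, here is a minimal Python sketch of one reading of that cascade: fit the first perceptron, freeze it, then fit a second one that sees the first one's output as an extra input. The toy data, the greedy grid search, and the combination scheme are all my own stand-ins for the strategy-tester optimization, not the poster's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 9))                                      # 9 inputs per bar (toy data)
y = np.sign(X @ rng.normal(size=9) + 0.1 * rng.normal(size=500))   # +1/-1 target direction

def accuracy(X, y, w, i, v):
    """Accuracy of sign(X @ w) with coordinate i of w set to v."""
    w = w.copy(); w[i] = v
    return (np.sign(X @ w) == y).mean()

def fit_weights(X, y, steps=np.round(np.arange(-1, 1.01, 0.1), 1)):
    """Greedy coordinate search over the [-1, 1] grid with step 0.1 -
    a crude stand-in for what the strategy-tester optimizer does."""
    w = np.zeros(X.shape[1])
    for _ in range(3):                          # a few sweeps over the coordinates
        for i in range(len(w)):
            w[i] = max(steps, key=lambda v: accuracy(X, y, w, i, v))
    return w

w1 = fit_weights(X, y)                          # first perceptron, then frozen
X2 = np.column_stack([X, np.sign(X @ w1)])      # its output becomes a 10th input
w2 = fit_weights(X2, y)                         # second perceptron fitted on top
print((np.sign(X @ w1) == y).mean(), (np.sign(X2 @ w2) == y).mean())
```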

The 4th neuron processes the outputs of the first three, i.e. 3 more weights.

Yes, from -1 to 1 in increments of 0.1, but not a sigmoid - a hyperbolic tangent.

Now I've tried to make an intermediate layer with the same weights as the first input layer - the number of trades fell and the quality also fell, and optimizing 9 extra weights is already too much :)

Your version sounds good... I was thinking about training a conventional NN on the optimization results... I'll have to try it. But I'm getting bored with this approach.
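A minimal sketch of the architecture being discussed: three tanh neurons over 9 inputs, plus a 4th tanh neuron over their outputs, with every weight on the [-1, 1] grid in steps of 0.1. The random weights and the input vector here are hypothetical placeholders; in practice the tester optimizes them:

```python
import numpy as np

def tiny_net(x, W1, w2):
    """3 first-layer neurons over 9 inputs, a 4th neuron over their outputs.
    Activation is tanh, as in the post; all weights live on the [-1, 1]
    grid with step 0.1 that the tester sweeps."""
    h = np.tanh(W1 @ x)        # W1: 3x9, three hidden outputs in (-1, 1)
    return np.tanh(w2 @ h)     # w2: the 3 extra weights of the 4th neuron

# Hypothetical example: random grid weights and one input vector.
rng = np.random.default_rng(1)
grid = np.round(np.arange(-1, 1.01, 0.1), 1)
W1 = rng.choice(grid, size=(3, 9))
w2 = rng.choice(grid, size=3)
x = rng.normal(size=9)
signal = tiny_net(x, W1, w2)   # e.g. trade long if > 0, short if < 0
print(signal)
```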

 
Maxim Dmitrievsky:

The 4th neuron processes the outputs of the first three, i.e. 3 more weights.

Now I've tried to make an intermediate layer with the same weights as the first input layer - the number of trades fell and the quality also fell, and optimizing 9 extra weights is already too much :)

Your version sounds good... I was thinking about training a conventional NN on the optimization results... I'll have to try it. But I'm getting bored with this approach.

My impression is that I should make an indicator out of the first layer and visually see which weights to apply on the second layer. Or process it with a sigmoid (then you get values from roughly 0.2 to 0.9), so you can take small weights and don't need a large range.

Plus, an additional weight not tied to any input is simply the bias weight, as Dr. Trader told me. Bias somewhat improves results: for example, profitability was 1.7 and with bias it became 1.8. https://www.mql5.com/ru/forum/86386/page505#comment_5856699
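A small illustration of both points - the sigmoid squashing a moderate input range into a narrow band, and the bias being just one more weight attached to a constant input of 1. This is a sketch; the numbers are illustrative, not taken from the posts:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def neuron(x, w, b):
    """A neuron with a bias: b is simply an extra weight multiplied by a
    constant input of 1 rather than by a feature (Dr. Trader's point)."""
    return sigmoid(np.dot(w, x) + b)

# The sigmoid compresses a moderate pre-activation range into a narrow band,
# so the next layer's weights can stay small:
z = np.array([-1.5, -1.0, 0.0, 1.0, 2.0])
print(sigmoid(z))   # roughly [0.18, 0.27, 0.50, 0.73, 0.88] - about the 0.2-0.9 band mentioned
```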

Machine learning in trading: theory and practice (trading and not only)
  • 2017.10.04
  • www.mql5.com
Good afternoon everyone. I know there are machine learning and statistics enthusiasts on this forum...
 
forexman77:

My impression is that I should make an indicator out of the first layer and visually see which weights to apply on the second layer. Or process it with a sigmoid (then you get values from roughly 0.2 to 0.9), so you can take small weights and don't need a large range.

Plus, an additional weight not tied to any input is simply the bias weight, as Dr. Trader told me. Bias somewhat improves results: for example, profitability was 1.7 and with bias it became 1.8. https://www.mql5.com/ru/forum/86386/page505#comment_5856699

It's hard to do something purely by improvisation :) but for now the main problem remains - overfitting
 
Maxim Dmitrievsky:
It's hard to do something purely by improvisation :) but for now the main problem remains - overfitting

Well, retraining 5-10 neurons is not a problem ). You seem to have about that many.

I had an interesting example on my computer: you take a piece of speech, generate noise and superimpose it on the speech. Then a simple MLP is trained, and we hear almost clean speech again.

I was really stunned by this, although a similar noise canceller is described in Haykin as an application example.
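A rough reconstruction of that demo under stated assumptions: a synthetic signal stands in for real speech, and scikit-learn's MLPRegressor learns to map a short window of the noisy signal back to the clean sample at the window's center:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Toy stand-in for the speech demo: a clean signal, noise superimposed,
# and an MLP trained to recover the clean sample from a noisy window.
rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 4000)
clean = np.sin(3 * t) * np.sin(0.5 * t)          # pretend this is speech
noisy = clean + 0.3 * rng.normal(size=t.size)    # generated noise added on top

win = 16                                          # samples per input window
X = np.lib.stride_tricks.sliding_window_view(noisy, win)
y = clean[win // 2 : win // 2 + X.shape[0]]       # clean target at window center

mlp = MLPRegressor(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
mlp.fit(X[:3000], y[:3000])                       # train on the first part
denoised = mlp.predict(X[3000:])                  # almost-clean signal back out
print("noise var before:", np.var(X[3000:, win // 2] - y[3000:]).round(3),
      "after:", np.var(denoised - y[3000:]).round(3))
```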

 
Yuriy Asaulenko:

Well, retraining 5-10 neurons is not a problem ). You seem to have about that many.

I had an interesting example on my computer: you take a piece of speech, generate noise and superimpose it on the speech. Then a simple MLP is trained, and we hear almost clean speech again.

I was really stunned by this, although a similar noise canceller is described in Haykin as an application example.

Yes, but it's only one of the bots... if it doesn't work, I switch to another one... and so on, until synthesis happens :) or I get bored
 
Please do not stoop to insults, and watch your use of profanity.
 
It looks like I missed some trash - a pity I didn't have time to take part
 
Maxim Dmitrievsky:
It looks like I missed some trash - a pity I didn't have time to take part

They deleted my post where I replied to Mr. Terentyev about time-series tests. I said that the local article writers are just amateurs, because they don't understand that with 70-80% accuracy the Sharpe ratio would be over 20, while they get some nonsense.
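A quick back-of-envelope check of that claim, assuming i.i.d. trades with symmetric +1/-1 payoffs and ignoring costs (my simplifications, not the poster's):

```python
import numpy as np

def annualized_sharpe(p, trades_per_year):
    """Sharpe ratio for i.i.d. trades with accuracy p and symmetric +1/-1
    payoffs (a deliberate simplification; costs and fat tails ignored)."""
    mu = 2 * p - 1                    # mean profit per trade
    sigma = np.sqrt(1 - mu ** 2)      # its standard deviation
    return mu / sigma * np.sqrt(trades_per_year)

for p in (0.6, 0.7, 0.8):
    print(p, round(annualized_sharpe(p, 1000), 1))
# At 70-80% accuracy and ~1000 trades a year this gives roughly 14-24,
# i.e. the claimed "over 20" at the upper end of that accuracy range.
```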

 
toxic:

They deleted my post where I replied to Mr. Terentyev about time-series tests. I said that the local article writers are just amateurs, because they don't understand that with 70-80% accuracy the Sharpe ratio would be over 20, while they get some nonsense.

ah, ok )

 

I'm interested in reinforcement learning, so I found an interesting article; I'm trying to work through it and bolt it onto the bot

https://hackernoon.com/the-self-learning-quant-d3329fcc9915

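A minimal tabular Q-learning sketch in the spirit of the linked article (the article itself trains a neural network for the Q-function; the discretized state, toy random-walk prices and reward here are my own simplifications):

```python
import numpy as np

rng = np.random.default_rng(0)
prices = np.cumsum(rng.normal(size=2000)) + 100    # toy random-walk prices
n_states, actions = 8, (-1, 0, 1)                  # short / flat / long
Q = np.zeros((n_states, len(actions)))             # action values per state
alpha, gamma, eps = 0.1, 0.95, 0.1                 # learning rate, discount, exploration

def state(i):
    """Discretize the last price change into one of n_states buckets."""
    r = prices[i] - prices[i - 1]
    return int(np.clip((r + 2) / 4 * n_states, 0, n_states - 1))

for i in range(1, len(prices) - 1):
    s = state(i)
    # Epsilon-greedy action selection: mostly exploit, sometimes explore.
    a = rng.integers(len(actions)) if rng.random() < eps else int(np.argmax(Q[s]))
    reward = actions[a] * (prices[i + 1] - prices[i])   # P&L of holding the position
    s_next = state(i + 1)
    # Standard Q-learning update toward reward plus discounted best next value.
    Q[s, a] += alpha * (reward + gamma * Q[s_next].max() - Q[s, a])

print(Q.round(2))   # learned action values per discretized state
```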

The Self Learning Quant – Hacker Noon
  • 2016.09.27
  • Daniel Zakrisson
  • hackernoon.com
Recently there has been a lot of attention around Google DeepMind's victory over Lee Sedol in the board game Go. This is a remarkable…