Machine learning in trading: theory, models, practice and algo-trading - page 604
That was a joke. No offense meant - tweak the steepness yourself and see how it affects the result.
If the inputs are adequate for the task, you can solve it on "one neuron".
In the context of ML, that is the ideologically correct approach.
--------------
A professor on deep networks - youtu.be/qx3iM2aa2yU
At 31 min: "There's still not much science, but a lot of voodoo magic."
Stages of development characterized by a high rate of change have a special name: a jump.
An activation function (sigmoid, tanh and the others) is a jump (step) modified by imposing a limit on the rate of change.
How much more time will it take the "seekers" here to grasp the meaning of this fact...
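A quick way to see this claim: the logistic sigmoid with a growing steepness parameter converges to the Heaviside step, i.e. the "pure" jump. Below is a minimal sketch (my own illustration, not from the thread; the names sigmoid/heaviside and the parameter k are assumptions):

```python
# Minimal sketch: a sigmoid is a jump (step) with a limit on the rate of
# change. As the steepness k grows, sigmoid(k*x) approaches the Heaviside
# step at every x != 0. Names and the parameter k are illustrative.
import numpy as np

def sigmoid(x, k=1.0):
    """Logistic sigmoid with steepness k."""
    return 1.0 / (1.0 + np.exp(-k * x))

def heaviside(x):
    """The 'pure' jump: 0 for x < 0, 1 for x > 0."""
    return np.where(x > 0, 1.0, 0.0)

x = np.linspace(-1, 1, 10)          # grid chosen to avoid x == 0
for k in (1, 5, 50):
    gap = np.max(np.abs(sigmoid(x, k) - heaviside(x)))
    print(f"k = {k:3d}   max |sigmoid - step| = {gap:.4f}")
# The gap shrinks toward 0 as k grows: the sigmoid converges to the step.
```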
Well, it doesn't work on forex)
What is the point of making sense of anything without actual evidence of robustness?
I prefer statements like: here's the deposit growth curve (at least on a test)... and the rest of you are all m...s... then yes, no questions.
Do you realize what you just said?
I do.
And you can include steepness optimization in the learning process; I actually did that, though only for fuzzy logic. The steepness can have a big effect, yes.
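For illustration, here is a minimal sketch of what "including steepness in the learning process" can look like (my own toy example, not the poster's fuzzy-logic code; the slope parameter k, the data and the loss are all assumptions): the slope of sigmoid(k*w*x) is updated by gradient descent together with the weight.

```python
# Toy sketch of steepness optimization (illustrative, not the poster's
# code): fit a sharp 0/1 target with sigmoid(k * w * x), updating both
# the weight w and the steepness k by gradient descent on MSE.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 200)
y = (x > 0).astype(float)           # target is a sharp step, so a large
                                    # effective slope is optimal
w, k = 1.0, 1.0                     # weight and steepness, both trained
lr = 0.5
for _ in range(2000):
    z = k * w * x
    p = 1.0 / (1.0 + np.exp(-z))    # sigmoid output
    grad_z = (p - y) * p * (1 - p)  # d(MSE/2)/dz
    w -= lr * np.mean(grad_z * k * x)   # dz/dw = k * x
    k -= lr * np.mean(grad_z * w * x)   # dz/dk = w * x

print(f"learned w = {w:.2f}, k = {k:.2f}, effective slope k*w = {k*w:.2f}")
# Note: k and w enter only through the product k*w - a hint of the point
# made further down the thread that the weights can absorb the steepness.
```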
You gave a link to the article https://habrahabr.ru/post/322438/
If the error function of a neural network really looks the way it was plotted (the graph posted here was built with tanh):
Obviously, something similar could be built with sigmoid, but the steepness of the individual sections will be lower.
Since the sigmoid is less steep, you could probably do the same thing as with tanh, but you would need 3-5 times more of them, i.e. increase the number of neurons.
The sigmoid probably gave me a smaller error because, with tanh, the network didn't have enough neurons.
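For what it's worth, tanh and the logistic sigmoid are related by the exact identity tanh(x) = 2*sigmoid(2x) - 1, so a tanh neuron is just a sigmoid neuron with doubled input weights plus an output scale and offset; per neuron the two families are equally expressive, and differences in error usually come from training dynamics rather than capacity. A quick numeric check (my own snippet, not from the thread):

```python
# Verify the exact identity tanh(x) = 2*sigmoid(2x) - 1 numerically:
# any tanh network can be rewritten as a sigmoid network of the same
# size by rescaling weights and offsets (snippet added for reference).
import numpy as np

x = np.linspace(-3, 3, 7)
sigmoid = lambda t: 1.0 / (1.0 + np.exp(-t))
print(np.max(np.abs(np.tanh(x) - (2 * sigmoid(2 * x) - 1))))  # ~ 0.0
```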
Who has an opinion: is it better to learn trading on a paid basis or for free? And another question: is it worth spending money on a paid course at all?
I was thinking about the article https://www.mql5.com/ru/articles/497, where the steepness of the activation function is varied, and came to the conclusion that the network will find the right steepness by itself:
Let's look at the formula for the neuron's output, y = f(k * SUM(Wn * Xn)), where k is the steepness:
When training, the network must fit the multipliers Wn. If it is more profitable for the network to multiply the sum by 0.4, it will simply fit weights Wn each of which is already multiplied by 0.4. In other words, we just take the common multiplier out of the brackets, and it will be determined by itself through error minimization.
If anyone disagrees, correct me.
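A small numeric check of the argument above (my own sketch, not from the article; the vectors and k = 0.4 are arbitrary): a common multiplier k inside the activation's argument can be folded into the weights without changing the output.

```python
# Check that the common steepness multiplier can be absorbed into the
# weights: f(k * sum(w*x)) == f(sum((k*w) * x)). Values are arbitrary.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=5)               # inputs Xn
w = rng.normal(size=5)               # weights Wn
k = 0.4                              # common steepness multiplier

sigmoid = lambda t: 1.0 / (1.0 + np.exp(-t))
a = sigmoid(k * np.dot(w, x))        # steepness applied to the sum
b = sigmoid(np.dot(k * w, x))        # steepness folded into the weights
print(a, b, np.isclose(a, b))        # identical outputs
```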
Exactly. The NS will proportionally increase or decrease all the weights by the right factor (which is precisely the steepness), and will even pick up the right offset.
Anyway, for most tasks it doesn't matter.