1) The network can recover a function if the input data actually contain it. In your last experiment the averaging period depends on volatility, so the network should also have been given some estimate of that volatility; i.e. you may not have provided all the data necessary for recovery (see the sketch after this list).
2) You can squeeze everything you need out of an MLP. Use other networks only when you can mathematically prove that another architecture is better than an MLP.
3) NS2: fast, quality results, easy to port anywhere...
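On point 1, here is a minimal sketch assuming the "adaptive average" in question is something like Kaufman's AMA (the original post does not name the exact indicator). Its smoothing constant at each bar is driven by an efficiency ratio, which is itself a volatility measure over a separate window, and the value is recursive, so the last 20 prices alone do not fully determine it.

# Sketch of reply 1's point, assuming the adaptive average is something like
# Kaufman's AMA: the smoothing constant at each bar depends on an efficiency
# ratio (a volatility measure), so a network fed only raw prices must also be
# able to derive that ratio from its inputs -- otherwise the data are incomplete.
import numpy as np

def kama(prices, er_period=10, fast=2, slow=30):
    """Kaufman's adaptive moving average, standard formulation."""
    prices = np.asarray(prices, dtype=float)
    fast_sc, slow_sc = 2 / (fast + 1), 2 / (slow + 1)
    out = np.empty_like(prices)
    out[:er_period] = prices[:er_period]                        # seed the recursion
    for t in range(er_period, len(prices)):
        change = abs(prices[t] - prices[t - er_period])
        volatility = np.abs(np.diff(prices[t - er_period:t + 1])).sum()
        er = change / volatility if volatility > 0 else 0.0     # efficiency ratio
        sc = (er * (fast_sc - slow_sc) + slow_sc) ** 2          # adaptive smoothing constant
        out[t] = out[t - 1] + sc * (prices[t] - out[t - 1])     # recursive: depends on out[t-1]
    return out

# Because the value is recursive, it depends on the whole price history, not just
# the last 20 bars; feeding the efficiency ratio (or the bars it is computed from)
# alongside the prices gives the network the same ingredients the formula uses.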
The error on the test sample stops decreasing... I usually do at least 3-5 training runs, more when the result matters, with selection of the number of neurons in the layers (more precisely, in one layer): a few runs to see the spread and the minimum.
In my opinion, when the error on the test sample stops decreasing, it is most likely overtraining. How does the network behave out of sample (OOS) with such a minimal error on the test sample?
If the neurons are selected correctly, the network behaves exactly the same as on the training sample; moreover, with a 200,000-bar sample the same result is obtained with a much smaller training sample (more than 5 times smaller).
That is, by selecting neurons we can sometimes equalize the errors of the test and training samples.
If the neurons are selected incorrectly, the error on the test sample is somewhat larger, but it stays in line with the "general" (overall) sample.
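To make the "several trainings with selection of neurons in a layer" advice concrete, here is a minimal sketch using scikit-learn's MLPRegressor as a stand-in for whatever package the posters use; neuron_sweep, the candidate sizes and the number of runs are illustrative placeholders, not anyone's actual settings.

# Repeat training over a few hidden-layer sizes and random seeds and compare the
# spread of training vs. test error (X, y are whatever dataset you are fitting).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

def neuron_sweep(X, y, sizes=(5, 10, 20, 40), runs=5):
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=42)
    for n in sizes:
        errs = []
        for seed in range(runs):
            net = MLPRegressor(hidden_layer_sizes=(n,), max_iter=1000, random_state=seed)
            net.fit(X_tr, y_tr)
            errs.append((mean_squared_error(y_tr, net.predict(X_tr)),
                         mean_squared_error(y_te, net.predict(X_te))))
        errs = np.array(errs)
        print(f"{n:3d} neurons: train MSE {errs[:, 0].mean():.4g} +/- {errs[:, 0].std():.2g}, "
              f"test MSE {errs[:, 1].mean():.4g} +/- {errs[:, 1].std():.2g}")

# A size where train and test error stay close (as described above) is preferable
# to one where the test error is much larger -- the usual overfitting signal.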
Gentlemen, good afternoon. A question for experts in the field of neural networks. The bottom line is this: I installed Statistica and started my research with automatic neural networks, a multilayer perceptron. I set myself the goal of understanding how good neural networks are at finding patterns.

What did I do? I took the most ordinary LWMA (linearly weighted moving average) over the last 20 bars. As the target (output) I gave the last value of the LWMA, and as inputs the last 20 points on which that value depends. Obviously, a person knowing the last 20 points and the formula for calculating the LWMA would be able to restore its value 100%. The network did not know the formula, and its task was to figure it out in its own way. Result: the network restored the LWMA 100%, i.e. it understood how the LWMA is constructed. We can consider that it coped with the task perfectly, i.e. if there is a pattern, the network really does find it. Then I performed a similar experiment with EMA, SAR and oscillators. The result was the same: 100%.

After that I decided to complicate the task. I took the adaptive moving average (AMA). Let me remind you that it changes its averaging parameter depending on market volatility, and the volatility, in turn, is calculated over a certain number of bars. I fed in all the bars needed to build the AMA and started the network. The result was much worse than 100%, even though a person knowing the AMA formula and possessing all the points would be able to reconstruct the AMA 100%. In effect the network failed; we are talking about automatic neural networks.
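For reference, the first experiment is easy to reproduce outside Statistica. Below is a minimal sketch assuming a 20-period LWMA on a synthetic random-walk price series and scikit-learn's MLPRegressor in place of Statistica's automatic networks; the data and settings are illustrative, not the poster's actual setup.

# Can an MLP recover a 20-period LWMA when given the same 20 prices the formula uses?
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
prices = 100.0 + np.cumsum(rng.normal(0, 0.1, 20_000))   # synthetic random-walk price series

period = 20
weights = np.arange(1, period + 1, dtype=float)
weights /= weights.sum()                                  # LWMA weights 1..20, newest bar heaviest

# Inputs: the last 20 prices of each window; target: the LWMA of those same prices.
X = np.lib.stride_tricks.sliding_window_view(prices, period)
y = X @ weights

split = len(X) * 3 // 4
net = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0))
net.fit(X[:split], y[:split])
print("out-of-sample R^2:", r2_score(y[split:], net.predict(X[split:])))
# Because the LWMA is a fixed linear combination of its inputs, the R^2 here
# typically comes out very close to 1 -- the "100%" described above.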
Conclusion and questions to the experts in the field.
1) Did I understand correctly that a neural network is unable to reconstruct a function if the function is inherently dynamic, as with the AMA, even if all the data needed for the calculation are provided? If the formula is rigidly static, as with the LWMA or EMA, there is no problem.
2) If I am wrong, which networks should be used? I used an MLP in Statistica.
3) I have heard the opinion that there is no fundamental difference between automatic networks and networks of one's own, er... design, if I may say so. Is this really the case?
4) What networks and what programs would you advise for application to financial markets, in particular for the task I described, i.e. restoring a value from all the known data?
Respectfully, mrstock