Machine learning in trading: theory, models, practice and algo-trading - page 514
https://www.searchengines.ru/yandeks-vylozhil-v-dostup-biblioteku.html
The R package is there, great.
Installation: https://tech.yandex.com/catboost/doc/dg/concepts/r-installation-docpage/
1) Pre-install Visual C++ 2015 Build Tools: http://landinghub.visualstudio.com/visual-cpp-build-tools
Why R? I don't like it... command line or DLL :)
I made a neural network regression predictor that displays, as a histogram, the current price-minus-prediction for n bars ahead (15 in this case). It trains on 5000 bars and retrains itself every 500 bars. It looks good at first glance, but of course it doesn't work as fast as I'd like, because I actually want to train several of them :)
So if you look at the minute bars, the dispersion is fairly small. Of course it spikes on the extreme outliers, but on average it stays within a range of about 100 points (5-digit quotes).
The most interesting spots are marked with arrows.
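The poster's code is not shown; below is a minimal, hypothetical Python/scikit-learn sketch of the same walk-forward scheme (synthetic random-walk prices, a small MLP, retraining every 500 bars, predicting 15 bars ahead). The feature set (last 10 price diffs) and all network parameters are my assumptions, not the poster's:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

HORIZON = 15         # predict 15 bars ahead, as in the post
TRAIN_LEN = 5000     # bars in each training window
RETRAIN_EVERY = 500  # retrain the model every 500 new bars
LAGS = 10            # hypothetical feature count: last 10 price changes

rng = np.random.default_rng(0)
prices = np.cumsum(rng.normal(0.0, 1.0, 6000)) + 100.0  # synthetic random walk

def make_dataset(window):
    """Features: the last LAGS price diffs; target: price change HORIZON bars ahead."""
    d = np.diff(window)
    X, y = [], []
    for t in range(LAGS, len(window) - HORIZON):
        X.append(d[t - LAGS:t])                    # diffs ending at bar t
        y.append(window[t + HORIZON] - window[t])  # future move
    return np.array(X), np.array(y)

model = None
preds, actual = [], []
for t in range(TRAIN_LEN, len(prices) - HORIZON):
    if model is None or (t - TRAIN_LEN) % RETRAIN_EVERY == 0:
        X, y = make_dataset(prices[t - TRAIN_LEN:t])
        model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=200,
                             random_state=0).fit(X, y)
    x_now = np.diff(prices[t - LAGS:t + 1]).reshape(1, -1)  # latest LAGS diffs
    preds.append(model.predict(x_now)[0])
    actual.append(prices[t + HORIZON] - prices[t])

residuals = np.array(actual) - np.array(preds)  # histogram this, as in the post
```

The residuals array is what the poster histograms; the retraining check keeps the model no more than 500 bars stale.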
Of course it does not work as fast as I would like it to,
On ALGLIB?
On ALGLIB?
Yep
Of course you could plug in an external NN or forests, e.g. CatBoost on GPU, but so far I'm too lazy and have no time.
And the more accurate the model, the heavier it is to run in the tester.
ALGLIB is terribly slow at training.
I fed ALGLIB a 240-50-1 network, waited 2 days, gave up and shut it down.
A 70-5-1 network trained in half an hour. And R's nnet trained in under a minute on the same data. So now I'm sitting down with R to figure it out.
RF is more or less OK: 50 inputs, 5000 samples, 100 trees, about 25 sec on average (on a laptop). But even that is too long for optimization. Yes, the NN really is slow, but it's an ordinary MLP; you shouldn't expect anything else from it.
I need everything to train in a second at most — where do I get that? )
Once again I was convinced that forests cannot extrapolate, no matter how many people insist otherwise:
Above the red line are 150 training prices (entries and exits). After that the market began to fall and new prices appeared that were not in the training sample (were never fed to the input). The forest started outputting the lowest price it knew at training time, i.e. 1.17320, which corresponds to the horizontal line. Because of this, the residuals histogram was skewed.
Forests do NOT know how to EXTRAPOLATE. All the smart ones get held back a year to re-learn the math.
- Like decision trees, the algorithm is totally incapable of extrapolation
http://alglib.sources.ru/dataanalysis/decisionforest.php
Prices are not fed into the model without some transformation.
When extrapolating, forests just return the nearest known value. A neural net or a linear model will at least compute something from its internal formulas. But in practice all of these models will blow the account in that situation, so there is no real difference.