Machine learning in trading: theory, models, practice and algo-trading - page 3036
It is very interesting to observe how perceived visual and auditory patterns turn into perceptual rubbish while still retaining their structure.
Maybe the brain has a limit on perceiving complex patterns.....
Randomness without repetition :)
So if you obtain a stable state via the FF (fitness function), will it characterise a stable TS (trading system)? Everyone realises that this is "kurwafitting", i.e. curve fitting.
The other variants make it almost random. You have to try different variants.
From this variant, too, you will get shit ...
Watching whether the balance grows beautifully or not is not enough.
Well, here the approach itself does not even allow for the possibility of non-random success :) Rules that are tested for stability look better.
Rule stability falls apart on OOS just as readily as a fitted balance curve does.
I've done all this before, in different forms, many times....
But I still think that everyone should know how to write an FF and use an AO (optimisation algorithm)...
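To make the FF/AO point concrete, here is a minimal sketch, assuming a toy moving-average crossover rule on a synthetic random walk. The rule, the data, and the profit-per-drawdown score are all illustrative assumptions, not the poster's system, and a plain grid search stands in for the genetic optimiser usually meant by "AO".

```
# Hedged sketch of a fitness function (FF) driven by an optimisation loop (AO).
# Everything here (MA-crossover rule, synthetic prices, score) is an assumption
# for illustration; a grid search stands in for a genetic optimiser.
fitness <- function(p, prices) {
  fast <- round(p[1]); slow <- round(p[2])
  if (fast < 2 || slow <= fast) return(-Inf)               # reject degenerate pairs
  ma_fast <- stats::filter(prices, rep(1 / fast, fast), sides = 1)
  ma_slow <- stats::filter(prices, rep(1 / slow, slow), sides = 1)
  pos <- sign(as.numeric(ma_fast) - as.numeric(ma_slow))   # +1 long, -1 short
  pnl <- diff(prices) * head(pos, -1)                      # hold previous bar's position
  pnl[is.na(pnl)] <- 0
  equity <- cumsum(pnl)
  dd <- max(cummax(equity) - equity)                       # maximum drawdown
  sum(pnl) / (dd + 1e-8)                                   # FF: profit per unit drawdown
}

set.seed(1)
prices <- cumsum(rnorm(1000))                              # pure random walk: no edge exists
grid <- expand.grid(fast = seq(2, 30, 2), slow = seq(10, 120, 10))
grid$score <- apply(grid[, 1:2], 1, fitness, prices = prices)
grid[which.max(grid$score), ]   # a "stable state" found on noise == kurwafitting
```

On a random walk there is nothing to find, so whatever "stable" optimum the loop reports is exactly the curve fitting the post above warns about.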
There is a huge number of different models, each with its own tuning parameters. Fitting them produces a flood of results.
One should try to improve the prediction by selecting the best predictions from the models, for example by using an ensemble of models via caretEnsemble::
If you build a complete trading system, from preprocessing and predictor selection through to the EA, you will find that at each step (and there are many of them) there are tricks that can reduce the "out of sample" prediction error below 20%, with the same ratio of profitable to losing trades in the tester.
Unfortunately, this small, painstaking work gets replaced with rubbish.
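As a concrete reading of the caretEnsemble suggestion above, here is a minimal stacking sketch with the caret/caretEnsemble packages. The synthetic data and the choice of base learners are assumptions for illustration, not the poster's set-up.

```
# Minimal stacking sketch with caret/caretEnsemble; the data and the base
# learners are illustrative assumptions only.
library(caret)
library(caretEnsemble)

set.seed(42)
n <- 500
X <- data.frame(matrix(rnorm(n * 5), n, 5))
y <- factor(ifelse(X[[1]] + X[[2]] + rnorm(n) > 0, "up", "down"))

ctrl <- trainControl(method = "cv", number = 5,
                     savePredictions = "final", classProbs = TRUE)

# Base models trained on identical resampling folds
models <- caretList(x = X, y = y, trControl = ctrl,
                    methodList = c("glm", "rf", "glmnet"))

# Blend their cross-validated predictions with a meta-model
ens <- caretEnsemble(models)
summary(ens)

# The out-of-sample check would be predict(ens, newdata = X_oos), where X_oos
# is held-out data -- that is where the <20% error claim has to be tested.
```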
You've already posted the same thing for the 40th time, over and over ....
The only question is: where is the robot?
СанСаныч Фоменко #:
the "out-of-sample" prediction error is below 20%, and in the tester there will be the same ratio of profitable and losing trades.
Classification error is not the indicator. The indicator is the balance and the balance line, over 5 years or more.
I showed you the balance with 8.3% classification error on the OOS. https://www.mql5.com/ru/forum/86386/page3008#comment_46150275
It was profitable, but I still threw that model in the bin.
Show your balance line with 20% error on OOS. It will be an example to strive for.
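The point about judging a model by its balance line rather than its error rate can be sketched as follows; the per-bar returns and the simulated 80%-accurate classifier are assumptions, not the poster's model.

```
# Hedged sketch: build the balance line a classifier implies instead of only
# reporting its error rate. Returns and the simulated classifier are assumptions.
set.seed(7)
n     <- 2000
ret   <- rnorm(n, 0, 0.001)                       # per-bar OOS returns (stand-in)
truth <- ifelse(ret > 0, 1, -1)                   # true direction of each bar
pred  <- ifelse(runif(n) < 0.8, truth, -truth)    # right ~80% of the time (20% error)

pnl     <- pred * ret                             # trade in the predicted direction
balance <- cumsum(pnl)                            # the balance line itself
plot(balance, type = "l",
     main = sprintf("Balance line at %.1f%% classification error",
                    100 * mean(pred != truth)))

max(cummax(balance) - balance)                    # max drawdown: accuracy alone
                                                  # says nothing about this shape
```

The same error rate can produce very different balance lines depending on which bars the errors land on, which is why the curve over years of OOS, not the percentage, is the judge.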
There is no robot, because the teacher had to be rejected and there were technical problems with the Expert Advisor. Now all the technical problems have been overcome.
I consider your idea of using the balance as the teacher unworkable, although I liked it at first. The balance can NOT be the teacher, because it does NOT exist yet. The teacher has to be designed more carefully than I did before.
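For one common reading of what "designing a teacher" means in practice, here is a minimal sketch that labels bars by the forward return over a fixed horizon; the horizon k, the threshold thr, and the synthetic prices are all assumptions for illustration.

```
# Hedged sketch of constructing a teacher (target labels) from forward returns.
# Horizon k, threshold thr and the synthetic prices are illustrative assumptions.
make_teacher <- function(prices, k = 5, thr = 0.001) {
  fwd <- c(tail(prices, -k), rep(NA, k)) / prices - 1   # k-bar-ahead return
  lab <- ifelse(fwd >  thr, "up",
         ifelse(fwd < -thr, "down", NA))                # NA = flat/ambiguous bars
  factor(lab, levels = c("down", "up"))
}

set.seed(3)
prices  <- 100 * exp(cumsum(rnorm(1000, 0, 0.01)))      # synthetic price path
teacher <- make_teacher(prices, k = 5)
table(teacher, useNA = "ifany")   # class balance; the last k bars stay unlabelled
```

Bars the teacher cannot label honestly (flat moves, the final k bars) are left NA rather than forced into a class, which is the kind of care in target design the post is asking for.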