Machine learning in trading: theory, models, practice and algo-trading - page 3297
I can't help sharing the stunning news (quite timely for me): an even stronger algorithm than SSG has been found.
This is a good thing indeed.
You're confusing entities. You are trying to fit optimisation to approximation, or vice versa.
Approximation and optimisation are different approaches to solving machine learning problems.
If I understood correctly, in algo-trading approximation is the creation of the trading system (TS) itself. I want a martingale - created; I want a scalper - created; I want patterns - created, etc. You can have ML methods create something.
And optimisation is the tuning/study of an already created TS.
Since, unlike a human, ML is also involved in creating TSs through the number cruncher, we can combine approximation and optimisation. Did I get that right?
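The distinction above can be sketched in code. This is a minimal toy illustration, not a real strategy: the data is synthetic, and the specific rules (a linear model on lagged returns, a momentum rule with lookback `k`) are assumptions chosen only to contrast the two ideas - approximation lets the model *create* the rule, optimisation *tunes* a rule that already exists.

```python
import numpy as np

rng = np.random.default_rng(0)
prices = np.cumsum(rng.normal(0, 1, 500)) + 100  # synthetic price series

# --- Approximation: the model CREATES the rule ---
# Fit a linear map from the last 3 returns to the next return (least squares).
returns = np.diff(prices)
X = np.column_stack([returns[i:len(returns) - 3 + i] for i in range(3)])
y = returns[3:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
signal = np.sign(X @ coef)  # the learned "trading rule"

# --- Optimisation: TUNE a rule that already exists ---
# Fixed rule: go long when k-bar momentum is positive; search only over k.
def pnl(k):
    mom = prices[k:] - prices[:-k]   # k-bar momentum
    sig = np.sign(mom[:-1])          # previous bar's signal
    return float(np.sum(sig * np.diff(prices[k:])))

best_k = max(range(1, 21), key=pnl)  # grid search = optimisation
```

Combining the two, as suggested above, would mean letting ML generate candidate rules (approximation) and then running the optimiser over their parameters.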
Exactly
Interestingly, in terms of the quantity of data (quotes) available, the human brain (as a neural network) compared to ML is like an infusorian compared to a human.
And yet such "primitive" humans have proven that they can create fairly good working trading systems. It turns out that creating a working TS does not require such a huge amount of data.
It is a mystery to me how, for example, people arrived at working scalper models. It was done almost entirely without number crunchers.
The scenario must have been something like this:
Apparently, at some subconscious level, the human brain is still able to find "patterns" in an extremely small amount of data. You can't call it luck. It's a mystery.
150 billion neurons, each with not one output but many. AI will not reach that level soon, if ever.
A NN is compared in intelligence to a cockroach: run, bite, run away.
One-shot learning: a large NN (the brain), pre-trained on unrelated data, is then trained on just a few examples. If the model has already learnt the laws of the world, it easily cracks a new task at a cursory glance.
Here you yourself have shown that a brain pre-trained on unrelated data solves specific problems it did not know before, and yet you say that extra "knowledge" is not needed.
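The one-shot idea above can be sketched with a toy: a fixed random feature map stands in for a network "pre-trained on unrelated data" (an assumption for illustration only), and a new task is learned from just five labelled examples by fitting only a linear readout on top of those features.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for a "pre-trained brain": a fixed random feature map.
# (Assumption: a rich, reusable representation learned elsewhere.)
W = rng.normal(0, 1, (2, 64))
def features(x):
    return np.tanh(x @ W)

# A new task, seen with only 5 labelled examples: classify points by x0 > x1.
X_few = rng.normal(0, 1, (5, 2))
y_few = (X_few[:, 0] > X_few[:, 1]).astype(float)

# Few-shot learning: fit only a linear readout (closed-form ridge regression).
F = features(X_few)
w = np.linalg.solve(F.T @ F + 0.1 * np.eye(64), F.T @ y_few)

# Evaluate on fresh data.
X_test = rng.normal(0, 1, (200, 2))
acc = np.mean((features(X_test) @ w > 0.5) == (X_test[:, 0] > X_test[:, 1]))
```

The point of the sketch: the heavy lifting is in the representation, so very little task-specific data is needed - mirroring the claim that a brain trained on "unrelated" data cracks new tasks from a handful of examples.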
In fact, a trader simultaneously processes far more information, in more varied ways, than ML models do in relation to trading. Besides, the brain is armed with various knowledge that is unrelated to trading but helps solve trading tasks.