Machine learning in trading: theory, models, practice and algo-trading - page 3369

 
Aleksey Nikolayev #:

There is some theorem that states, roughly speaking, that any model predicting an object is always equivalent to some model describing the object. It turns out that we always build some model of the market, whether we want to or not.

Fair enough :)
 
Aleksey Nikolayev #:

There is some theorem that states, roughly speaking, that any model predicting an object is always equivalent to some model describing the object. It turns out that we always build some model of the market, whether we want to or not.

Well, that's obvious.

But I would correct it:

any model successfully predicting an object is always equivalent to some model describing the object.

 
mytarmailS #:

Well, that's obvious.

But I would correct it:

any model successfully predicting an object is always equivalent to some model describing the object.

This is true for planning by extrapolation, i.e. from the assumption that existing trends will continue.

However, it is completely untrue for investment-based models.

Take CRT (kinescope) TVs and plan based on the past. Everything works fine for many years. But such plans do not take into account that TVs based on different physical principles, such as LCD TVs, may appear.

In economics, this example works EVERYWHERE. The cyclical nature of recessions in a market economy is based on the overproduction of the old and the emergence of the new.

 
It's all mixed up: horses, people.
 
So? Why doesn't anyone write about their achievements? Or are we gonna keep choking? :)
Did anyone even finish ONNX?
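
Not from the thread, just an illustration of what "finishing ONNX" can look like on the Python side: a minimal sketch assuming a scikit-learn classifier and the skl2onnx package. The toy data, feature count and file name are placeholders; the resulting .onnx file is what the terminal side would then load.

# Hypothetical example: export a trained scikit-learn model to ONNX.
# Requires: pip install scikit-learn skl2onnx
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from skl2onnx import convert_sklearn
from skl2onnx.common.data_types import FloatTensorType

# Toy stand-in for real feature/label preparation (placeholder data).
X = np.random.rand(1000, 10).astype(np.float32)   # 10 features per bar
y = (X[:, 0] > 0.5).astype(int)                   # dummy target

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Declare the input signature: a batch of float32 vectors with 10 features.
onnx_model = convert_sklearn(
    model, initial_types=[("input", FloatTensorType([None, X.shape[1]]))]
)

with open("model.onnx", "wb") as f:
    f.write(onnx_model.SerializeToString())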
 
mytarmailS #:
It's all mixed up: horses, people.
Where's the article?
 
Maxim Dmitrievsky #:
Where's the article?
What's Maxim talking about?
 
mytarmailS #:
What's Maxim talking about?
About ML. A bot of some sort, maybe.
Give me an intelligent product.
There are already 70 articles about RL over there; everyone has already figured it out. We need other topics :)
 
СанСаныч Фоменко #:

Searching for patterns in history with the help of ML. If some effort is put into preprocessing, such patterns will give a future classification error of less than 20%.

There is only one pattern, a.k.a. model. Once you simplify any movement to ABC, you can find all the mathematical proportions of the pattern. Even without machine learning, all of this is easily calculated by hand. The market is no more complicated than algebra and geometry lessons at school.
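
Illustration only, not part of the post: a tiny sketch of the "calculated by hand" arithmetic, assuming the ABC shorthand means three swing points A, B, C. The prices are made up.

# Hypothetical ABC swing: A -> B is the impulse, B -> C is the retracement.
A, B, C = 1.0800, 1.0900, 1.0862   # made-up prices

impulse = B - A                    # size of the A->B move
retrace = B - C                    # size of the B->C pullback

# One "mathematical proportion" of the pattern: how much of A->B was retraced.
ratio = retrace / impulse
print(f"retracement ratio: {ratio:.3f}")   # ~0.380 for these numbers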

 
Bogard_11 #:

There is only one pattern, a.k.a. model. Once you simplify any movement to ABC, you can find all the mathematical proportions of the pattern. Even without machine learning, all of this is easily calculated by hand. The market is no more complicated than algebra and geometry lessons at school.

In tree-based models, of which there are many, a pattern is, for example, a tree. Each tree-based model, CatBoost for example, finds over a hundred trees, read: patterns. For RandomForest I have statistics: up to 50 trees the classification error keeps dropping, and above 150 trees the classification error is stable, i.e. in the time series I processed the number of patterns does not exceed 150.
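
A sketch of how such statistics can be gathered, not the poster's actual code: it sweeps the number of trees in a scikit-learn RandomForest and records the out-of-bag classification error. The synthetic data and the grid of tree counts are placeholders for real preprocessed price features.

# Sweep the forest size and watch where the classification error stops improving.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Placeholder data standing in for preprocessed price features.
X, y = make_classification(n_samples=3000, n_features=20,
                           n_informative=8, random_state=0)

for n_trees in (10, 25, 50, 100, 150, 200):
    rf = RandomForestClassifier(n_estimators=n_trees, oob_score=True,
                                bootstrap=True, n_jobs=-1, random_state=0)
    rf.fit(X, y)
    # oob_score_ is accuracy on out-of-bag samples; 1 minus it is the error.
    print(f"{n_trees:>3} trees: OOB error = {1 - rf.oob_score_:.3f}")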
