Machine learning in trading: theory, models, practice and algo-trading - page 3524

 
Maxim Dmitrievsky #:
Separate file for each cluster

Thanks.

I took one sample (cluster) and trained 100 models with different seeds: 10 trees of depth 6, learning rate 0.03.

This is the spread in accuracy - not critical, but noticeable.

In terms of the models' responses, the spread is much larger.

And I did not change any other settings, of which CatBoost has many.

As a result, I think that one model is not enough to assess the quality of a markup.
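
For reference, a minimal sketch of this seed-spread test in Python, assuming the catboost package; the random arrays stand in for the actual cluster sample, which is not shown here:

import numpy as np
from catboost import CatBoostClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 20))            # stand-in features for one cluster
y = (rng.random(2000) > 0.5).astype(int)   # stand-in 0/1 markup
X_train, X_val, y_train, y_val = X[:1500], X[1500:], y[:1500], y[1500:]

accuracies = []
for seed in range(100):
    model = CatBoostClassifier(iterations=10, depth=6, learning_rate=0.03,
                               random_seed=seed, verbose=False)
    model.fit(X_train, y_train)
    accuracies.append(model.score(X_val, y_val))   # mean accuracy on the holdout

print(f"accuracy over 100 seeds: mean={np.mean(accuracies):.4f} "
      f"std={np.std(accuracies):.4f} min={min(accuracies):.4f} max={max(accuracies):.4f}")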

 
Aleksey Vyazmikin #:

Thanks.

I took one sample (cluster) and trained 100 models with different seeds: 10 trees of depth 6, learning rate 0.03.

This is the spread in accuracy - not critical, but noticeable.

In terms of the models' responses, the spread is much larger.

And I did not change any other settings, of which CatBoost has many.

As a result, I think that one model is not enough to assess the quality of a markup.

Well, that's how the average is estimated.

Ah, it works on a different principle than in the CV articles. Nothing was filtered here.
 
Maxim Dmitrievsky #:

Well, that's how the average is estimated.

I understood that one (well, two) models are taken for each markup - am I misunderstanding?

But in terms of the balance, with that error there's this much variety in the outcomes.

The worst variant: balance and model.

The best variant: balance and model.

All on one sample; what will happen on future data, I do not know. The main thing is that there is diversity.

 
Maxim Dmitrievsky #:
Ah, it works on a different principle than in the CV articles. Nothing was filtered here.

Did you just scatter the ones (the class-1 labels) randomly?

 
Aleksey Vyazmikin #:

Did you just scatter the ones (the class-1 labels) randomly?

With a random number of forward prediction bars. Then whichever direction turns a profit is how the bar is marked, 0 or 1.

So forget it, you will not find anything in these datasets :) they work in conjunction with a second model, two models at once.
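
A hedged sketch of such random markup, assuming a plain close-price array; the exact rules and horizon range in the original code may differ:

import numpy as np

def random_markup(close, min_bars=1, max_bars=15, seed=0):
    # For each bar, draw a random forward horizon; label 1 if the price is
    # higher after that many bars (the buy direction is profitable), else 0.
    # Tail bars whose horizon runs past the series end get -1 (no label).
    rng = np.random.default_rng(seed)
    n = len(close)
    labels = np.full(n, -1)
    horizons = rng.integers(min_bars, max_bars + 1, size=n)
    for i in range(n):
        j = i + horizons[i]
        if j < n:
            labels[i] = 1 if close[j] > close[i] else 0
    return labels

close = 100.0 + np.cumsum(np.random.default_rng(1).normal(size=500))  # toy price series
y = random_markup(close)
print(np.unique(y[y >= 0], return_counts=True))   # counts of 0s and 1s

The second model mentioned above, which works in conjunction with this one, is not shown here.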

 
Maxim Dmitrievsky #:
That's a real treat, a real improvement. I touched on the entropy topic for a reason.

Well, I'm not asking for the secrets of your changes; I'll just note that since some version CatBoost has automatic balancing of the metric by class proportions. If you constantly change the markup, this should be critical for you, so I recommend switching that balancing (the class weights) off.
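
For reference, the relevant knob in recent CatBoost versions is the auto_class_weights parameter of CatBoostClassifier (documented values: 'Balanced', 'SqrtBalanced', or None, the default); a minimal sketch:

from catboost import CatBoostClassifier

# With balancing: class weights are derived from class proportions in the
# training data, so a changed markup silently changes the effective loss.
balanced = CatBoostClassifier(auto_class_weights="Balanced", verbose=False)

# Without balancing (the default): every object carries the same weight,
# which is what the advice above recommends when the markup keeps changing.
plain = CatBoostClassifier(auto_class_weights=None, verbose=False)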

 
Maxim Dmitrievsky #:

With a random number of forward prediction bars. Then whichever direction turns a profit is how the bar is marked, 0 or 1.

So forget it, you will not find anything in these datasets :) they work in conjunction with a second model, two models at once.

Yeah, he probably won't find anything that way.
But the persistence surprises me. So many years have passed since my last visit to this thread, and it's still about the same. I won't quote the classics.) Still, the man keeps working.
 
Yuriy Asaulenko #:
Yeah, he probably won't find anything that way.
But the persistence surprises me. So many years have passed since my last visit to this thread, and it's still about the same. I won't quote the classics.) Still, the man keeps working.
Everything has been working for a long time. It finds things. And you're still training a mashka (a moving average) after all these years... and bragging about a low error, which says nothing :). In general, I'm surprised by the urge to lecture in those who aren't familiar with the basics of ML.
 
Aleksey Vyazmikin #:

Well, I'm not asking for the secrets of your changes; I'll just note that since some version CatBoost has automatic balancing of the metric by class proportions. If you constantly change the markup, this should be critical for you, so I recommend switching that balancing (the class weights) off.

I don't bother with it at all. Let it work the way it works; it's not the most important thing. The classes always seem to be balanced anyway.
Default CatBoost works better than other classifiers, and that's enough for me. It's fast, too.
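
A small sketch of that kind of defaults-only baseline comparison; the comparison classifier and the synthetic data here are placeholders, not the poster's actual setup:

import numpy as np
from catboost import CatBoostClassifier
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(3000, 20))
y = (X[:, 0] + 0.5 * rng.normal(size=3000) > 0).astype(int)  # noisy synthetic target
X_tr, X_va, y_tr, y_va = X[:2000], X[2000:], y[:2000], y[2000:]

cat = CatBoostClassifier(verbose=False).fit(X_tr, y_tr)   # all defaults
rf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
print("CatBoost defaults:   ", cat.score(X_va, y_va))
print("RandomForest defaults:", rf.score(X_va, y_va))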
 
Maxim Dmitrievsky #:
Everything has been working for a long time. It finds things. And you're still training a mashka, after all these years....
Me, training a mashka? I don't remember that. Where did that come from?
With ML I'm doing just fine, but that's not what this thread is about.