Machine learning in trading: theory, models, practice and algo-trading - page 3310

 
fxsaber #:

I've started rejecting things like this - fully OOS (2023). In the second half, the character of the curve changes.

by eye or automated?
 
Andrey Dik #:
by eye or by some sort of automation?

By eye. Otherwise it's p-hacking.

 
fxsaber #:

By eye only. Otherwise it's p-hacking.

powerful ))))

 
fxsaber #:

I've started rejecting things like this - fully OOS (2023). In the second half, the character of the curve changes.

Why? Small profit per trade?
 
Forester #:
Why? Small profit per trade?

The character has changed. Roughly speaking, the curve has turned choppy.

I mean, some of the alpha is starting to turn into beta.

 
fxsaber #:

I've started rejecting things like this - fully OOS (2023). In the second half, the character of the curve changes.

You've got the spread set too low, hence the edge.
 
2saber: if the markups are near zero, I can knock out a grail on the tests. I can train on your history in about 5 minutes. I'll give you the model with the sources, so you can adjust whatever you need, for the sake of science. Accessing the model's signals is simple, and you can use your own order logic.
Then you can organise a hedge fund with a negative balance.
You can pass me the quotes through the standard terminal export.
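Since the exchange hinges on passing quotes through the terminal's standard CSV export and then training on that history, here is a minimal sketch of loading such an export in Python. The tab separator, the <DATE>/<TIME>/<OPEN>/... headers, and the file name are assumptions about a typical MetaTrader chart export and may differ between terminal builds.

```python
# Minimal sketch, assuming a tab-separated export with <DATE>, <TIME>, <OPEN>,
# <HIGH>, <LOW>, <CLOSE>, <TICKVOL> headers (typical for an MT5 chart export;
# other terminal builds/export dialogs may differ).
import pandas as pd

def load_exported_quotes(path: str) -> pd.DataFrame:
    df = pd.read_csv(path, sep="\t")
    df.columns = [c.strip("<>").lower() for c in df.columns]      # "<OPEN>" -> "open"
    df["datetime"] = pd.to_datetime(df["date"] + " " + df["time"])
    return df.set_index("datetime")[["open", "high", "low", "close", "tickvol"]]

# bars = load_exported_quotes("EURUSD_M5.csv")   # hypothetical file name
```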
 
Maxim Dmitrievsky #:
2saber: if the markups are near zero, I can knock out a grail on the tests. I can train on your history in about 5 minutes. I'll give you the model with the sources, so you can adjust whatever you need, for the sake of science. Accessing the model's signals is simple, and you can use your own order logic.
Then you can organise a hedge fund with a negative balance.
You can pass me the quotes through the standard terminal export.

I didn't understand all of it. Let's not do this on the forum.

 

Who has tried using the "Compactness Profile" method?

The purpose of the method is to exclude inconsistent examples from the sample, which should improve learning and reduce model size if k-nearest-neighbour-type learning methods are used.

I couldn't find an implementation in Python.
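For reference, a minimal sketch of the neighbour-consistency idea behind this kind of filtering, not the exact compactness-profile procedure from the article: drop examples whose k nearest neighbours mostly carry a different label (essentially Wilson/ENN editing). The function name, k, and the agreement threshold are illustrative.

```python
# Not the full "Compactness Profile" procedure, just a related neighbour-consistency
# filter: keep examples whose k nearest neighbours mostly agree with their label.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def drop_inconsistent_examples(X, y, k=5, min_agreement=0.6):
    """Return a boolean mask of examples whose k nearest neighbours agree with their label."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    # k + 1 neighbours because each query point is returned as its own nearest neighbour
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nn.kneighbors(X)
    neighbour_labels = y[idx[:, 1:]]                      # skip column 0 (the point itself)
    agreement = (neighbour_labels == y[:, None]).mean(axis=1)
    return agreement >= min_agreement

# usage on synthetic data
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 4))
    y = (X[:, 0] + 0.3 * rng.normal(size=500) > 0).astype(int)   # noisy labels
    mask = drop_inconsistent_examples(X, y, k=5)
    print(f"kept {mask.sum()} of {len(y)} examples")
```

The surviving mask can then be used to train the kNN (or any other) model on the reduced sample.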

 
Aleksey Vyazmikin #:

Who has tried using the "Compactness Profile" method?

The purpose of the method is to exclude inconsistent examples from the sample, which should improve learning and reduce model size if k-nearest-neighbour-type learning methods are used.

I couldn't find an implementation in Python.

The very link you posted talks about tying the "profile" to cross-validation, for which it might be easier to find packages.
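On the packages point: imbalanced-learn's EditedNearestNeighbours does a similar neighbour-based removal of inconsistent examples (not the article's exact procedure), and wrapping it in an imblearn Pipeline ties it to cross-validation so the filtering is applied only to the training folds. A rough sketch with placeholder data:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from imblearn.under_sampling import EditedNearestNeighbours
from imblearn.pipeline import Pipeline

# placeholder data; replace with your own features/labels
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 6))
y = (X[:, 0] + 0.5 * rng.normal(size=1000) > 0).astype(int)

knn = KNeighborsClassifier(n_neighbors=5)

# kNN cross-validated on the raw sample
print("raw sample:     ", cross_val_score(knn, X, y, cv=5).mean())

# the same kNN, but each training fold is first cleaned of examples whose
# neighbours disagree with their label (the sampler runs only at fit time)
pipe = Pipeline([
    ("filter", EditedNearestNeighbours(sampling_strategy="all", n_neighbors=3)),
    ("knn", KNeighborsClassifier(n_neighbors=5)),
])
print("filtered folds: ", cross_val_score(pipe, X, y, cv=5).mean())
```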