Machine learning in trading: theory, models, practice and algo-trading - page 2814

 
mytarmailS #:

I've thought, I've tried, I've experimented, I've written code...

So you think clustering doesn't work on new data? That's nonsense )

And, for that matter, HMM can be considered a clustering algorithm.
 
Maxim Dmitrievsky #:
So you think clustering doesn't work on new data? That's nonsense )

you don't get it...

There is a cluster/state.

It has a beginning, it has an end, let's say 5 candles.


Through clustering you will know the number of this cluster only at the 5th candle (when the cluster prototype is compared to the current state).

Through HMM you will know the number of the cluster/state already at the 1st candle (or rather, its probability).
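
A minimal sketch of the clustering side of this, assuming Python with scikit-learn's KMeans (the 5-candle window follows the example above; the data and the number of clusters are purely illustrative); the HMM side is sketched further down the thread:

import numpy as np
from sklearn.cluster import KMeans

np.random.seed(0)
closes = np.cumsum(np.random.randn(500))        # toy price series
window = 5                                      # "a cluster lasts 5 candles"

# Overlapping 5-candle windows; a window is complete only at its last
# candle, so its cluster label is known only at the 5th candle.
X = np.array([closes[i:i + window] for i in range(len(closes) - window + 1)])
X = X - X.mean(axis=1, keepdims=True)           # remove the level, keep the shape

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)

# Label of the pattern that ENDS at the current candle:
current = (closes[-window:] - closes[-window:].mean()).reshape(1, -1)
print("cluster known at the 5th candle:", km.predict(current)[0])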

 
mytarmailS #:

you don't get it.

there's a cluster/state

It has a beginning, it has an end, let's say 5 candles.


Through clustering you will know the number of this cluster only at the 5th candle (when the cluster prototype is compared to the current state).

Through HMM you will know the number of the cluster/state already at the 1st candle (or rather, its probability).

You will know the cluster number on the current data in both cases, without lag.

HMM is the same kind of clustering algorithm, just for sequences. Nothing outstanding.
 
Maxim Dmitrievsky #:
You'll know the cluster number on the current data in both cases, without any lag

No.

Clustering is recognition.

HMM predicts what state you are in right now.


Let's say we have two states: head-and-shoulders and not head-and-shoulders.


That's how clustering works: by comparing the current data to a prototype cluster.


I.e. we learn which cluster/state we are in only after the fact, once the comparison with the prototype has taken place.

=================

And HMM will give us the probability that we are currently in the head-and-shoulders state.
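
A minimal sketch of that HMM view, assuming Python with hmmlearn's GaussianHMM (two states stand in for head-and-shoulders / not head-and-shoulders; the data and parameters are purely illustrative):

import numpy as np
from hmmlearn.hmm import GaussianHMM

np.random.seed(0)
returns = 0.01 * np.random.randn(500, 1)        # toy 1-D observations

hmm = GaussianHMM(n_components=2, covariance_type="full",
                  n_iter=100, random_state=0).fit(returns)

# Posterior probability of each hidden state per candle; the last row is
# the probability of the state we are in at the most recent candle.
posteriors = hmm.predict_proba(returns)
print("P(state | data) at the current candle:", posteriors[-1])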


 
mytarmailS #:

No.

Clustering is recognition.

HMM predicts what state you are in right now.

Knock it off.)
 
Maxim Dmitrievsky #:
Knock it off )

You too...

If the Viterbi algorithm from HMM can output something that looks like clustering, e.g. 111222444111111.....

and someone wrote that it can be used as clustering, that doesn't mean it is clustering.

 
mytarmailS #:

You too.

if the Viterbi algorithm from HMM can output something that looks like clustering, e.g. 111222444111111....

and someone wrote that it can be used as clustering, that doesn't mean it is clustering.

You get as many clusters as the number of hidden states you set. It's the same thing. Okay, okay, I don't want to keep chewing this over.

What matters to you is the separation for training different models, whatever principle it is based on. In any case, the average of the increments will affect the cluster number, and that is what you need.
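
A minimal sketch of the hidden-states-as-clusters point, assuming Python with hmmlearn: Viterbi decoding returns one label per candle, and the number of distinct labels is at most the n_components you set (everything here is illustrative):

import numpy as np
from hmmlearn.hmm import GaussianHMM

np.random.seed(1)
returns = 0.01 * np.random.randn(500, 1)        # toy 1-D observations

n_states = 3                                    # "as many clusters as you set"
hmm = GaussianHMM(n_components=n_states, covariance_type="full",
                  n_iter=100, random_state=1).fit(returns)

states = hmm.predict(returns)                   # Viterbi path, e.g. 111222000...
print("label sequence:", "".join(map(str, states[:30])))
print("distinct labels:", np.unique(states))    # at most n_states of them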
 
Maxim Dmitrievsky #:
You get as many clusters as the number of hidden states you set. It's the same thing.

What does that have to do with it?

For the last time.

With clustering, you get the cluster number when the cluster ends.

With HMM, you get the cluster number at the beginning of the cluster.

 
mytarmailS #:

What's that got to do with it?

For the last time.

With clustering, you get the cluster number when the cluster ends.

With HMM, you get the cluster number at the beginning of the cluster.

No, you're delusional.
 
Maxim Dmitrievsky #:
No, you're delusional.

Okay.
