Machine learning in trading: theory, models, practice and algo-trading - page 2261

 
Maxim Dmitrievsky:

Complex models are a huge advantage, but getting to them takes a couple of years of your life, which in my case have already been successfully spent.

Biodiversity is our everything.

 
fxsaber:

Biodiversity is our everything.

A simple example: thousands of optimization runs with different parameters vs. one training pass that covers the whole feature space. Then the same visualization and analysis, and pruning of the superfluous.

Well, everyone does it however they like.
 
fxsaber:

I doubt that any ML model could reverse-engineer a TS like this: it looks for the segments of history most similar to the current one. If those segments show a statistical preponderance of further movement in some direction, that is the direction we signal.

If, for simplicity, the search for similar segments is performed not on the price series itself but on a transformed one (for example, ticks or bars replaced by binary logic: up (0) / down (1)), then the reverse-engineering task for ML becomes quite complex.
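The binary encoding and similarity search described above can be sketched in a few lines. This is a toy illustration, not fxsaber's actual TS; here up is encoded as 1 and down as 0 (the opposite convention works identically), and similarity is simply the count of matching bits:

```python
import numpy as np

def binarize(prices):
    """Encode each bar as up (1) or down (0) versus the previous close."""
    return (np.diff(prices) > 0).astype(np.uint8)

def rank_similar(history_bits, pattern_bits):
    """Rank every window of the history by how many bits match the
    current pattern, best match first."""
    n = len(pattern_bits)
    scores = [int(np.sum(history_bits[i:i + n] == pattern_bits))
              for i in range(len(history_bits) - n + 1)]
    return np.argsort(scores)[::-1]  # indices of windows, best first

prices = np.array([1.10, 1.11, 1.09, 1.12, 1.13, 1.12, 1.14, 1.13, 1.15])
bits = binarize(prices)                      # [1 0 1 1 0 1 0 1]
ranked = rank_similar(bits[:-3], bits[-3:])  # compare last 3 bits to history
```

The signal step would then check whether the bars following the top-ranked windows moved predominantly in one direction.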

I missed that... it would be interesting to try this on example trades.

There are two points.

  • Any price transformation (zigzag and the like) is an analytical approach, and ML does not really care about it. You can feed in any features in any quantity and then filter out the uninformative ones to simplify the model. All the zigzags and so on are just a different representation of the same data.
  • ML generalizes, so if you give it examples of where to trade and where not to, it may work.
The funny thing is that you think you've found some kind of pattern through the zigzag, but most likely it's some veiled seasonal dependence that can be described in a thousand and one ways. Or some other dependence, if it's ticks.
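Filtering out uninformative features, as mentioned in the first point, can be as crude as dropping near-constant columns and near-duplicates of features already kept. A minimal sketch; the thresholds `var_eps` and `corr_max` are arbitrary choices, not anything from the thread:

```python
import numpy as np

def drop_uninformative(X, var_eps=1e-8, corr_max=0.95):
    """Return indices of columns of X worth keeping: discard near-constant
    columns, then columns almost perfectly correlated with a kept one."""
    keep = [j for j in range(X.shape[1]) if np.var(X[:, j]) > var_eps]
    selected = []
    for j in keep:
        duplicate = any(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) > corr_max
                        for k in selected)
        if not duplicate:
            selected.append(j)
    return selected

rng = np.random.default_rng(0)
base = rng.normal(size=(200, 1))
X = np.hstack([base,                         # informative
               base * 2 + 1e-6,              # rescaled duplicate of column 0
               np.ones((200, 1)),            # constant, carries nothing
               rng.normal(size=(200, 1))])   # independent feature
cols = drop_uninformative(X)                 # -> [0, 3]
```

This is exactly the "zigzag is just another representation" point: a zigzag column derived from price would be flagged as redundant with the features it was computed from.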

It may even turn out that ML beats your TS on your own data.

But I'm not ready to deal with ticks until I upgrade my laptop.

 
Maxim Dmitrievsky:

It may even turn out that ML beats your TS on your own data.

I don't doubt that. But ML will not reproduce a TS that is based on searching for similar situations in the past.

 
fxsaber:

I don't doubt that. But ML will not reproduce a TS that is based on searching for similar situations in the past.

Why is it so unique? You train on history; ML will pull out the same dependencies through other features.

You found the cats and dogs by some kind of brute-force search, but they have other features too. For example, cats have long whiskers.

The NN will learn to distinguish them by whiskers instead of ears... what changes?

Well, that varies case by case. Theoretically there is no problem.

And here is the boxplot example from the article: I found statistical patterns much like the case you describe. Then I trained an NN on random features to trade the seasonal patterns, and it did better. Just for perspective.

 
Maxim Dmitrievsky:

Why is it so unique?

Because there will be no comparative similarity characteristic among ML's features. You cannot prepare such data for training unless you know in advance what the TS is based on.
 
fxsaber:
There will be no comparative similarity characteristic among ML's features. You cannot prepare such data for training if you don't know beforehand what the TS is based on.

It may not work. But when the data is already prepared, i.e. a regularity exists, then in some Hilbert space the points of the classes (say, buy and sell) are well separable; it cannot be otherwise. ML will pick up (or try to pick up) features that correspond to them. There is a certain magic in this: knowing the right features matters less than labeling the data correctly, i.e. telling the cats from the dogs.
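The "separable in some Hilbert space" claim is the standard kernel/feature-map idea. A textbook toy illustration (not trading data): points that no single threshold separates on the line become linearly separable after lifting them with phi(x) = (x, x²):

```python
import numpy as np

# 1-D points: one class sits near zero, the other far from it;
# no single threshold on x separates them.
x = np.array([-3.0, -2.0, -0.5, 0.0, 0.5, 2.0, 3.0])
y = np.array([1, 1, 0, 0, 0, 1, 1])   # e.g. 1 = sell, 0 = buy

# Lift into a 2-D feature space: phi(x) = (x, x^2).
phi = np.column_stack([x, x ** 2])

# In the lifted space the flat threshold x^2 > 1 separates the classes.
pred = (phi[:, 1] > 1).astype(int)    # matches y exactly
```

The same logic is why the exact feature set matters less than the labels: given a correct buy/sell markup, the learner searches for a space in which those labels separate.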

If there were trades with their timestamps, we could check.

 
Maxim Dmitrievsky:

If there were trades with their timestamps, we could check.

The example was hypothetical.

 
fxsaber:

The example was hypothetical.

Hypothetically, there is no problem. The "similarity" will be pulled out through other features, because the time series is the same. In practice there may be difficulties, such as clumsy hands )

You have a set of similar patterns that generalize well. You generalized through correlation; the model will generalize through a sliding window over history. Similar entities will be grouped and labeled buy/sell/don't trade.

Inside the model, similar clusters would look like this, only in multidimensional space. Each cluster gets its own buy/sell label. It's a very simple task; it's just generalization.
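The grouping described here (cluster similar windows, give each cluster a trade label) can be sketched in a few lines. The 2-bar windows, labels, and centroids below are toy values invented for the illustration:

```python
import numpy as np

def label_clusters(windows, labels, centroids):
    """Assign each window to its nearest centroid, then give every cluster
    the majority trade label (0 = sell, 1 = buy) of its member windows."""
    # Distance from every window to every centroid
    d = np.linalg.norm(windows[:, None, :] - centroids[None, :, :], axis=2)
    assign = d.argmin(axis=1)
    cluster_label = {c: int(np.round(labels[assign == c].mean()))
                     for c in np.unique(assign)}
    return assign, cluster_label

# Toy example: two obvious groups of 2-bar patterns.
windows = np.array([[0.0, 0.1], [0.1, 0.0], [1.0, 1.1], [1.1, 1.0]])
labels = np.array([0, 0, 1, 1])            # sell, sell, buy, buy
centroids = np.array([[0.0, 0.0], [1.0, 1.0]])
assign, cl = label_clusters(windows, labels, centroids)
# assign -> [0, 0, 1, 1]; cl -> {0: 0, 1: 1}
```

A real pipeline would learn the centroids (k-means or a mixture model) over sliding windows of the price series instead of hard-coding them.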

 
Maxim Dmitrievsky:
If there are any specialists in generative models here: you could try perturbing the covariance matrix of a GMM. That is, leave the mean and variance of the series unchanged, but vary the GMM covariance matrix. The output should be many examples with different properties.

What do you mean?

Just perturb the covariance matrix; make it random...

You need to know the purpose: what to perturb it for, and what the final result should be?
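One concrete way to do what Maxim suggests, assuming scikit-learn's GaussianMixture: fit the model, rescale each component's fitted covariance by a random positive factor (which preserves positive semi-definiteness and leaves the fitted means untouched), then sample synthetic data from the perturbed model. The Gaussian stand-in for a return series is an assumption for the sketch:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(42)
returns = rng.normal(0.0, 0.01, size=(1000, 2))  # stand-in for return series

gmm = GaussianMixture(n_components=2, covariance_type="full",
                      random_state=0).fit(returns)

# "Perturb" only the covariances: rescale each component's covariance
# matrix by a random positive factor; the fitted means stay unchanged.
for k in range(gmm.n_components):
    gmm.covariances_[k] *= rng.uniform(0.5, 2.0)

# GaussianMixture.sample() draws directly from means_ / covariances_,
# so the perturbed model generates series with altered co-movement.
synthetic, _ = gmm.sample(500)
```

Each rerun with a different perturbation yields a new synthetic dataset with the same location but different covariance structure, which answers the "what should the final result be" question: a family of stress-test series for the TS.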