Machine learning in trading: theory, models, practice and algo-trading - page 1761
A thought occurred: an archiver can be used as a test for randomness, since truly random data will not be compressed by the archiver.
We need to test quotes at different sizes, with several samples. Try decomposing into components and see whether compression improves.
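A minimal sketch of this test, assuming zlib as the archiver: quantize a series, compress it, and compare the compressed-to-raw size ratio against random and structured benchmarks. The series and parameters below are illustrative, not the thread's actual data.

```python
import zlib
import numpy as np

def compression_ratio(x: np.ndarray) -> float:
    """Compressed/raw size of the series quantized to 16-bit integers.
    A ratio near 1.0 suggests the series is close to incompressible (random)."""
    q = np.interp(x, (x.min(), x.max()), (0, 65535)).astype(np.uint16)
    raw = q.tobytes()
    return len(zlib.compress(raw, 9)) / len(raw)

rng = np.random.default_rng(42)
noise = rng.standard_normal(10_000)                  # pure randomness
sine = np.sin(np.linspace(0, 60 * np.pi, 10_000))    # strong structure
walk = np.cumsum(rng.standard_normal(10_000))        # random-walk "price"

for name, s in [("noise", noise), ("sine", sine), ("random walk", walk)]:
    print(f"{name:12s} ratio = {compression_ratio(s):.3f}")
```

Quotes would be loaded the same way (e.g. a column of closes from a csv) and compared against the noise and sine reference ratios.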
Rorschach: flac compresses worse, although it seems to be better suited for that.
Judging by the compressed-size ratio, price looks like a fractal.
A good start. If price is similar to a fractal, there are variations where the patterns are definitely there)))) The same goes for Pi and randomness))))
One thing I don't understand: why is the compression ratio the highest for the random series and for Pi?
It's the opposite, they have the lowest compression ratio.
Got it, the scale confused me. The only thing left is to pull out the patterns, decompress them, and see the difference from the original.
I have already encountered this idea. Since links to third-party resources are not allowed, I will paste a picture.
An interesting idea: represent images as a wave, or rather 50 waves.
Each wave can be fed to the neural network's input.
Just a thought.
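One hedged reading of this idea, as a sketch only: decompose a series (a chart flattened to one dimension) into its 50 strongest Fourier harmonics and feed each reconstructed wave to the network as a separate input channel. The function name and parameters are illustrative assumptions.

```python
import numpy as np

def to_waves(series: np.ndarray, n_waves: int = 50) -> np.ndarray:
    """Return an (n_waves, len(series)) array, one sinusoid per row."""
    spectrum = np.fft.rfft(series - series.mean())
    # indices of the n_waves strongest harmonics (skip the DC term at 0)
    top = np.argsort(np.abs(spectrum[1:]))[::-1][:n_waves] + 1
    waves = np.empty((n_waves, len(series)))
    for row, k in enumerate(top):
        single = np.zeros_like(spectrum)
        single[k] = spectrum[k]          # keep exactly one harmonic
        waves[row] = np.fft.irfft(single, n=len(series))
    return waves

prices = np.cumsum(np.random.default_rng(1).standard_normal(1024))
x = to_waves(prices)   # shape (50, 1024), ready as NN input channels
print(x.shape)
```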
Funny, but pointless )
I have experimental "convolution" predictors; the principle is to break the chart into a grid and report the accumulation of bars in the cells, essentially the contrast. And, surprisingly, the patterns work in some tree leaves. I haven't developed this idea further yet; instead I focused on the data structure and the ability to aggregate. My point is that the principle of data compression is essentially similar here.
By the way, the proposal to experiment with an open dataset is interesting. But how do you imagine that working without an open-source EA that does the markup and saves the predictors?
I did something similar, but in the form of distributions, or in more trader-like terms, I built something like volume profiles instead of charts. It's an interesting thing... worth digging into: plus a representation of levels, plus it is easy to algorithmize.
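A minimal sketch of such a distribution feature, assuming bar data with 'close' and 'volume' columns (the column names and bin count are assumptions): the volume traded at each price level replaces the chart itself.

```python
import numpy as np
import pandas as pd

def volume_profile(df: pd.DataFrame, n_levels: int = 20) -> np.ndarray:
    """Histogram of volume by price level, normalized to sum to 1."""
    lo, hi = df["close"].min(), df["close"].max()
    bins = np.linspace(lo, hi, n_levels + 1)
    profile, _ = np.histogram(df["close"], bins=bins, weights=df["volume"])
    return profile / profile.sum()

# toy usage on synthetic bars
rng = np.random.default_rng(7)
bars = pd.DataFrame({
    "close": np.cumsum(rng.standard_normal(500)) + 100,
    "volume": rng.integers(1, 1000, 500),
})
print(volume_profile(bars))   # 20 numbers: a "profile" instead of a chart
```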
I imagine it all as a txt or csv file, where the columns are features/predictors/attributes and the last column is the target.
We agree up front on the target, what it should be and with what parameters, and go ahead...
Each participant opens the dataset in his own software and tries to reduce the classification error. If he succeeds, he adds the features he has generated to the dataset and returns the improved dataset to the public, and so on; a human genetic algorithm of feature generation and selection emerges.
Once an acceptable error is reached, we can think about porting everything to MQL code.
That's how I see it
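A sketch of that loop in Python, assuming the shared file is a hypothetical dataset.csv with the target in the last column and a random forest as the common scoring model (both are assumptions, not an agreed protocol):

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

df = pd.read_csv("dataset.csv")          # hypothetical shared file
X, y = df.iloc[:, :-1], df.iloc[:, -1]   # features, last column = target

model = RandomForestClassifier(n_estimators=200, random_state=0)
acc = cross_val_score(model, X, y, cv=5).mean()
print(f"baseline accuracy: {acc:.3f}")

# A participant who engineers a new feature appends it as a column,
# re-runs the same score, and publishes the file only if accuracy improves.
```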
I think acceptable accuracy is 95%+; so far the maximum is around 77-83%.
Also, to avoid generating "garbage features" you can set limits, e.g. a feature must improve the error by at least 1%, otherwise someone would throw in 15,000 features and say "here, I improved the error by 2%)) I'm a hero))".
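The acceptance rule could look like this sketch; the scoring model and the 1% threshold mirror the suggestion above, and accept_feature is a hypothetical name:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def accept_feature(X, y, candidate: np.ndarray, min_gain: float = 0.01) -> bool:
    """Accept a candidate column only if cross-validated accuracy
    improves by at least min_gain over the current dataset."""
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    base = cross_val_score(model, X, y, cv=5).mean()
    X_new = np.column_stack([X, candidate])
    new = cross_val_score(model, X_new, y, cv=5).mean()
    return (new - base) >= min_gain
```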
I have 16 cells in a 4x4 grid with a dynamic window, and I count how many bars closed in each cell.
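A hedged reconstruction of that counting scheme: split the window's price range into 4 levels and the window itself into 4 time slices, then count the closes that fall into each of the 16 cells. The window here is fixed-length for simplicity; a dynamic window would just change the slice passed in.

```python
import numpy as np

def grid_features(closes: np.ndarray, grid: int = 4) -> np.ndarray:
    """Return a flat vector of grid*grid counts for one window of closes."""
    n = len(closes)
    lo, hi = closes.min(), closes.max()
    # row = price level of each close, col = which quarter of the window
    rows = np.clip(((closes - lo) / (hi - lo + 1e-12) * grid).astype(int),
                   0, grid - 1)
    cols = np.arange(n) * grid // n
    counts = np.zeros((grid, grid), dtype=int)
    np.add.at(counts, (rows, cols), 1)
    return counts.ravel()

window = np.cumsum(np.random.default_rng(3).standard_normal(64)) + 100
print(grid_features(window))   # 16 numbers, one per cell
```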
Otherwise it is not clear how to obtain new predictors, i.e., from what data.
Then we can add many targets; that will make it possible to identify the predictive power of the predictors, because a good predictor will be successful on different targets.
The instrument should be the same for everyone: MOEX futures, or better the spliced (continuous) contract. I like Si.
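A sketch of that multi-target check, scoring every feature against several targets with mutual information; the target column names are illustrative assumptions:

```python
import pandas as pd
from sklearn.feature_selection import mutual_info_classif

def multi_target_power(X: pd.DataFrame, targets: pd.DataFrame) -> pd.DataFrame:
    """Mutual information of each feature with each target,
    one column per target."""
    scores = {t: mutual_info_classif(X, targets[t], random_state=0)
              for t in targets.columns}
    return pd.DataFrame(scores, index=X.columns)

# usage (hypothetical targets):
# power = multi_target_power(X, df[["target_updown", "target_breakout"]])
# power.mean(axis=1).sort_values(ascending=False)  # robust predictors first
```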