Preprocessing, i.e. what is fed into the system, is likely to be the key here. IMHO, it is a cornerstone of adaptive systems. The input values themselves should characterise stable market phases, and the synthetics should be generated from those inputs. Roughly speaking: generate them, then vary their distribution (i.e. vary the values of the adaptive system's input parameters).
Er, you don't need to think yet about what is fed to the ATS input. First you need the original source of input data: the same OHLC, but synthetic.
After all, the question was addressed to those who understand statistics, as I don't know the subject well enough.
When I implement it in working code, I'll put it in the codebase. Whoever is interested will use it.
If you have anything to say on the topic I suggested, Ilya, I'd be happy to hear.
Oh, there is, and a lot! What is the best way to approach the problem of adaptability? Construct a statistically similar series from the statistical parameters of the general sample. That gives a price model with characteristics similar to those of the general sample, but with new data, and in any quantity. The novelty of its movement will still correspond to the properties of the general sample. A neural network or an adaptive Expert Advisor will not be able to adapt to the specific data, because the data is constantly changing. But that data contains statistical laws, and it is those laws that the network or adaptive Expert Advisor will target and generalise (or try to generalise; the network still needs thought). That's the end of the first part. :)
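The "statistically similar series" idea can be sketched, for example, by bootstrap-resampling the sample's returns. This is a minimal illustration under an i.i.d.-returns assumption of mine: it reproduces the return distribution of the general sample but deliberately ignores autocorrelation and volatility clustering.

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for the "general sample": a random-walk close-price series.
real_close = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 1000)))

# Statistical characteristics of the sample: its empirical log-returns.
log_returns = np.diff(np.log(real_close))

# Bootstrap a new series of any length by resampling those returns.
# The synthetic data is new, but its distribution matches the sample's.
def synthetic_series(returns, length, start_price):
    sampled = rng.choice(returns, size=length, replace=True)
    return start_price * np.exp(np.cumsum(sampled))

synth = synthetic_series(log_returns, 5000, real_close[-1])

# Spread of synthetic returns should roughly match the original sample.
synth_returns = np.diff(np.log(np.concatenate([[real_close[-1]], synth])))
print(abs(synth_returns.std() - log_returns.std()) < 0.005)
```

Block bootstrap (resampling runs of consecutive returns) would preserve some serial dependence as well, at the cost of less "new" data.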
The question was not about training. The question was about creating a synthetic time series with the specified statistical parameters.
The algorithm is roughly as follows:
1. Decide on a group of parameters of the general population. Usually about 5-10; sociologists use 100-150.
2. Construct a probability density for each feature or combination of features.
3. Start modelling data with the specified distributions. Each combination is checked for consistency with all the parameters, and corrections are made to the generation algorithm.
4. The data is used for training Expert Advisors.
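The four steps above can be sketched roughly as follows. The choice of parameters (just the mean and standard deviation of returns here) and the normal density are my own assumptions for illustration; a real series would need more parameters and a heavier-tailed density.

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1: choose a small group of parameters of the general population.
# Here just two (of the "5-10" mentioned): mean and std of returns.
# The sample below is a stand-in for real data.
population = rng.normal(0.0002, 0.012, 10_000)
target = {"mean": population.mean(), "std": population.std()}

# Step 2: assume a probability density for the feature. A normal
# density is the simplest assumption; real returns are heavier-tailed.
def generate(n, mean, std):
    return rng.normal(mean, std, n)

# Step 3: model data with the specified distribution, then check the
# result against all target parameters and correct the generator's
# output (here: re-centre and re-scale to match the targets exactly).
synthetic = generate(5000, target["mean"], target["std"])
synthetic = (synthetic - synthetic.mean()) / synthetic.std()
synthetic = synthetic * target["std"] + target["mean"]

# Step 4: 'synthetic' is what would then be fed to the Expert Advisor
# for training.
print(synthetic.mean(), synthetic.std())
```

With combinations of features (step 2's "combination of features"), the same check-and-correct loop applies, but against joint statistics such as correlations rather than per-feature moments.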
That's great! Now please elaborate on points 1, 2 and 3 (4 can wait).
What kind of tricks, if it's not a secret? :)))
I would use (real signal + artificial noise) to test the trading system for stability.
And I don't see much practical sense in a purely artificial series. Yes, I understand the idea of debugging the trading system's algorithm on conditions modelled by a specialist, but I'm not sure it would be adequate. Besides, the required region can always be found in real quotes, and more than one.
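The (real signal + artificial noise) stability test might look like the sketch below. The moving-average crossover "trading system" and the multiplicative noise model are hypothetical stand-ins of mine; the point is only the procedure: perturb the real series many times and look at the spread of results.

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-in for the "real signal": a close-price series.
real_close = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 2000)))

# A toy trading system: moving-average crossover, returning total
# log-return of following its long/short signal.
def strategy_return(close, fast=10, slow=50):
    fast_ma = np.convolve(close, np.ones(fast) / fast, mode="valid")
    slow_ma = np.convolve(close, np.ones(slow) / slow, mode="valid")
    n = min(len(fast_ma), len(slow_ma))
    signal = np.sign(fast_ma[-n:] - slow_ma[-n:])[:-1]
    rets = np.diff(np.log(close))[-(n - 1):]
    return float(np.sum(signal * rets))

# Stability test: add artificial noise to the real series many times
# and examine the spread of the system's result. A stable system gives
# a narrow spread; a curve-fitted one falls apart.
noise_scale = 0.002
results = []
for _ in range(50):
    noisy = real_close * np.exp(rng.normal(0, noise_scale, len(real_close)))
    results.append(strategy_return(noisy))

print(np.mean(results), np.std(results))
```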
What kind of tricks, if it's not a secret? :)))
Well, it's really quite simple. I'll tell you the methods, but I'm sure you know them too.
1. Early stopping.
2. Cross-validation.
3. Weight decay (reducing weights).
4. Weight elimination (pruning).
5. Smoothing the approximation.