Machine learning in trading: theory, models, practice and algo-trading - page 1496
Roughly speaking, yes. Here is the article from which the basis of everything was taken: http://gekkoquant.com/2014/09/07/hidden-markov-models-examples-in-r-part-3-of-4/
And here is a code snippet from the example in the article.
If you strip away the data processing and visualization, then yes, the code is three lines long.
Well, here is the question: where do bullmarket and bearmarket, for example, come from? I.e., we need some data preprocessing first.
And then comes the calculation of the path via Viterbi and forward-backward, which are also 2 different algorithms. I don't get it yet; I'll read up.
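The two algorithms do different jobs: forward-backward gives per-step state probabilities, while Viterbi recovers the single most likely state path. A minimal numpy sketch of Viterbi for a discrete-emission HMM (the `pi`, `A`, `B` values here are illustrative, not fitted to any market data):

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state path for a discrete-emission HMM.

    obs : sequence of observation indices
    pi  : initial state probabilities, shape (n_states,)
    A   : transition matrix, shape (n_states, n_states)
    B   : emission matrix, shape (n_states, n_symbols)
    """
    T, n_states = len(obs), len(pi)
    logd = np.empty((T, n_states))          # best log-prob ending in each state
    back = np.zeros((T, n_states), dtype=int)
    logd[0] = np.log(pi) + np.log(B[:, obs[0]])
    for t in range(1, T):
        # scores[i, j] = best path through state i at t-1, moving to state j
        scores = logd[t - 1][:, None] + np.log(A)
        back[t] = scores.argmax(axis=0)
        logd[t] = scores.max(axis=0) + np.log(B[:, obs[t]])
    # backtrack from the best final state
    path = np.empty(T, dtype=int)
    path[-1] = logd[-1].argmax()
    for t in range(T - 2, -1, -1):
        path[t] = back[t + 1, path[t + 1]]
    return path
```

On the classic two-state example (healthy/fever with normal/cold/dizzy observations), `viterbi([0, 1, 2], ...)` recovers the well-known path `[0, 0, 1]`.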
In Python it looks like this, using the hmmlearn library.
I'm currently working on Evolving Neural Networks through Augmenting Topologies, following the article http://gekkoquant.com/2016/10/23/evolving-neural-networks-through-augmenting-topologies-part-4-of-4-trading-strategy.
I didn't manage to install the RNeat package remotely via devtools, so I used an alternative, remotes (remotes::install_github). The script for MT4 is almost ready. I excluded the complex preprocessing transformations; I will try raw data first. I have added the ability to work with any number of predictors. I will write to you if anything interesting turns up.
I am attaching an example R script for forex data. The analyzed symbol is USDJPY-H1. The input data are the last known price and 10 RSI lags. I would very much like to see how RNeat performs on my indicator.
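For reference, the same feature set (last known price plus 10 RSI lags) can be sketched in Python. This is a hypothetical re-implementation for illustration, not the attached R script; the 14-bar Wilder RSI period is an assumption:

```python
import numpy as np

def rsi(close, period=14):
    """Wilder's RSI, aligned with close; the warm-up region is NaN."""
    deltas = np.diff(close)
    gains = np.where(deltas > 0, deltas, 0.0)
    losses = np.where(deltas < 0, -deltas, 0.0)
    out = np.full(len(close), np.nan)
    avg_gain = gains[:period].mean()
    avg_loss = losses[:period].mean()
    for i in range(period, len(deltas)):
        # Wilder's exponential smoothing of average gain/loss
        avg_gain = (avg_gain * (period - 1) + gains[i]) / period
        avg_loss = (avg_loss * (period - 1) + losses[i]) / period
        rs = avg_gain / avg_loss if avg_loss > 0 else np.inf
        out[i + 1] = 100.0 - 100.0 / (1.0 + rs)
    return out

def make_features(close, n_lags=10, period=14):
    """Feature matrix: [last known price, RSI(t), RSI(t-1), ..., RSI(t-n_lags+1)]."""
    r = rsi(close, period)
    start = period + n_lags            # first bar with a complete lag window
    rows = [[close[t]] + [r[t - k] for k in range(n_lags)]
            for t in range(start, len(close))]
    return np.array(rows)
```

Each row of `make_features` would then be one training instance for the network; RSI values stay bounded in [0, 100], so the lags arrive roughly pre-scaled.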
Using well-known indicators, especially with a fixed period, is a rotten idea: no algorithm will find any patterns there, because they simply do not exist. The market has a dynamic, fractal (mutually nested) structure; we need an indicator that is at least somewhat adequate to the market, one that at least indirectly takes that fractal structure into account.
I agree. I got good results with the ZigZag indicator. I feed it the prices of the last extrema, or derivatives thereof, including the unconfirmed price of the last extremum. The indicator is recalculated for each instance from the training set, so we get a variant without repainting. This is the only indicator that has shown more or less satisfactory results that can be traded.
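The extrema-extraction step can be sketched with a minimal percent-reversal zigzag in Python. Note this is a simplified illustration, not MT4's ZigZag (which uses ExtDepth/ExtDeviation/ExtBackstep parameters), and the 2% reversal threshold is an assumption:

```python
def zigzag(prices, pct=0.02):
    """Indices of confirmed swing highs/lows.

    A swing is confirmed once price retraces more than `pct` from the
    running extreme; requires at least 2 prices.
    """
    pivots = []
    trend = 1 if prices[1] >= prices[0] else -1   # initial direction guess
    ext_idx, ext_price = 0, prices[0]
    for i in range(1, len(prices)):
        p = prices[i]
        if trend == 1:
            if p >= ext_price:                    # new running high
                ext_idx, ext_price = i, p
            elif p <= ext_price * (1 - pct):      # reversal confirms the high
                pivots.append(ext_idx)
                trend = -1
                ext_idx, ext_price = i, p
        else:
            if p <= ext_price:                    # new running low
                ext_idx, ext_price = i, p
            elif p >= ext_price * (1 + pct):      # reversal confirms the low
                pivots.append(ext_idx)
                trend = 1
                ext_idx, ext_price = i, p
    return pivots
```

`prices[i]` at the returned indices gives the alternating extrema prices that could then be fed (or differenced) as predictors; the still-forming extreme (`ext_price`) corresponds to the unconfirmed last extremum mentioned above.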
If I understood correctly, I did almost the same thing, but with a different algorithm: I predicted not the direction but the knee (leg) of the reversal. I wrote here how I processed the price: https://www.mql5.com/ru/forum/86386/page1476
Even when you feed the target as an input, it does not give a perfect result.
It's not quite clear... Can you be more specific?
P.S. And what was the target? I looked at the code, but I don't understand: maximizing profit?
If I understood correctly, I did almost the same thing, only with a different algorithm: I predicted not the direction but the knee (leg) of the reversal. I wrote here how I processed the price: https://www.mql5.com/ru/forum/86386/page1476
I can offer a market explanation of this effect. At the turning points of the market there is a change in the balance of supply and demand, associated with the manifestation of stronger (fundamental) pricing factors. Perhaps the mathematical relationships between different past points contain valuable information for the technical forecasting of future points.
It seems to me that everything is even simpler... If we keep only the significant extrema of the price and throw out everything else, and in addition predict not the direction but just the trace of the extremum, this cleans the data of a lot of noise and reduces the degrees of freedom for the neural network, for which it is grateful.