Machine learning in trading: theory, models, practice and algo-trading - page 14
Okay, let's think about it. Imagine, for example, that at each iteration the neural network does not perform standard error backpropagation based on the deviation of the prediction from the target, but instead receives information that, with the previous set of weights, the real trade differed from the ideal one by so many points. On the basis of this information it updates the weights. In other words, this information should flow to the machine as a parallel stream. Maybe it can be done.
Why do you need weights? Why are you talking about them at all?
I understood that you need the previous weights; now I realize that I don't understand anything)
Well, it looks like it ))
We don't need the weights. I only mentioned them to make it clear to YOU. What we need to know is how the system traded on the previous weights; we need the outcome of the trading in some integrated form. That's all.
The weights are updated by the algorithm.
Here a simple function works; you would have to try a more complex one. It still has to be differentiable, that's the trick.
library(neuralnet)

# Two real inputs plus five noise columns; the target is the sum of squares of the two inputs.
y <- as.data.frame(matrix(runif(n = 10000, min = -15, max = 15), ncol = 2))
y$V3 <- runif(n = 5000, min = -15, max = 15)
y$V4 <- runif(n = 5000, min = -15, max = 15)
y$V5 <- runif(n = 5000, min = -15, max = 15)
y$V6 <- runif(n = 5000, min = -15, max = 15)
y$V7 <- runif(n = 5000, min = -15, max = 15)
y$V8 <- y$V1 ^ 2 + y$V2 ^ 2
colnames(y) <- c('input_1', 'input_2', 'noise_1', 'noise_2', 'noise_3', 'noise_4', 'noise_5', 'output')

# Custom error function, same form as the built-in "sse";
# neuralnet requires it to be differentiable.
f_f <- function(x, y) {
  1/2 * (y - x)^2
}

print(
  nn_model <- neuralnet(formula = output ~ input_1 + input_2 + noise_1
                        , data = y
                        , hidden = 1
                        , threshold = 0.01
                        , stepmax = 1e+05
                        , rep = 100
                        , startweights = NULL
                        , learningrate.limit = NULL
                        , learningrate.factor = list(minus = 0.5, plus = 1.2)
                        , learningrate = NULL
                        , lifesign = "none"
                        , lifesign.step = 10
                        , algorithm = "rprop+"
                        , err.fct = f_f          # custom error instead of "sse"
                        , act.fct = "logistic"
                        , linear.output = FALSE
                        , exclude = NULL
                        , constant.weights = NULL
                        , likelihood = FALSE
  )
)
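A quick way to look at what the trained network returns (my own follow-up, not part of the original post; it assumes at least one of the repetitions converged):

# Pick the repetition with the lowest final error and run the network
# on the same three covariates used in the formula.
best_rep <- which.min(nn_model$result.matrix["error", ])
pred <- compute(nn_model, y[, c("input_1", "input_2", "noise_1")], rep = best_rep)
head(cbind(predicted = pred$net.result, actual = y$output))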
f_f <- function(x, y) 1/2*(y-x)^2
f_f computes the error for the network, right?
x is what the value should be (the ideal curve)
y is what it really is (the real curve)
and the difference between them is the error
I need a per-candle vector of trades and rules for opening deals, so how should such trading input data look?
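For what it is worth, a tiny numeric check of that error function (my own illustration, not from the original post): with an ideal value of 10 and an actual value of 12 the error is 0.5 * (12 - 10)^2 = 2, and the function works element-wise on whole vectors.

# Worked example of the custom error function.
f_f <- function(x, y) 1/2 * (y - x)^2
f_f(10, 12)               # 2
f_f(c(10, 0), c(12, 3))   # 2.0 4.5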
A colleague sent me a link to a machine learning course; please take a look, what do you think? The course is free, but for some reason it is in Python (
https://www.udacity.com/course/machine-learning-for-trading--ud501
The most effective way:
1. Install R - 5 minutes.
2. Install the rattle package, which is built as a GUI and therefore requires no knowledge of R (see the sketch after this list).
3. To cut your initial costs you can read my article. It contains explanations and, more importantly, has a ready-made file attached; once you see my file you can easily prepare your own.
4. You get six models.
5. The main thing is that in rattle you can see the full cycle of machine learning.
All this will give you a foundation without gaps and, most importantly, concrete machine-learning experience tied to forex.
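A minimal install-and-launch sketch for steps 1-2 (the rattle package is real; the commands are simply the standard way to get it running, not taken from the post):

# Install the rattle GUI and start it; data loading, transformation,
# model building and evaluation are then driven from the GUI tabs.
install.packages("rattle")
library(rattle)
rattle()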
PS.
Rattle will be very useful to you not only in the first steps but also later on: with minimal effort you can estimate something or run a quick experiment...
PPS
Of course, you can't do without books. There are plenty of them here. The search works perfectly.
Thank you for the detailed clarification, Dr.Trader!
You know, probably the best and most correct approach would be to teach the network the reversals themselves, even those of the same zigzag, i.e. to give it three states:
1) reversal up
2) reversal down
3) no reversal
But will it learn? Catching reversals is quite difficult, and there is also the skew in the number of observations: there will be tens or maybe hundreds of times more "no reversal" cases than reversals.
And what predictors do you use and what are the results?
I have just started using spectral analysis; the first tests were much better than those with indicators. I ran it in rattle and the training and testing error was 6%, but when I moved to R the error went up to about 30%, if I am not mistaken. SanSanych says it was overfitting, so there is still a lot I do not understand.
There is also a way to use spectral analysis to find out which periods dominate in the market; these periods can then be plugged into indicators, and the result is adaptive indicators that are not fitted to history.
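A minimal sketch of such a period search in R (my own illustration using base R's spec.pgram; the file name quotes.csv and the Close column are assumptions):

# Periodogram of log-returns of the close price; the strongest spectral
# peaks point to the periods (in bars) that dominate the series.
prices <- read.csv("quotes.csv")              # assumed export with a Close column
x  <- diff(log(prices$Close))                 # work on log-returns, not raw prices
sp <- spec.pgram(x, taper = 0, plot = FALSE)
top <- order(sp$spec, decreasing = TRUE)[1:5]
data.frame(period_bars = 1 / sp$freq[top], power = sp$spec[top])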
I use standard indicators as a base for creating predictors. I'm still experimenting with it myself, trying out the ideas from this forum thread.
I have been doing this for the last few weeks; the best result so far is described below. (There is a lot of computation involved, so I am studying this approach on the D1 timeframe to keep it fast; later I will try a smaller timeframe.)
1) Export from mt5 to csv: ohlc, time, indicators, everything for the last 10 bars. Recently I started taking the time only from the newest bar; I believe the time of the other bars can be calculated from it and therefore adds no new information. This gives several hundred "primary" predictors. The required training result is "1" or "0": whether the price rises or falls on the next bar. My results with zigzags are unstable and hard to reproduce, so for now I work with close prices. Once I have worked out the full algorithm for training a model from scratch, I can move on to zigzags and trend prediction.
2) In R I apply various mathematical operations to the available data: sums, deltas, min, max, etc. This already produces more than a thousand predictors.
3) Obviously, after the second step there is more garbage than anything useful. I sift it out using the method from the article on principal components http://www.r-bloggers.com/principal-components-regression-pt-2-y-aware-methods/ that SanSanych wrote about earlier. I am not training the PCR model itself; for now I have settled on a separate function for pre-screening the predictors.
srcTable is a table with predictors whose last column must be the required training result; pruneSig is best left at -1.
As a result the function returns a list of column names from the table that carry some useful information, or an empty list if nothing useful is found. The article describes this method as not very significant, but it turns out to be quite adequate and sifts out garbage very well. The resulting list is also sorted by importance, from more useful to less useful. (A rough R sketch of steps 1-3, including such a screening function, is given after this list.)
4) If the function returns an empty list, I repeat the second step, generating different mathematical combinations of the available data, and then run the third step again to sift them. I have to repeat this 3-4 times. The volume of data grows with each repetition, so it is better to limit the amount of newly generated data somehow. The sifting function could be changed so that, if the list comes out empty, it returns the best hundred or two results and new predictors are generated only from those.
5) Next, according to the article, the principal components model itself has to be trained. I have problems with it: so far the best R-squared for the trained model is 0.1, which is not enough; the article says at least 0.95 is needed. But I can train some other R model on the obtained predictors and it gives a better result. I have the most experience with neural networks; the best fronttest result with them comes out with an error of about 37%. The PCA model is supposed to be more stable, without retraining, etc., but so far I cannot get suitable predictors for it.
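A rough sketch of how steps 1-3 could be wired together in R. Everything below is my own illustration: the file name, the column names, and the use of the vtreat package for the y-aware screening step are assumptions based on the linked article, not the poster's actual code.

library(vtreat)

# Step 1: load the csv exported from mt5 and build the target:
# 1 if the next bar closes higher than the current one, 0 otherwise.
d <- read.csv("mt5_export.csv")                      # assumed file name
d$output <- as.integer(c(diff(d$Close) > 0, NA))     # next-bar direction
d <- d[!is.na(d$output), ]

# Step 2: generate derived predictors (deltas, sums, min, max, ...).
d$delta_hl <- d$High - d$Low
d$delta_oc <- d$Close - d$Open
d$sum_hl   <- d$High + d$Low
# ... in practice this is repeated for many column pairs, giving 1000+ predictors.

# Step 3: y-aware pre-screening of predictors with vtreat
# (my reconstruction of a function with the srcTable/pruneSig interface).
screenPredictors <- function(srcTable, pruneSig = -1) {
  targetName     <- colnames(srcTable)[ncol(srcTable)]
  predictorNames <- setdiff(colnames(srcTable), targetName)
  if (pruneSig < 0) pruneSig <- 1 / length(predictorNames)  # fallback threshold
  treatments <- designTreatmentsC(srcTable, predictorNames,
                                  outcomename = targetName, outcometarget = 1,
                                  verbose = FALSE)
  sf   <- treatments$scoreFrame
  keep <- sf[sf$sig < pruneSig, , drop = FALSE]
  keep <- keep[order(keep$sig), ]       # most useful first
  unique(keep$origName)                 # original column names, or character(0)
}

usefulColumns <- screenPredictors(d[, c(setdiff(colnames(d), "output"), "output")])

With pruneSig left at -1 the sketch falls back to a cut-off of 1/number-of-predictors, a common rule of thumb for vtreat's significance pruning.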
If you have a 30% error in the fronttest, that is already quite a profitable model; make an Expert Advisor for mt5 and check it in the strategy tester.