Machine learning in trading: theory, models, practice and algo-trading - page 340

 
Vladimir Perervenko:

R has a nice mxnet package. But more advanced models should be looked at in Python.

Strange. The other day I was reading a relatively recent review of neural networks and free NN software. Python is far from being ahead of the rest of the world there: it is mentioned among others, but, as I understand it, it does not shine with variety. I wish I had saved the link.

As for R, due to its specialization, it completely lacks the usual modeling mathematics - such as signal filtering, and much more. What can you do - either sing or dance.

 
Vladimir Perervenko:

Take a look at This, This, This, and maybe This.

Not everything will be clear, but some basic neural network concepts will hopefully come through.

Good luck

By the way, has MT4|5 been integrated with R, or should I use a DLL?
 
elibrarius:
By the way, has MT4|5 integration with R been done, or do you still need to use a DLL?
It's not integration, just some similar algorithm libraries in MT. Integration would mean working directly with R. The DLL has already been done; ask SanSanych for a link to it. He is the organizer and inspirer of our victories.
 
Dr. Trader:


My personal opinion: neural networks, forests, regressions - all of this is too weak for forex...

I am currently studying pattern-recognition models that look at how price behaved in history after similar patterns...

You're following right in my footsteps, and you think the same way I do - it's funny.

The market is an interesting beast; it's hard to understand it right away... I'll tell you how you can add a little more stability to the neural network. I wrote about it a long time ago: you need to add a so-called "critical point of view".

The recipe is this:

1) take some market data - it can be anything from indicators to the price

2) take the training sample and divide it into three parts: "A", "B", "C"

3) take a neural network that outputs a vector of class probabilities instead of the class itself; train this network on the market data of sample "A"

4) forecast "B" and "C" samples with our neuron, we obtain a vector of forecasts of "B" and "C" samples

5) take a new neural network and train it on the market data of sample "B", adding the old network's forecast vector for sample "B" as an extra input

6) sample "C" for validation.


Try it, see how it works
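A minimal R sketch of this recipe. The synthetic data stands in for real market features and nnet is an arbitrary probability-outputting model; both are placeholders, not part of the original recipe:

```r
# "Critical point of view": stack a second net on top of the first net's forecasts.
library(nnet)

set.seed(1)
n <- 3000
x <- data.frame(f1 = rnorm(n), f2 = rnorm(n), f3 = rnorm(n))   # placeholder features
y <- factor(ifelse(x$f1 + 0.5 * x$f2 + rnorm(n) > 0, "up", "down"))

# 1-2) split into three parts A, B, C
idx <- sample(rep(c("A", "B", "C"), length.out = n))
A <- which(idx == "A"); B <- which(idx == "B"); C <- which(idx == "C")

# 3) first network, trained on A, outputs a class probability
nn1 <- nnet(y ~ ., data = cbind(x, y = y)[A, ], size = 5, maxit = 200, trace = FALSE)

# 4) probability forecasts for B and C
pB <- predict(nn1, x[B, ], type = "raw")
pC <- predict(nn1, x[C, ], type = "raw")

# 5) second network trained on B: market data plus the first net's forecast
trainB <- cbind(x[B, ], p1 = as.numeric(pB), y = y[B])
nn2 <- nnet(y ~ ., data = trainB, size = 5, maxit = 200, trace = FALSE)

# 6) validate on C, again feeding the first net's forecast as an extra input
validC <- cbind(x[C, ], p1 = as.numeric(pC))
predC  <- predict(nn2, validC, type = "class")
mean(predC == y[C])   # out-of-sample accuracy on C
```

The point of step 5 is that the second network sees both the raw data of "B" and the first network's opinion about it, while "C" stays untouched for an honest check.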

 
Yuriy Asaulenko:

signal filtering, and many other things.


Could you be more specific? Very interesting.

It seemed to me that R has more than enough of everything. Its catalogue of topics is its own, drawn from statistics, and doesn't look like MATLAB's, but everything seems to be there...

 
elibrarius:
By the way, has MT4|5 integration with R been done, or do you still need to use a DLL?

Here is a library with examples. No complaints.
 
mytarmailS:

You're following right in my footsteps, and you think the same way I do - it's funny.

The market is an interesting beast; it's hard to understand it right away... I'll tell you how you can add a little more stability to the neural network. I wrote about it a long time ago: you need to add a so-called "critical point of view".

The recipe is this:

1) take some market data - it can be anything from indicators to the price

2) take the training sample and divide it into three parts: "A", "B", "C"

3) take a neural network that outputs a vector of class probabilities instead of the class itself; train this network on the market data of sample "A"

4) forecast "B" and "C" samples with our neuron, we obtain a vector of forecasts of "B" and "C" samples

5) take a new neural network and train it on the market data of sample "B", adding the old network's forecast vector for sample "B" as an extra input

6) Sampling "C" for validation.


Try it, see what happens.


I tried it on trees - there you can also take the class probability instead of the class itself. The scheme is almost the same as yours. I tried even more: for two classes I split the probability not at one half - there are methods for choosing a different cutoff. The improvement is a couple of percent.

It all comes to nothing.

You should look for predictors that are relevant to the target. And don't bother with models at all. With good predictors, models will produce about the same results.
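For illustration, a sketch of the "class probability plus a non-0.5 cutoff" idea on a forest in R. The randomForest package, the synthetic data and the balanced-accuracy criterion are only assumptions for the example, not the poster's exact method:

```r
# Tune the probability cutoff on out-of-bag predictions instead of using 0.5.
library(randomForest)

set.seed(2)
n <- 2000
x <- data.frame(f1 = rnorm(n), f2 = rnorm(n))        # placeholder features
y <- factor(ifelse(x$f1 + rnorm(n) > 0, "up", "down"))
train <- 1:1500; test <- 1501:n

rf <- randomForest(x[train, ], y[train])

# out-of-bag vote fractions (a probability proxy) on the training part
p_oob <- rf$votes[, "up"]

# scan candidate cutoffs, keep the one with the best balanced accuracy
cuts <- seq(0.3, 0.7, by = 0.01)
bal_acc <- sapply(cuts, function(k) {
  p <- factor(ifelse(p_oob > k, "up", "down"), levels = levels(y))
  mean(c(mean(p[y[train] == "up"] == "up"),
         mean(p[y[train] == "down"] == "down")))
})
cut <- cuts[which.max(bal_acc)]

# apply the tuned cutoff to new data
p_test <- predict(rf, x[test, ], type = "prob")[, "up"]
pred   <- factor(ifelse(p_test > cut, "up", "down"), levels = levels(y))
mean(pred == y[test])
```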

 
SanSanych Fomenko:


Could you be more specific? Very interesting.

It seemed to me that R has more than enough of everything. Its catalogue of topics is its own, drawn from statistics, and doesn't look like MATLAB's, but everything seems to be there...

I already said specifically: for example, filtering. Filters as such - filters in the radio-engineering sense - are absent in R, along with all the software for working with them. There is no Z-transform. Integral transforms are absent (of all of them, only the Fourier transform is present). R is missing a lot of what made me leave for SciLab a few months ago. (If you had asked then, I could have been more specific.)

This is not a flaw of R, but its specialization. SciLab has its own shortcomings (its own specialization). The two systems are aimed at solving different, partially overlapping sets of problems.

 
Yuriy Asaulenko:

I already said specifically: for example, filtering. Filters as such - filters in the radio-engineering sense - are absent in R, along with all the software for working with them. There is no Z-transform. Integral transforms are absent (of all of them, only the Fourier transform is present). R is missing a lot of what made me leave for SciLab a few months ago. (If you had asked then, I could have been more specific.)

This is not a flaw of R, but its specialization. SciLab has its own shortcomings (its own specialization). The two systems are aimed at solving different, partially overlapping sets of problems.

You are not phrasing it correctly. You should write: "I could not find the filters I need." Since I don't know which filters you are interested in, here are a few at a glance:

mFilter package - Baxter-King filter, Butterworth filter, Christiano-Fitzgerald filter, Hodrick-Prescott filter, Trigonometric regression filter

FKF package - Fast Kalman filter

kza package - coeff() Kolmogorov-Zurbenko Fourier Transform

kz() Kolmogorov-Zurbenko filter

kza() Kolmogorov-Zurbenko Adaptive

kzft() Kolmogorov-Zurbenko Fourier Transform

kzp() Kolmogorov-Zurbenko Periodogram

kzs() Kolmogorov-Zurbenko Spline

kzsv() Kolmogorov-Zurbenko Adaptive filter with Sample Variance.

kztp() Kolmogorov-Zurbenko Third-Order Periodogram

max_freq() Kolmogorov-Zurbenko Fourier Transform

and many, many others.
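As a quick illustration of one entry from the list above: a Hodrick-Prescott filter from mFilter applied to a synthetic price series (the series and the smoothing parameter are placeholders for the example):

```r
# Hodrick-Prescott filter from the mFilter package on a stand-in price series.
library(mFilter)

set.seed(3)
price <- cumsum(rnorm(500))          # stand-in for a close-price series

hp <- hpfilter(price, freq = 1600)   # freq is the smoothing parameter lambda

head(hp$trend)                        # smoothed (trend) component
head(hp$cycle)                        # residual (cycle) component
```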

Besides, if you're deep into the subject of filters and know the mathematical formula by which one is calculated, there's no problem just computing it yourself. No?

Good luck



 
Yuriy Asaulenko:

I already said specifically: for example, filtering. Filters as such - filters in the radio-engineering sense - are absent in R, along with all the software for working with them. There is no Z-transform. Integral transforms are absent (of all of them, only the Fourier transform is present). R is missing a lot of what made me leave for SciLab a few months ago. (If you had asked then, I could have been more specific.)

This is not a flaw of R, but its specialization. SciLab has its own shortcomings (its own specialization). The two systems are aimed at solving different, partially overlapping sets of problems.


It's not about R, it's about you.

As far as I understand, you have professional knowledge of certain mathematical tools and, naturally, you try to apply them in trading.

It seems to me that a different approach is more correct: first we identify the problems in trading, and then we look for tools to solve those problems.

R is a specialized system for applying statistics to trading, which is why the various MATLABs and Mathcads (SciLab is hardly known at all) were not regarded as competitors to R even ten years ago.

More specifically about filters.

The colleague above named some of them.

But filters decompose the input signal, and the first component that stands out is the trend. So smoothing, which extracts that trend, is the first step in many R packages. Besides the tools explicitly declared as smoothing, there are many other, qualitatively different ones, such as SSA ("Caterpillar") and wavelets. A sketch of the SSA case follows below.
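A sketch of SSA ("Caterpillar") smoothing in R using the Rssa package; the synthetic series, window length and grouping are illustrative assumptions, not a recommendation:

```r
# SSA decomposition of a noisy trending series, keeping the trend components.
library(Rssa)

set.seed(5)
price <- cumsum(rnorm(500)) + 0.05 * (1:500)   # stand-in for a trending price series

s   <- ssa(price, L = 120)                      # decomposition with window length 120
rec <- reconstruct(s, groups = list(Trend = 1:2))
trend <- rec$Trend                              # smoothed trend component
head(trend)
```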


But in reality the pseudo-problem of filters in R that you have identified has much deeper roots.

Why do you need them at all? A filter is an auxiliary tool, while R offers ready-made solutions for building decision blocks. Two main lines can be singled out: machine learning and ARMA-ARIMA-ARFIMA-ARCH-GARCH. What do filters have to do with it?
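For reference, a minimal sketch of the second line on a synthetic return series, assuming the forecast and rugarch packages; the model orders and the data are placeholders for the example:

```r
# ARIMA and GARCH "ready-made decision blocks" on a stand-in return series.
library(forecast)
library(rugarch)

set.seed(4)
ret <- rnorm(1000, sd = 0.01)        # stand-in for log returns

# ARMA/ARIMA line: automatic order selection and a short forecast
fit_arima <- auto.arima(ret)
forecast(fit_arima, h = 5)

# GARCH line: ARMA(1,1) mean with GARCH(1,1) variance
spec <- ugarchspec(variance.model = list(model = "sGARCH", garchOrder = c(1, 1)),
                   mean.model     = list(armaOrder = c(1, 1)))
fit_garch <- ugarchfit(spec, data = ret)
sigma(fit_garch)[1:5]                 # first few fitted conditional volatilities
```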
