Machine learning in trading: theory, models, practice and algo-trading - page 1713
What are you guys, scientists or something?
Gentlemen, you're simply looking for a system to make money,
but no one, absolutely no one, has one that works.
When Scientists Want to Make Sense of Some Complex Process....
Fun, I'll dig in...
When scientists want to understand a complex process, they try to decompose it into simpler components and analyze them; spectral analysis was created for exactly this purpose. Let's try to play scientists), even if not very successful ones. I figured out how to decompose the price into simpler components. My decomposition is not additive, which is a drawback, but it is still interesting to look at the price from a different angle.
So, we need the closing price and the volatility (high - low).
Let's turn the price into a conditional binary form: if the close is above the previous close, "1"; if below, "-1".
Code in R:
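(A minimal sketch of what such code could look like, not the author's original listing; the synthetic close and vol series are assumptions standing in for real quotes, with vol playing the role of the high-low range.)

set.seed(1)
close <- 100 + cumsum(rnorm(500))     # stand-in closing prices
vol   <- abs(rnorm(500, sd = 0.5))    # stand-in per-bar volatility (high - low)

d   <- diff(close)                    # close-to-close increments
bin <- ifelse(d > 0, 1, -1)           # "1" if the close rose, "-1" if it fell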
We get a binary price.
You can make it cumulative and compare it with the original price.
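One possible way to draw that comparison, continuing the sketch above (standardizing with scale() is just to put both series on one axis):

cum_bin <- cumsum(bin)                              # cumulative binary price
plot(as.numeric(scale(close[-1])), type = "l")      # price, standardized for comparison
lines(as.numeric(scale(cum_bin)), col = "red")      # cumulative binary series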
It doesn't look like much) Now let's add the volatility back into our series.
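Continuing the same sketch, one way to fold the volatility back in is to scale each binary step by the bar's high-low range before accumulating:

recon <- cumsum(bin * vol[-1])                      # direction scaled by bar volatility
plot(as.numeric(scale(close[-1])), type = "l")      # price again, standardized
lines(as.numeric(scale(recon)), col = "red")        # decomposed-and-recombined series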
That's better...
Ideas...
IDEA 1
So it turns out that almost all of the "weather" is made by the intra-bar volatility rather than by the binary price direction. The point is that volatility has a pronounced seasonality and is relatively easy to forecast; all that remains is to forecast the binary price, which is simpler in structure than the regular one, and then just combine the two forecasts to get a full-fledged price forecast...
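A sketch of the recombination step only, continuing the example above; the two forecasts here are crude placeholders (assumptions), standing in for a real seasonal volatility model and a real direction classifier:

vol_forecast <- mean(tail(vol, 50))                # placeholder seasonal volatility forecast
bin_forecast <- sign(sum(tail(bin, 20)))           # placeholder direction forecast (+1 / -1)

price_forecast <- tail(close, 1) + bin_forecast * vol_forecast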
IDEA 2.
All ML algorithms learn poorly from raw prices, even normalized ones, because the series have no repetitiveness, probably precisely because of volatility, which is different all the time. If we decompose the price into the binary part and volatility, normalize the volatility and add them back (or don't normalize and feed both to the ML model), in theory we should get better generalization ability, because repeatability will increase.
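A sketch of preparing such inputs, continuing the example; the plain z-score via scale() is one arbitrary normalization choice among many:

vol_norm <- as.numeric(scale(vol))                 # z-score normalization (one possible choice)
features <- data.frame(direction  = bin,
                       volatility = vol_norm[-1])  # inputs for an ML model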
IDEA 3.
With the decomposition we can smooth prices without introducing lag. You can decompose the price, interpolate (stretch) the volatility and the binary price separately, and then add them back together.
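A sketch of that separate stretching, continuing the example; linear interpolation with approx() and the stretch factor of 2 are arbitrary choices:

n <- length(bin)
f <- 2                                               # stretch factor, arbitrary
bin_s <- approx(seq_len(n), bin,     n = n * f)$y    # interpolated direction
vol_s <- approx(seq_len(n), vol[-1], n = n * f)$y    # interpolated volatility

smoothed <- cumsum(bin_s * vol_s) / f                # recombined, smoothed price path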
IDEA 4
We can decompose the price and cluster the volatility, i.e. reduce its degrees of freedom (to, say, 10 clusters (states)), standardizing it in a way, and then return the standardized volatility to the series.
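A sketch of that clustering, continuing the example, with kmeans() and the 10 states from the idea; replacing each volatility value by its cluster center is the "standardization":

set.seed(2)
km      <- kmeans(vol, centers = 10)           # 10 volatility states
vol_std <- km$centers[km$cluster]              # replace each value by its cluster center

recon_std <- cumsum(bin * vol_std[-1])         # price rebuilt from standardized volatility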
The proposal to break a complex process down into its constituents is very reasonable; that is the way to do it. Only you have very few constituents. There are a lot of market parameters, including derived ones, that could be added to the study. You have a powerful tool, ML! Why not try to build a coherent, logical parametric system in which to look for statistical patterns with the help of ML?
Initially we have three parameters of the tick series: bid, ask, and tick time. All other parameters are derived from these three by thinning and averaging, and there are many of them.
Bid, ask, flipper, supply and demand volumes at levels, OI, seasonality, session time, and many more, plus their derivatives... You can include fundamental parameters such as news release time, news significance, interest rates, the behavior of correlated pairs at the same moment... If you're going to use ML, then use it to the fullest extent. As in the song: "my father and mother taught me... to explore, so explore!"))
Fundamental external parameters were not considered here, since the problem of digitizing them has not been solved, except for news importance, whose contribution is very small. I have not seen representations of the other properties and market parameters to date; apparently they are in development somewhere, but they are not in use. From the news on this topic: AI will take intelligence about the state of an adversary's country into account and work out tactics accordingly. Digitizing data on the state of society, the market, and the country is a separate task.
We're just interested in solving an interesting problem, especially when it starts to be solved.
How do you determine that the problem is starting to be solved?