Bayesian regression - Has anyone made an EA using this algorithm? - page 39
That's what you need to think about: for the data to be comparable, you should, in my opinion, take a pattern rather than just a window of n bars.
I was recently discussing the history and development of linear regression with colleagues. To make a long story short: initially there were little data and few predictors, and ordinary linear regression managed, given some assumptions. Then, with the development of information technology, the amount of data grew, and the number of predictors could easily exceed tens of thousands. Under those conditions ordinary linear regression is no help: it overfits. Hence the regularised versions, the versions robust to distributional assumptions, and so on.
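The overfitting point can be illustrated with a small sketch (synthetic data, numpy only; the dimensions, seed, and the fixed ridge penalty `lam = 5.0` are my own assumptions, not anything from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression: 60 samples, 50 predictors, only 5 truly relevant.
n, p = 60, 50
X = rng.standard_normal((n, p))
w_true = np.zeros(p)
w_true[:5] = [2.0, -1.5, 1.0, 0.5, -0.5]
y = X @ w_true + rng.standard_normal(n)

# Fresh data to measure out-of-sample (generalisation) error.
X_new = rng.standard_normal((5000, p))
y_new = X_new @ w_true + rng.standard_normal(5000)

# Ordinary least squares: with almost as many predictors as samples it
# chases the noise in y and generalises poorly.
w_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# Ridge (L2-regularised) regression, closed form: (X'X + lam*I)^-1 X'y.
lam = 5.0
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

mse_ols = np.mean((X_new @ w_ols - y_new) ** 2)
mse_ridge = np.mean((X_new @ w_ridge - y_new) ** 2)
print(f"OLS out-of-sample MSE:   {mse_ols:.2f}")
print(f"Ridge out-of-sample MSE: {mse_ridge:.2f}")
```

With the noise variance at 1.0, the ridge error stays near that floor while plain OLS is substantially worse, which is the "many predictors, few observations" failure mode described above.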
That's partly correct. L2 regularisation does not reduce the number of predictors in the model. In neurocomputing, Hebb's learning rule was used at first, and it led to unbounded growth of the network weights. Then, realising that the brain has limited resources to grow and maintain the weights of its neural subunits, researchers added L2 regularisation (in the 60s-80s). This bounded the weights, but left a lot of negligibly small ones. That is not how the brain works: neurons are not connected to all other neurons, even by negligible weights; there are only a limited number of connections. Then, in the 2000s, L1 and L0 regularisation were introduced, which allow sparse connections. Crowds of scientists began using linear programming with L1 regularisation for everything from image coding to neural models that described brain processes quite well. Economists still lag behind the rest of the sciences, whether out of "arrogance" (everything was already invented before us) or simply a poor grasp of mathematics.
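The L2-vs-L1 contrast above (small weights everywhere vs. genuinely sparse connections) can be sketched in numpy. The lasso solver below is plain ISTA (proximal gradient with soft-thresholding); the data, seed, and penalty values are my own assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: 100 samples, 30 predictors, only 3 truly nonzero.
n, p = 100, 30
X = rng.standard_normal((n, p))
w_true = np.zeros(p)
w_true[[2, 7, 19]] = [1.5, -2.0, 1.0]
y = X @ w_true + 0.1 * rng.standard_normal(n)

# L2 (ridge): closed form; shrinks weights but leaves them all nonzero.
lam2 = 1.0
w_l2 = np.linalg.solve(X.T @ X + lam2 * np.eye(p), X.T @ y)

# L1 (lasso) via ISTA: the soft-thresholding step drives irrelevant
# weights exactly to zero, i.e. it keeps only a sparse set of "connections".
lam1 = 5.0
L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the gradient
w_l1 = np.zeros(p)
for _ in range(2000):
    grad = X.T @ (X @ w_l1 - y)
    z = w_l1 - grad / L
    w_l1 = np.sign(z) * np.maximum(np.abs(z) - lam1 / L, 0.0)

nnz_l2 = int(np.sum(np.abs(w_l2) > 1e-6))
nnz_l1 = int(np.sum(np.abs(w_l1) > 1e-6))
print(f"nonzero L2 weights: {nnz_l2}")   # all of them
print(f"nonzero L1 weights: {nnz_l1}")   # only a few survive
```

The ridge solution keeps every predictor with a small but nonzero weight, while the L1 solution retains only a handful, which is exactly the "limited number of connections" picture.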
Yes. I personally talked to a manager (an swd manager) who used to work for a stockbroker. He said price increments are considered normal, and that's it: the methods and misconceptions of the last century are still in use. I told him there is no normality there; not a single test passes. He didn't even know what I was talking about. But then he's not a hardcore mathematician, he's a development manager.
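The "not a single test passes" claim can be sketched with a hand-rolled Jarque-Bera normality test. Real tick data would go where the synthetic series are; the Student-t stand-in for heavy-tailed price increments is my own assumption:

```python
import numpy as np

def jarque_bera(x):
    """Jarque-Bera normality statistic from sample skewness and kurtosis.
    Under H0 (normal data) it is ~ chi-squared with 2 df (95% cutoff ~5.99);
    large values reject normality."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    z = x - x.mean()
    s = np.sqrt(np.mean(z ** 2))
    skew = np.mean(z ** 3) / s ** 3
    kurt = np.mean(z ** 4) / s ** 4
    return n / 6.0 * (skew ** 2 + (kurt - 3.0) ** 2 / 4.0)

rng = np.random.default_rng(7)

# Synthetic stand-ins: Gaussian noise vs heavy-tailed Student-t "returns",
# the latter mimicking the fat tails of real price increments.
gaussian = rng.standard_normal(5000)
heavy_tailed = rng.standard_t(df=3, size=5000)

jb_gauss = jarque_bera(gaussian)
jb_heavy = jarque_bera(heavy_tailed)
print(f"JB gaussian:     {jb_gauss:.1f}")   # near the chi-squared range
print(f"JB heavy-tailed: {jb_heavy:.1f}")   # orders of magnitude larger
```

On real increments the statistic behaves like the heavy-tailed case: it lands far beyond the 5.99 cutoff, so normality is rejected outright.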
So what if there is no normality? Even the head of some development writes about it, and Vladimir wrote about it here. How can you use regression at all if you don't understand its principles or meaning? You wander around like a zombie in the dark night with this normality/non-normality. The distribution may be in cubes, squares, zigzags, or in the form of a Repin painting; regression's ability to predict does not depend on it.
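The point that the error distribution barely matters for prediction can be sketched by fitting an OLS slope under three deliberately non-normal noise distributions (synthetic data; the distributions and the true slope 1.7 are my own assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20000
x = rng.standard_normal(n)
true_slope = 1.7

def fitted_slope(noise):
    y = true_slope * x + noise
    # OLS slope = cov(x, y) / var(x); no normality assumption is used here.
    return np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)

# Three very non-normal, zero-mean error distributions:
results = {}
for name, noise in [
    ("uniform", rng.uniform(-1, 1, n)),
    ("bimodal", rng.choice([-1.0, 1.0], n) + 0.1 * rng.standard_normal(n)),
    ("lognorm", rng.lognormal(0, 1, n) - np.exp(0.5)),  # centred, skewed
]:
    results[name] = fitted_slope(noise)
    print(f"{name:8s} -> slope {results[name]:.3f}")
```

All three fits recover a slope close to 1.7: the least-squares estimate stays consistent whatever shape the noise takes, which is the argument being made above. (Normality matters for the textbook confidence intervals, not for the point forecast.)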
Totally agree. How many bars to analyse is the Achilles' heel of more than just the regressions under discussion. That said, I don't want to compute a regression but rather probabilities via the Bayes formula. For the time being I will naively take the current window of n bars. At the testing and trial stage, for the likelihood function and the prior probabilities, I will take the periods from one volatility spike to the next; this is usually the interval between important news events.
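One minimal way to read "probabilities via the Bayes formula over a window of n bars" is a Beta-Binomial update on the up/down rate of the last n increments. This is only a sketch of that idea, not the poster's actual model; the price series, window size, and uniform Beta(1, 1) prior are my own assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical close prices; in practice these would come from the terminal.
closes = np.cumsum(rng.standard_normal(500)) + 100.0

def p_up_next(closes, n=50, alpha=1.0, beta=1.0):
    """Posterior probability that the next bar closes up, from the last
    n bars, via Bayes with a Beta(alpha, beta) prior on the 'up' rate.
    Beta is conjugate to the up/down (Bernoulli) likelihood, so the
    posterior mean has the closed form (alpha + ups) / (alpha + beta + n)."""
    window = np.diff(closes[-(n + 1):])      # last n price increments
    ups = int(np.sum(window > 0))
    return (alpha + ups) / (alpha + beta + n)

prob = p_up_next(closes)
print(f"P(next bar up | last 50 bars) = {prob:.3f}")
```

Swapping the uniform prior for one estimated over the spike-to-spike periods mentioned above would be the natural next step: the prior carries the between-news regime, the window carries the current one.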
And what will that probability express: the forecast for the next bar, or the direction of movement of the next few bars?
First of all, we should decide on the purpose of building the regression: to fit a curve that describes the selected block of the market as accurately as possible, or to predict the future price? How can the quality of the approximation determine the accuracy of the forecast?
And how can you build a curve that is as accurate as possible in describing the past and as accurate as possible in predicting the future?
Or how can you predict the future without analysing the past?
Approximation is the analysis of the past