You have to convert the real data into normally distributed data.
Didn't you do your dissertation with an oak tree?
That is, to find a transformation of the raw data (quotes) that yields normal increments? And how does that work out?
First you have to learn to read and understand what's written; then you have to learn to write.
The post above is a bit rambling:
- There is such a thing as a parabolic fractal distribution (quite a new thing; it concerns modelling the size distribution of real objects, like the size of Paris relative to the other cities of France: https://en.wikipedia.org/wiki/Parabolic_fractal_distribution). Unless you're straight out of university, you probably haven't been taught it. I don't see how it fits in here.
- Stationary distribution: a vector whose elements are indexed by the states of a Markov chain, are nonnegative, sum to 1, and where element i equals the sum over j of element j multiplied by the probability of transition from state j to state i. How that gets in here I don't understand either.
- I also know the de Moivre-Laplace integral theorem, that for large n the binomial distribution converges to the normal one. I don't know another one, and this one doesn't fit here either.
As for the normal distribution: the quotes themselves, as S.W. wrote and as is plain to see, are normally distributed around the moving average, so we are in the clear here.
1. Fractal distribution: I mean the one discussed in Peters' book, which has a table at the end. Link to the book: http://www.ozon.ru/context/detail/id/1691158/. It's also available for free on Spider, by the way. There is a more rigorous presentation in Shiryaev's Fundamentals of Stochastic Financial Mathematics. Fractality here refers rather to the stability of the probability distribution.
2. Stationarity: yes, I was inaccurate (as luck would have it, right after writing it I realised I was being imprecise and that someone would surely pick on me). I wasn't referring to stationarity of the distribution, but to stationarity of the returns process.
3. I know about that theorem on the binomial's convergence to the normal. I meant the theorem by which, given a uniformly distributed variable and the inverse of the normal distribution function, you can obtain a fairly good imitation of a normal distribution on a computer. I don't remember exactly what it's called, but it's one of the most important ones in probability theory.
One last thing: we're not talking about the distribution of quotes around a moving average; their normality is not at all obvious, however intuitive it may seem. What we mean is returns, i.e. the differences between closing prices of neighbouring bars, irrespective of any moving averages.
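The theorem described in point 3 is usually called the inverse transform (or quantile) method: if U is uniform on (0, 1) and F is a distribution function, then F⁻¹(U) has distribution F. A minimal sketch using only Python's standard library (the function name is mine; `statistics.NormalDist` needs Python 3.8+):

```python
import random
from statistics import NormalDist

def normal_via_inverse_cdf(n, seed=42):
    """Imitate n standard-normal draws by pushing uniform draws
    through the inverse of the normal distribution function."""
    rng = random.Random(seed)
    nd = NormalDist()  # standard normal: mu=0, sigma=1
    out = []
    while len(out) < n:
        u = rng.random()   # U ~ Uniform[0, 1)
        if u > 0.0:        # inv_cdf is undefined at exactly 0
            out.append(nd.inv_cdf(u))
    return out

sample = normal_via_inverse_cdf(10_000)
mean = sum(sample) / len(sample)
var = sum((x - mean) ** 2 for x in sample) / len(sample)
print(f"mean = {mean:.3f}, variance = {var:.3f}")  # close to 0 and 1
```

The catch raised later in the thread applies here too: this only helps if you actually have a uniformly distributed quantity to feed into it.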
The author is on fire! Keep it up!
3. Are you talking about the Box-Muller transformation? Generating pseudo-random normally distributed numbers from pseudo-random uniformly distributed ones is described here: http://www.taygeta.com/random/gaussian.html. But where do we have pseudo-random uniformly distributed quantities here?
2. Stationarity of the process: probably yes. I don't think the distribution function changes over time either.
1. Too lazy to dig into it and read right now, in view of the last remark:
There is, for example, the Kolmogorov-Smirnov test, with which, given a random sample, one can check whether the distribution of a random variable is normal: https://en.wikipedia.org/wiki/Kolmogorov-Smirnov_test. If that isn't enough for you, then please pull together everything you wrote above into a coherent description of what you are proposing.
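The two ideas in this post can be put side by side: generate normals from uniforms with Box-Muller, then check the result with a one-sample Kolmogorov-Smirnov statistic. A stdlib-only sketch (function names are mine; the K-S implementation here computes only the D statistic, compared against the rough large-n 5% critical value 1.36/√n rather than an exact p-value):

```python
import math
import random
from statistics import NormalDist

def box_muller(n, seed=7):
    """Generate n standard-normal deviates from pairs of
    uniform deviates via the Box-Muller transformation."""
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        u1, u2 = rng.random(), rng.random()
        if u1 == 0.0:          # avoid log(0)
            continue
        r = math.sqrt(-2.0 * math.log(u1))
        out.append(r * math.cos(2 * math.pi * u2))
        if len(out) < n:
            out.append(r * math.sin(2 * math.pi * u2))
    return out

def ks_statistic(sample, cdf):
    """One-sample Kolmogorov-Smirnov statistic: the largest gap
    between the empirical CDF and the hypothesised CDF."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        d = max(d, f - i / n, (i + 1) / n - f)
    return d

sample = box_muller(2000)
d = ks_statistic(sample, NormalDist().cdf)
# For n = 2000 the approximate 5% critical value is 1.36/sqrt(2000) ~ 0.030
print(f"D = {d:.4f}")
```

Running the same check on real returns instead of simulated data is exactly the test proposed above: if D exceeds the critical value, the normality assumption should be rejected.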