a trading strategy based on Elliott Wave Theory - page 227

 
Well, maybe, between us housewives, you could tell me what it is? It doesn't have to be about the Markov process; you can just talk about Markov chains.
You can even talk about chains.


It's quite a long story, and I'm not good at telling it nicely. I read about Markov chains, and the processes these chains describe, in "Probabilistic Parts of Mathematics" edited by Maximov and in Gnedenko's "Course of Probability Theory". I can't say I have become a guru in this field. Rather, I remind myself of a dog that understands everything but can't say anything. :о)

I don't really like the "housewife" way of explaining "what it is" either. For instance, let's take a definition from Wikipedia (housewife-friendly enough :о):


A Markov chain (MC) is a sequence of random events with a finite or countably infinite number of outcomes, characterised by the property that, with the present fixed, the future is independent of the past. It is named after A. A. Markov (Sr.).


It seems to be correct; however, the more precise wording is somewhat different:


A sequence of trials forms an MC if the conditional probability, in trial s+1 (s = 1, 2, 3, ...), of an event A(i)[s+1], i = 1, 2, ..., k, depends only on which event occurred in the s-th trial and does not change when additional information about which events occurred in earlier trials is supplied.


Apparently for this reason, processes that can be described by such chains are called short-memory processes. Another definition, based on the notion of the state of the system, is also used.
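To make the definition concrete, here is a minimal sketch (Python, my own illustration rather than anything from the books cited above; the states and transition probabilities are made up): a two-state chain whose next state is drawn using only the current state.

```python
import numpy as np

# Transition probability matrix: row i holds the probabilities of moving from state i.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

def simulate_chain(P, start, n_steps, seed=0):
    """Draw a path of the chain; each step uses only the current state."""
    rng = np.random.default_rng(seed)
    state, path = start, [start]
    for _ in range(n_steps):
        state = int(rng.choice(len(P), p=P[state]))  # depends only on the current state
        path.append(state)
    return path

print(simulate_chain(P, start=0, n_steps=10))
```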

Yuri, have pity on me. I don't want to rewrite definitions and conclusions at all. The MC is not my invention, and I have not yet reached the appropriate level of incompetence to retell it all in my own words. You might not recognize Markov chains afterwards :о).

Once you have read about it in competent sources (as opposed to my narrative), maybe my "practices" will be useful:

(1) I chose a channel as the state of the system, rather than specific price values or the difference between prices.
(2) I took the probabilities of certain channel characteristics to build a transition probability matrix.
(3) I took a change of channel as one step of the matrix.
(4) As the process, I chose "intuitively" the birth-and-death process; we can't use a queueing process for our purposes, can we?

And I've already demonstrated the results of its use. :о)
 
We will consider the class of stationary time series. Our problem reduces to choosing a model suitable for describing the behaviour of the "random" residuals X[j] of the studied time series Y[j], obtained after eliminating its non-random component (if any) from the original series. Since we describe the behaviour of random residuals here, we denote the modelled time series by X[j] and assume that for all j its mathematical expectation is zero. Otherwise, we need to centre the original series. For time series characteristic of the Forex market (containing only stochastic trends), centring and extraction of the residuals can be done by constructing the series of first differences
X[j] = Y[j] - Y[j-k], where k can be from 1 to n depending on the purpose of the experiment.
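For illustration, a small Python sketch of this differencing and centring step (the name Y and the helper residual_series are my placeholders, not from the post):

```python
import numpy as np

def residual_series(Y, k=1):
    """X[j] = Y[j] - Y[j-k], then centred so its sample mean is zero."""
    Y = np.asarray(Y, dtype=float)
    X = Y[k:] - Y[:-k]
    return X - X.mean()

# Y would be the studied price series (e.g. EURUSD closes); dummy random-walk data is used here.
Y = np.cumsum(np.random.default_rng(1).normal(size=1000))
X = residual_series(Y, k=1)
print(round(X.mean(), 12))  # ~0 after centring
```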


First-order autoregression model AR(1) (Markov process).

This model is a simple special case of the general autoregressive process
X[j] = SUM{a[k]*X[j-k]} + sigma[j], where the summation runs over k = 1...infinity,
in which all coefficients except the first are equal to zero. Accordingly, it can be defined by the expression
X[j] = a*X[j-1] + sigma[j], (1)
where a is a numerical coefficient not exceeding one in absolute value (|a| < 1), and sigma[j] is a sequence of random variables forming white noise. Thus X[j] depends on sigma[j] and all preceding innovations, but is independent of future ones. Accordingly, in equation (1), sigma[j] is independent of X[j-1] and earlier values of X. For this reason sigma[j] is called an innovation (update).
Sequences X satisfying relation (1) are often also called Markov processes. For such a process:
1. The expectation M of the process is identically zero, M = 0.
2. The autocorrelation coefficient r between members of the series spaced k steps apart equals r = a^k.
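A quick numerical check of these two properties, as a Python sketch (the value a = 0.8 and the Gaussian innovations are illustrative assumptions of mine):

```python
import numpy as np

def simulate_ar1(a, n, seed=2):
    """X[j] = a*X[j-1] + sigma[j] with Gaussian white-noise innovations sigma[j]."""
    rng = np.random.default_rng(seed)
    eps = rng.normal(size=n)
    X = np.zeros(n)
    for j in range(1, n):
        X[j] = a * X[j - 1] + eps[j]
    return X

def autocorr(X, k):
    """Sample autocorrelation coefficient at lag k."""
    X = X - X.mean()
    return float(np.dot(X[:-k], X[k:]) / np.dot(X, X))

X = simulate_ar1(a=0.8, n=100_000)
print(X.mean())                        # property 1: expectation close to 0
print(autocorr(X, 1), autocorr(X, 3))  # property 2: roughly 0.8 and 0.8**3 = 0.512
```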

The main characteristics of the first-order autoregressive process are as follows.

The stationarity condition for the series is the requirement on the coefficient a:
|a| < 1
The autocorrelation function of the Markov process is given by the relation
r(t) = a^t,
i.e. the value of a determines the strength of the correlation between two neighbouring members of the series X[j]. We can see that the closeness of the correlation between the terms of sequence (1) decreases exponentially as they move apart from each other in time.
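Since r(1) = a, the coefficient can be identified directly from the sample lag-1 autocorrelation; a sketch (the value a = 0.6 is only an example of mine):

```python
import numpy as np

def estimate_a(X):
    """Identify the AR(1) coefficient as the sample lag-1 autocorrelation, since r(1) = a."""
    X = np.asarray(X, dtype=float) - np.mean(X)
    return float(np.dot(X[:-1], X[1:]) / np.dot(X, X))

# Quick check on a synthetic AR(1) series with a = 0.6.
rng = np.random.default_rng(3)
X = np.zeros(50_000)
for j in range(1, len(X)):
    X[j] = 0.6 * X[j - 1] + rng.normal()
print(estimate_a(X))  # close to 0.6
```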
The spectral density of the Markov process (1) can be calculated using the known form of the autocorrelation function:
p(w) = 2*sigma0^2/(1 + a^2 - 2a*cos(2*Pi*w))
If the value of the parameter a is close to 1, adjacent values of the series X[j] are close to each other in magnitude, the autocorrelation function decreases exponentially while remaining positive, and the spectrum is dominated by low frequencies, which means a fairly large average distance between the peaks of X[j]. If the parameter a is close to -1, the series oscillates rapidly (high frequencies dominate the spectrum) and the graph of the autocorrelation function decreases exponentially to zero with alternating sign.
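The same behaviour can be seen by evaluating the spectral density numerically; a Python sketch with illustrative parameter values:

```python
import numpy as np

def ar1_spectrum(a, noise_var, w):
    """p(w) = 2*sigma0^2 / (1 + a^2 - 2*a*cos(2*pi*w)) for frequencies 0 <= w <= 1/2."""
    return 2.0 * noise_var / (1.0 + a**2 - 2.0 * a * np.cos(2.0 * np.pi * np.asarray(w)))

w = np.linspace(0.0, 0.5, 6)
print(ar1_spectrum(0.9, 1.0, w))    # a close to +1: power concentrated at low frequencies
print(ar1_spectrum(-0.9, 1.0, w))   # a close to -1: power concentrated at high frequencies
```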

After identifying the model, i.e. determining its parameters (in this case, a), we can build a one-step-ahead forecast:
Y[j+1] = Y[j] + a*X[j].
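In code the forecast is a one-liner; the numbers below are placeholders of mine, not real quotes:

```python
def forecast_next(Y_last, X_last, a):
    """One-step-ahead forecast Y[j+1] = Y[j] + a*X[j]."""
    return Y_last + a * X_last

# Illustrative values only: last price 1.3050, last increment 0.0012, identified a = 0.3.
print(forecast_next(Y_last=1.3050, X_last=0.0012, a=0.3))
```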

That's it.

Yura, now I have a request for you. Implement in Mathcad the algorithm shown in the figure below and show the resulting autocorrelation function (ACF) as a function of TF for EURUSD minute data for some year.
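For reference, the sample ACF computation itself looks like this; the request above is for Mathcad, so the following is only a Python sketch, and the dummy data stands in for the first differences of EURUSD minute closes:

```python
import numpy as np

def sample_acf(X, max_lag):
    """Sample autocorrelation function for lags 1..max_lag."""
    X = np.asarray(X, dtype=float) - np.mean(X)
    denom = float(np.dot(X, X))
    return np.array([np.dot(X[:-k], X[k:]) / denom for k in range(1, max_lag + 1)])

# Dummy stand-in for the first differences of EURUSD minute closes for the chosen year.
X = np.random.default_rng(4).normal(size=10_000)
print(sample_acf(X, max_lag=5))
```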

 
Once you have read about it in competent sources (as opposed to my narrative), maybe my "practices" will be useful:

(1) I chose a channel as the state of the system, rather than specific price values or the difference between prices.
(2) I took the probabilities of certain channel characteristics to build a transition probability matrix.
(3) I took a change of channel as one step of the matrix.
(4) As the process, I chose "intuitively" the birth-and-death process; we can't use a queueing process for our purposes, can we?

And I've already demonstrated the results of its use. :о)


Everything is clear here except point (2). Probably it's considered simple and banal, or maybe it's even know-how.
On point (4) (I've already pestered solandr with this question): was the "birth-and-death process" identified through statistical treatment of point (3), or from some general theoretical considerations?
 
Yurixx 22.01.07 16:24
Well, maybe, between us housewives, you could tell me what it is? It doesn't have to be about the Markov process; you can just talk about Markov chains.
You can even talk about chains.

The easiest way is to use an example. The simplest Markov process is an ordinary coin. Which side a coin falls on is independent of the previous state. A process like a coin is said to have the Markov property, that is, it has no memory of the past. A series of coin tosses would be called a Markov chain. More precisely, not the tosses themselves, but the probabilities. There are more complicated Markov processes, many different kinds. There are some that "remember" the previous state but don't remember the one before it, and so on. Well, in general, this is a simple story. The mathematics there is, in places, quite confusing and non-obvious, and the formulas are huge.
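A small sketch of the coin example (my own illustration, not from the post): written as a two-state chain, both rows of its transition matrix are identical, so the next toss does not depend on the current one; the Markov property holds in a degenerate, memoryless way.

```python
import numpy as np

# A fair coin as a two-state chain (0 = heads, 1 = tails); both rows are identical.
P_coin = np.array([[0.5, 0.5],
                   [0.5, 0.5]])

rng = np.random.default_rng(5)
state, tosses = 0, []
for _ in range(10):
    state = int(rng.choice(2, p=P_coin[state]))  # the current state carries no information
    tosses.append(state)
print(tosses)
```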
 
Yurixx 22.01.07 16:24
Well, maybe, between us housewives, you could tell me what it is?
It doesn't have to be about the MARKOV PROCESS; you can just talk about Markov chains.
You can even talk about chains.

The easiest thing to do is to use an example. The simplest Markov process is an ordinary coin. Which way a coin falls is independent of the previous state. A process like a coin is said to have the Markov property, that is, it doesn't remember the past.


As far as I remember the definition of a Markov process from the previous post (A(i)[s+1] depends only on A[s]), flipping a coin cannot be a Markov process, since the probability of heads on each flip does not depend on any previous trial.
 
To Neutron

Our task is to choose a model suitable for describing the behaviour of "random" residuals X[j] of the investigated time series Y[j], obtained after eliminating from the original time series its non-random component (if any).


Sergey, I'm counting on your patience. Explain to me (it's quite possible I've missed something) why we need a model to describe the random residuals at all, and what this "elimination" is. And it seems to me that the "eliminated" random residuals are random by their very nature. What a muddle. :о)

To Rosh


Everything is clear here except point (2). Probably it's considered simple and banal, or maybe it's even know-how.


It's quite simple here. I had to somehow define the state of the system in order to make predictions. For a long time I messed around with quite understandable parameters: the RMS deviation, the length of the channel, the slope angle of the linear-regression line. But in the course of the experiment I came to the conclusion that certain channel parameters gave better results.

And I came to these characteristics from the following:


On point (4) (I already pestered solandr with this question): was the "birth-and-death process" identified through statistical processing of point (3), or from some general theoretical considerations?


OK, I'll be honest with you. My first thought was exactly that: take the history, find the channels, compute the statistics. I eventually abandoned this approach. As I wrote before, I named my method evolutionary fractal wave analysis (well, I named it that and I like it). It is built on the "evolutionary" part - MSP reworked "for channels". So I investigated the dynamics of certain characteristics of the channels. The channel itself is not defined in the usual way. In this post, "grasn 18.01.07 16:11", there is a picture that shows the strength of the connection between samples. The channel runs from the current sample back to the weakest value of this connection. As soon as you find a weak sample, it means you have found the origin of the channel. I move the "cursor" to that point and start monitoring, as Northwind says, the quality of the process.

The dynamics of certain characteristics inside the channel is the process of the birth and death of the channel (at least in my case it is).
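For readers who have not met the term, here is a generic birth-and-death transition matrix as a sketch (the parameters are illustrative and not grasn's actual values): from state i the chain can only move one step down ("death"), stay, or move one step up ("birth").

```python
import numpy as np

def birth_death_matrix(n_states, p_birth, p_death):
    """Tridiagonal transition matrix of a birth-and-death chain."""
    P = np.zeros((n_states, n_states))
    for i in range(n_states):
        up = p_birth if i < n_states - 1 else 0.0
        down = p_death if i > 0 else 0.0
        P[i, i] = 1.0 - up - down   # stay
        if i > 0:
            P[i, i - 1] = down      # "death": one step down
        if i < n_states - 1:
            P[i, i + 1] = up        # "birth": one step up
    return P

print(birth_death_matrix(4, p_birth=0.3, p_death=0.2))
```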
 
Once upon a time, many pages ago, I argued with the founder of the thread about Elliott's theory and he refused to articulate its essence in a nutshell, citing the thickness of the books.

Now, thanks to Neutron, grasn and Northwind, it is clearly demonstrated how it is done.

Although my age no longer allows me to attend school, I am very grateful for your desire to teach me some wisdom, and the homework you have set me, Sergey, I will certainly do.

I promise and vow solemnly. :-)
 
Rosh 22.01.07 19:33
As far as I remember the definition of a Markov process from the previous post (A(i)[s+1] depends only on A[s]), flipping a coin cannot be a Markov process, since the probability of heads on each flip does not depend on any previous trial.

I would like to discuss this point in more detail, but unfortunately there is absolutely no time. I will only say that E. S. Wentzel states the same in her textbook: the coin is a Markov process, and there is even a proof. By the way, her definition of a Markov process (a process without after-effect) is: a process such that, for each moment, the probability of any state of the system in the future depends only on the state of the system at the present moment and does not depend on how the system arrived at that state.
 
Rosh 22.01.07 19:33
As far as I remember the definition of a Markov process from the previous post (A(i)[s+1] depends only on A[s]), flipping a coin cannot be a Markov process, since the probability of heads on each flip does not depend on any previous trial.

I would like to discuss this point in more detail, but unfortunately there is absolutely no time. I will only say that E. S. Wentzel states the same in her textbook: the coin is a Markov process, and there is even a proof. By the way, her definition of a Markov process (a process without after-effect) is: a process such that, for each moment, the probability of any state of the system in the future depends only on the state of the system at the present moment and does not depend on how the system arrived at that state.


Yes, where there are women, there is always confusion. Just kidding. :о) Let's take a simple example from the textbook edited by Maximov: a player plays a game consisting of a series of rounds. The probability of winning the next round equals p if the previous round was won, and p1 if it was lost. State E1: the round is won; state E2: the round is lost.

Wandering through states E1 and E2 is described by a transition probability matrix:
|(p) (1-p)|
|(p1) (1-p1)|
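A sketch of what this matrix gives in the long run (the values of p and p1 below are illustrative, not from the textbook):

```python
import numpy as np

p, p1 = 0.6, 0.4                      # illustrative values
P = np.array([[p,  1 - p],            # row E1: previous round was won
              [p1, 1 - p1]])          # row E2: previous round was lost

# Long-run (stationary) share of wins: solve pi = pi @ P together with sum(pi) = 1.
A = np.vstack([P.T - np.eye(2), np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi = np.linalg.lstsq(A, b, rcond=None)[0]
print(pi)   # [long-run P(win), long-run P(lose)]; for these numbers, [0.5, 0.5]
```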
 
There you go, that's another matter :) You can even give a rationale for why winning after a loss has a different probability than winning after a win.
"Men don't cry, men get upset" :)