a trading strategy based on Elliott Wave Theory - page 266

 
to Candid

I don't know about reality, but one hopes the tick is the result of averaging a certain number of real trades. In that sense there should, of course, be a fractional-pip component to it. But if it can do more than catch those very pips, I will be surprised. Though of course I haven't seen the figures, and anything can happen.
Could you give more details about the type of the stochastic term? At least the over-a-beer version :)


In this model a normally distributed random variable (NDRV) raised to the fifth power was used as the stochastic term. The fifth power made it possible to reproduce well both the "fat tails" of the PDF at a step of 1 tick and the relaxation of the PDF towards normal as the discretization step increases, see the figure. The amplitude of the distribution was taken as unity, its width at the 1/e level as 0.3 points.
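For concreteness, here is a minimal sketch of that stochastic term (my own illustration, not the poster's code): a standard normal variable raised to the fifth power has very fat tails at a 1-tick step, and summing it over larger discretization steps shows the distribution relaxing towards normal.

import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(0)
eps = rng.standard_normal(1_000_000) ** 5          # the fat-tailed stochastic term

for step in (1, 10, 100, 1000):
    # sum the 1-tick increments over 'step' ticks, i.e. a larger discretization step
    agg = eps[: len(eps) // step * step].reshape(-1, step).sum(axis=1)
    print(f"step={step:5d}   excess kurtosis={kurtosis(agg):9.2f}")
# the excess kurtosis falls towards 0 (the Gaussian value) as the step grows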



By the way, the non-monotonicity of the PDF in the region of small perturbation amplitudes and at large discretization steps is due to a noticeable negative correlation between neighbouring samples in the real series. It can be seen that the model autoregressive series reproduces this feature well.
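A quick way to check the sign of that correlation, plus a toy autoregressive model of the increments (my own sketch; the AR(1) coefficient -0.4 is an arbitrary illustrative value, not an estimate from real ticks):

import numpy as np

def lag1_autocorr(x):
    # sample autocorrelation of a series at lag 1
    d = np.asarray(x, dtype=float)
    d = d - d.mean()
    return np.dot(d[:-1], d[1:]) / np.dot(d, d)

# model increments: AR(1) with a negative coefficient driven by the fat-tailed term
rng = np.random.default_rng(1)
phi, n = -0.4, 100_000
noise = rng.standard_normal(n) ** 5
inc = np.empty(n)
inc[0] = noise[0]
for t in range(1, n):
    inc[t] = phi * inc[t - 1] + noise[t]

print(lag1_autocorr(inc))        # comes out close to phi, i.e. clearly negative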
So, in answer to the question, I can say, with the confidence that only another glass of glorious beer can give... so where was I?... Oh, beer! Now...
 
In this model a normally distributed random variable raised to the fifth power was used as the stochastic term. The fifth power made it possible to reproduce well both the "fat tails" of the PDF at a step of 1 tick and the relaxation of the PDF towards normal as the discretization step increases.

Yes, the coincidence is so good that one is tempted to wonder if there is something fundamental behind it.
By the way, Peters mentions that 3 logarithmically equally spaced frequencies are enough to model 1/f in prices; the rest drown in noise. Intuitively it seems this must have something to do with the required value of p, although ticks may have laws of their own.
 
to cooper123
Rounding actually produces white noise in the quotes of about plus or minus seven points. I posted the calculation results here:
http://forum.fxclub.org/showthread.php?p=717988&posted=1#post717988

Read the whole thread with pleasure. Digested it. I didn't come across any mention of the 7 points of white noise...
Could you, in the meantime, comment on the following in more detail:

I used the following transformation to extract the currency prices. Denote by v_i the price of the currency with index i, and take v_0, the price of one chosen currency, as the price of the world currency. For technical reasons it is best to take the dollar as the world currency; there is simply more information available on it.

1) The transformation starts by choosing an initial moment. At that moment the price of the world currency is taken as one, so the prices of the other currencies are equal to the exchange rates.
2) The price of the world currency at any other moment is calculated as

v_0(t_n) = (1/N) * sum over i of [ v_i(t_{n-1}) / (v_i/v_0)(t_n) ]
v_i(t_n) = v_0(t_n) * (v_i/v_0)(t_n)

where N is the number of currencies and t_n is time moment n.
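If I have read these formulas correctly, the transformation can be coded roughly as follows (a sketch of my own; the input array cross holding the quotes (v_i/v_0)(t_n) is an assumed format, not anything from the original posts):

import numpy as np

def world_currency(cross):
    # cross: array of shape (T, N) with the quotes (v_i/v_0)(t_n) for N currencies;
    # returns v_0(t_n) and v_i(t_n) for every moment t_n
    T = cross.shape[0]
    v0 = np.empty(T)
    vi = np.empty_like(cross)
    v0[0] = 1.0                   # the world-currency price is taken as one at t_0
    vi[0] = cross[0]              # so the currency prices start equal to the rates
    for n in range(1, T):
        v0[n] = np.mean(vi[n - 1] / cross[n])   # (1/N) * sum_i v_i(t_{n-1}) / (v_i/v_0)(t_n)
        vi[n] = v0[n] * cross[n]                # v_i(t_n) = v_0(t_n) * (v_i/v_0)(t_n)
    return v0, vi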
 
The 1/f topic does not seem to be catching on, so I won't bother people with it any more :). White noise seems to be closer to current interests, so here is one more picture:

It shows the natural logarithm of the squared spectral density, averaged over 871 512 bars of the EURUSD minute chart, plotted against the natural logarithm of the frequency number. We can see a very smooth 1/f, and only for frequencies with numbers above 150 does something resembling white noise begin to dominate. In other words, anything longer than 3 minutes is not white noise.
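For reference, this is roughly how such an averaged spectrum can be computed (my own sketch; the window length of 4096 bars and the windowing itself are assumptions, not necessarily what was done for the plot):

import numpy as np

def mean_log_spectrum(series, window=4096):
    # average the squared FFT amplitudes over non-overlapping windows and
    # return log(frequency number) and log(mean squared amplitude)
    x = np.asarray(series, dtype=float)
    n_win = len(x) // window
    segs = x[: n_win * window].reshape(n_win, window)
    segs = segs - segs.mean(axis=1, keepdims=True)      # remove each window's mean
    spec = (np.abs(np.fft.rfft(segs, axis=1)) ** 2).mean(axis=0)[1:]
    k = np.arange(1, len(spec) + 1)                     # frequency number
    return np.log(k), np.log(spec)

# a slope near -1 from np.polyfit(log_k, log_s, 1) over the low frequencies
# would correspond to the smooth 1/f described above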
 
Hi Candid.
I have been doing spectral analysis of the time series of Forex instruments for quite a long time. The upshot is that every method of spectral analysis requires stationarity of the periodic processes as a necessary condition, and unfortunately we do not have it! That is why the spectrum is smooth, and even where regions of non-monotonicity can be distinguished they are not characteristic :-( The only thing worth noting is the kink in the slope of the straight-line spectral density on the double logarithmic scale in the region of small TF, which by its nature reflects the visible antipersistence of the series. But other methods have been developed for analysing and exploiting this effect, such as analysis of the correlogram of the series.
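For completeness, a correlogram in the sense used here can be estimated along these lines (a minimal sketch of my own, not Neutron's actual tool):

import numpy as np

def correlogram(series, max_lag=20):
    # sample autocorrelation of the first differences at lags 1..max_lag
    d = np.diff(np.asarray(series, dtype=float))
    d = d - d.mean()
    denom = np.dot(d, d)
    return np.array([np.dot(d[:-k], d[k:]) / denom for k in range(1, max_lag + 1)])

# noticeably negative values at the first lags are the usual signature of the
# antipersistence mentioned above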
 
Hi Candid.
I have been doing spectral analysis of the time series of instruments traded on the Forex market for quite a long time. The upshot is that every method of spectral analysis requires stationarity of the periodic processes as a necessary condition, and unfortunately we do not have it!

Hello, Neutron.
The phrase "sum of a large number of parallel relaxation processes" continues to fascinate me. It corresponds to my understanding of the mechanism of market functioning and may be the key to the model. In spectra I am now looking for clues to estimate at least some parameters. For example, if the deviation from 1/f at marginal frequencies (the graph from the previous post) is not the result of a drop in accuracy of the calculations, it's like a definite point of reference. One point is clearly not enough though.
Stumbled across another thing. I'm working with about three years of data. If you do averaging over the years you get smooth graphs for spectral density like the one above. But by averaging over three years (4191 samples), I get a picture like this:

So increasing the statistics does not make the graph smoother; the opposite happens. To my mind this picture suggests beats, which are known to result from the superposition of processes with close parameters. A possible explanation is that the market parameters have drifted over the three years. The question arises: is the evidence in favour of parameter drift a sign that the parameters do exist?
 
The question arises: is the evidence in favour of parameter drift a sign that the parameters do exist?


If you plan to defend a Ph.D. thesis on the results of this study and the proposed scope of work potentially justifies it, the game is worth the candle; but if the result is supposed to be an arbitrage strategy, the field has long been trampled by a mad crowd of starving baboons (may the Great Elliott forgive me). I could be wrong, though.

Below is a plot of a daily time series, not IBM's, but the ratio of two offset sequences generated by one RNG:



Frankly speaking, I did not expect to see such a picture. Considering that the RNG in this case is a normally distributed random variable (NDRV), I want to say: "People, go long on all CFD instruments, it is a guarantee of wealth!" Indeed, the downward movement is "laboured", while the upward one is accelerated. One day you are sure to get lucky. If only it were not for the negative swaps on long positions...
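As far as I understand the construction, it is something like the following (a sketch; the offset of 100 keeping both walks positive is my assumption, the original offset is not stated):

import numpy as np

rng = np.random.default_rng(2)
n, offset = 10_000, 100.0

# two random walks driven by the same normally distributed RNG, shifted away from zero
a = offset + np.cumsum(rng.standard_normal(n))
b = offset + np.cumsum(rng.standard_normal(n))

ratio = a / b    # the plotted series: it creeps down but spikes up whenever b dips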

A question for the knowledgeable people.
Suppose we have a system of n equations like
rnd1(t)/x(t)=f1(t)
...
...
rndn(t)/x(t)=fn(t)

where f1(t)...fn(t) are known, rnd1(t)...rndn(t) are sequences obtained by integrating a normally distributed random variable, and x(t) is to be found. Clearly, in this form the problem is ill-posed, since we have n+1 unknowns for n equations. But intuitively I feel there is a method of approximate solution based on exploiting the limiting properties of random numbers, and that as n tends to infinity the solution will tend to the exact one. So it seems to me, anyway. Maybe someone knows something on the subject?
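Not an answer, just one possible reading of the "limiting properties" idea as a Monte Carlo sketch of my own (everything below is an assumption, in particular that the rnd_i(t) are independent random walks with unit-variance steps): then (1/n) * sum_i rnd_i(t)^2 tends to t, and |x(t)| can be read off the known f_i(t).

import numpy as np

rng = np.random.default_rng(3)
n, T = 10_000, 500                                      # n equations, T time steps

rnd = np.cumsum(rng.standard_normal((n, T)), axis=1)    # integrated NDRV sequences
x_true = 1.0 + 0.5 * np.sin(np.linspace(0.0, 6.0, T))   # hidden x(t), kept only to check
f = rnd / x_true                                        # the known right-hand sides f_i(t)

# since rnd_i(t) = x(t) * f_i(t) and (1/n) * sum_i rnd_i(t)^2 -> t for unit steps,
# (1/n) * sum_i f_i(t)^2 -> t / x(t)^2, which yields an estimate of |x(t)|
t = np.arange(1, T + 1)
x_est = np.sqrt(t / np.mean(f ** 2, axis=0))

print(np.max(np.abs(x_est - x_true)))                   # the error shrinks as n grows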
 
Read the whole thread with pleasure. Digested it. I didn't come across any mention of the 7 points of white noise...
Could you, in the meantime, comment on the following in more detail:

Unfortunately I did not save the links, but there were posts where someone made an indicator like
(eurusd - eurjpy*jpyusd) (you have to look carefully at what is multiplied by what and what is divided by what) and watched it in real time.
There are several threads on the internet devoted to this question, on the spider, on Onyx and the like.

I have myself looked into the question of whether the currency system is a metasystem, and found nothing in that regard.
What is odd is this constant white noise of 14 points (plus or minus seven), which does not depend on the number of pairs it is calculated for. That is, if we have several random variables each carrying white noise, multiplying them should increase the total white noise; but here it appears constant, which, it seems to me, makes it artificial. That is, if it is exchange rates that take part in the exchanges, then it is precisely the exchange rates that should be the basic elements determined by the market, and the ratio exchange_rate(eur_dollar) does not have to equal eur/dollar. Apparently such currency prices do not exist at all; rather, the system monitors these ratios, destroying the obvious arbitrage, or that arbitrage is played out between banks thanks to their fast reaction and never reaches an ordinary speculator because of his longer reaction time.
In other words, the system tracks the value of eurusd*eurjpy*jpyusd (there is no sense in tracking higher-order products), but since the quotes are rounded to a tick, the system may react only to deviations of 2 pips, and the product of three values adds up three such units, i.e. 6 pips, which is roughly seven pips. So rounding exchange rates to tick precision causes not an inaccuracy of +/- one pip, as it might seem at first, but noise of +/- 7 pips.
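The rounding part of this argument is easy to put numbers on (a sketch of my own; the rate levels and pip sizes are illustrative assumptions, not the data behind the 7-point figure):

import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# exactly consistent rates, so the triangle EURUSD = EURJPY / USDJPY holds precisely
eurusd = 1.30 + 0.01 * rng.standard_normal(n)
usdjpy = 110.0 + 1.0 * rng.standard_normal(n)
eurjpy = eurusd * usdjpy

# round each quote to its own pip (4 decimals for EURUSD, 2 for the yen pairs)
r_eurusd = np.round(eurusd, 4)
r_usdjpy = np.round(usdjpy, 2)
r_eurjpy = np.round(eurjpy, 2)

# mismatch, in EURUSD pips, between the quoted rate and the rate implied by the cross
mismatch = (r_eurjpy / r_usdjpy - r_eurusd) / 0.0001
print(mismatch.std(), np.abs(mismatch).max())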

What is not clear in the conversion? We fix the value of a currency at some moment in time, obtain the quotes for the next moment, and then calculate the average value of the world currency obtained by converting the value of that currency through the quote for the next moment.
That is, at that moment EUR gives a world-currency value of EUR/EURUSD, and by averaging over the currencies we get the value of USD as the world currency. In other words, I believe there is only one world currency, and it corresponds to the joint movement of the currencies.

There are actually many variations. On Onyx, Semen Semenych (cluster indicators) did conversions treating (in my terms) every currency as a world currency, with Ma200 as the base and the changes calculated from Ma5, i.e. something like

delta eur = [Ma5(eur) - Ma200(eur)] + [Ma(eur) - Ma200(eur)] + ... and so on over the currencies

He wrote that this transformation corresponds to his understanding of the market - he has the right :) - but his transformation does not preserve the exchange rates as ratios of currency prices, whereas in my transformation the exchange rates are equal to those ratios.

Which, from general considerations, sort of should not be the case, yet it is a fact in the real system. It would be interesting to hear what anyone thinks about this.
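For concreteness, one possible reading of that cluster-style construction (my own sketch; which pairs go into each currency's sum, and how inverse quotes should be handled, is not specified above, so treat this as an assumption):

import numpy as np

def sma(x, period):
    # simple moving average; the last element is the current MA value
    kernel = np.ones(period) / period
    return np.convolve(np.asarray(x, dtype=float), kernel, mode="valid")

def cluster_delta(pair_series, fast=5, slow=200):
    # sum of (Ma5 - Ma200) over all pairs attributed to one currency,
    # e.g. pair_series = [eurusd, eurjpy, eurgbp] for the EUR delta
    total = 0.0
    for series in pair_series:
        total += sma(series, fast)[-1] - sma(series, slow)[-1]
    return total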
 
The field here has long been trampled by a mad crowd of starving baboons (may the Great Elliott forgive me).

:) That is probably true. But some rakes cannot be walked around.
Off to work.
Sisyphus :)
 
On Onyx, Semen Semenych (cluster indicators) did conversions treating (by my definition) every currency as a world currency, with ma200 as the base and the changes calculated from ma5, i.e. something like

EUR delta = [ma5(EURUSD) - ma200(EURUSD)] + [ma(EURUSD) - ma200(EURUSD)] + ... and so on over the currencies

He himself wrote that this transformation corresponds to his sense of the market - he has a point :) - but his transformation does not preserve the exchange rate as a ratio of currency prices, while in my transformation the exchange rate equals the ratio of the currencies.

Which, from general considerations, sort of should not be the case, yet it is a fact in the real system. It would be interesting to hear what anyone thinks about this.

Just in case, here are some links for those who want to read Semen Semenych's work:
"MQL4: Theoretical Foundations of Building Cluster Indicators for the FOREX Market
http://onix-trade.net/forum/index.php?showtopic=107