Volumes, volatility and the Hurst index - page 32

 

I would like to hear comments on the Flowing Patterns Method.

I was hoping to get the opinions of Farnsworth, Candid, Yurixx and/or Avals.

 
joo:

I would like to hear comments on the Flowing Patterns Method.

I was hoping to get the opinions of Farnsworth, Candid, Yurixx and/or Avals.

Of course, but a little later. I need to finish what I set out to do.
 
So, a first approximation of the problem statement, very briefly.

The model under study:
M{ |x(t+delta) - x(t)|^2 } ~ |delta|^(2H(t))
H(t) - the power exponent, assumed to be time-dependent
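As a rough sketch of how a time-dependent exponent in this model might be estimated in practice (the window length, the lag set and the function name below are my own illustrative choices, not part of the model above), one can regress the log of the empirical second moment of the increments on the log of the lag inside a sliding window:

```python
import numpy as np

def local_hurst(x, window=512, lags=(1, 2, 4, 8, 16)):
    """Rolling estimate of H(t) from M{|x(t+d)-x(t)|^2} ~ |d|^(2H(t)).

    Inside each window, regress the log of the mean squared increment on
    log(d); the slope of the fit is 2*H for that window.
    """
    x = np.asarray(x, dtype=float)
    lags = np.asarray(lags)
    h = np.full(len(x), np.nan)
    for end in range(window, len(x) + 1):
        seg = x[end - window:end]
        m2 = [np.mean((seg[d:] - seg[:-d]) ** 2) for d in lags]
        slope, _ = np.polyfit(np.log(lags), np.log(m2), 1)
        h[end - 1] = slope / 2.0
    return h

# Sanity check on a plain random walk: the estimate should hover around 0.5.
rng = np.random.default_rng(0)
walk = np.cumsum(rng.standard_normal(5000))
print(round(float(np.nanmean(local_hurst(walk))), 3))
```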

Plan to investigate:
  • The possibility of obtaining Hss and Hsssi (self-similar, and self-similar with stationary increments) series from the quote process
  • Investigating the stationarity-correlation-self-similarity relationship for these processes

System: the picture shows the sequence of steps in the study and, at the same time, a first look at the system:


A0. "Making the process stationary": Obtaining a family of stationary series, with characteristics close to the following:
  • Normal distribution
  • ACF properties preserved when shifting
(possibly with loss of information)
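A minimal sketch of what step A0 could look like in practice: log-increments as the stationarizing transform and a crude ACF check on shifted halves of the series. The transform choice and all names here are mine; the post does not fix a specific procedure.

```python
import numpy as np

def to_increments(price, lag=1):
    """One simple way to 'stationarize' a quote series: log-increments.
    Only an illustration of step A0; the post does not fix a transform."""
    logp = np.log(np.asarray(price, dtype=float))
    return logp[lag:] - logp[:-lag]

def sample_acf(x, max_lag=20):
    """Biased sample autocorrelation function up to max_lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = np.dot(x, x)
    return np.array([np.dot(x[k:], x[:len(x) - k]) / denom
                     for k in range(max_lag + 1)])

# Crude check of the "ACF preserved under shifts" property on two halves
# of a simulated log-price series (synthetic data, illustrative only).
rng = np.random.default_rng(1)
price = np.exp(np.cumsum(0.001 * rng.standard_normal(4000)))
r = to_increments(price)
print(np.round(sample_acf(r[:2000], 5), 3))
print(np.round(sample_acf(r[2000:], 5), 3))
```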

A1. "Family self-similarity estimation" Selection of series close to Hss and Hsssi processes. Estimate a singular spectrum, obtain a generalized Hurst index

B1. "Estimating probability density functions". This is something like https://forum.mql4.com/ru/6100/page31, I am attaching a picture so it is more obvious (because it is a long story):



  • Surface - the theoretical probability density of the system state for the next 100 samples.
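An empirical stand-in for such a surface can be built by histogramming the increments x(t+h)-x(t) for each horizon h = 1..100. The sketch below does only that; it is not the theoretical density described in the post, and the bin count and sample length are arbitrary.

```python
import numpy as np

def future_density_surface(x, horizons=100, bins=51):
    """Empirical stand-in for the surface described above: for each horizon
    h = 1..horizons, the histogram density of the increment x(t+h) - x(t).
    Rows are horizons, columns are state bins; the array can be plotted
    as a surface."""
    x = np.asarray(x, dtype=float)
    spread = np.max(x) - np.min(x)               # bound for all increments
    edges = np.linspace(-spread, spread, bins + 1)
    surface = np.empty((horizons, bins))
    for h in range(1, horizons + 1):
        diffs = x[h:] - x[:-h]
        surface[h - 1], _ = np.histogram(diffs, bins=edges, density=True)
    return surface, edges

# Toy usage on a simulated walk: the density widens as the horizon grows.
rng = np.random.default_rng(3)
walk = np.cumsum(rng.standard_normal(20000))
surf, edges = future_density_surface(walk)
print(surf.shape)    # (100, 51): horizon x state-bin density values
```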

A2. "Estimation of possible development": I assume that this field must have something to do with "self-similarity". I.e. the highlighted H-processes must somehow be related to the probability density. Or maybe not.

A3. "Searching for most probable structures" - having singled out most probable H-processes, we may switch to singling out these very structures - patterns (if they exist).

A4. "Estimation of Dynamic Characteristics": There is a history as evolution of the system by each component of the process. In evolution, there is a past for which there is a future, and hence it is possible to estimate the transfer function. And here, as an option, Kalman filter or Bayes filter can help as well. And as the result to get a probabilistic estimation of phase states and model parameters (if they are parametric)


PS: colleagues, this is a first approximation; if something is not clear, there is not much point in asking - I do not understand it myself yet. :о)
 
HideYourRichess:

At least because you have to trade on the "minutes" differently than on the "days". They are completely different things.

If you look at it from a fundamental point of view, the global processes on the market and the "high frequency" processes are different, and different groups of capital are involved. That is why the only argument for self-similarity - the similarity of charts on different timeframes - seems to be useless. That's it, in a nutshell.

The attempt to judge self-similarity by the coincidence or repetition of candlestick patterns is, imho, a substantial oversimplification, not justified by anything. An even greater simplification, from my point of view, is judging it by trade results. Chart similarity is, one might say, an attempt to explain the self-similarity of the market to beginners who have never heard of fractals.

Self-similarity consists, first of all, in the structural similarity of the various levels of a phenomenon - the levels that make up the fractal structure. However - and this is the basic mistake of many - similarity does not imply sameness. Similarity is not equality. Therefore, different processes can develop at each fractal level. Don't you know that trends at different levels (roughly speaking, at different TFs) can point in different directions? Or that a trend at one level can coincide with a flat at another?

HideYourRichess:

Besides, if we look at the statistics according to Pastukhov, we can see that H-volatility changes as H increases. Even if it is not very pronounced, the tendencies are visible.

Based on what I said just above, a difference in H-volatility between levels is quite normal and reflects the different processes occurring at those levels. Only for a pure, perfectly stationary SB (random walk) should there be one value of H-volatility at all levels. That, by the way, is the difference between H-volatility and Hurst: H-volatility can be, and very easily is, measured locally, whereas Hurst is a global characteristic of the process - not by virtue of any merit, but simply by construction: its definition and measurement procedure do not allow local values to be obtained, and hence it cannot be measured at different levels. Whoever manages to localise it, or comes up with another, more practical characteristic, will be able to do so and will see that for non-stationary processes with memory it differs from level to level.

The self-similarity of a quote series does not mean that H-volatility (or whatever else) is always the same, but that its definition, calculation methodology and meaning are the same at all levels. The difference in the quantitative measure is simply a consequence of the state of each level.
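To make the "can be measured locally" point concrete, here is a rough sketch of an H-volatility measurement via a simplified kagi-style construction. This is my own simplified reading of a Pastukhov-type procedure, not his exact algorithm, and the thresholds are arbitrary; for a pure random walk the value should sit near 2 at every threshold, which is the benchmark the post refers to.

```python
import numpy as np

def kagi_vertices(x, h):
    """Simplified kagi construction: record a vertex each time the series
    reverses by more than h from the running extremum of the current move."""
    x = np.asarray(x, dtype=float)
    verts = []
    hi = lo = x[0]
    direction = 0                        # 0 undecided, +1 up move, -1 down move
    for p in x[1:]:
        hi, lo = max(hi, p), min(lo, p)
        if direction <= 0 and p - lo > h:
            verts.append(lo)             # reversal up: the low becomes a vertex
            direction, hi = 1, p
        elif direction >= 0 and hi - p > h:
            verts.append(hi)             # reversal down: the high becomes a vertex
            direction, lo = -1, p
    return np.array(verts)

def h_volatility(x, h):
    """Mean absolute kagi move divided by the threshold h."""
    v = kagi_vertices(x, h)
    return np.mean(np.abs(np.diff(v))) / h

# On a random walk the local measurement gives roughly the same value (~2)
# at every threshold/level, as the post argues it should for a stationary SB.
rng = np.random.default_rng(5)
walk = np.cumsum(rng.standard_normal(100_000))
for thr in (2.0, 5.0, 10.0):
    print(thr, round(float(h_volatility(walk, thr)), 3))
```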

HideYourRichess:

Coming back to Hurst: in various studies on the Internet we can also notice that the log-log plots do not form a strictly straight line, as they should under self-similarity. This is another result that does not favour the fractality theory.

Apparently, you have missed the source of this mess. On pages 5-6 there are several of my posts where I laid out the results of my research into the behaviour of the Hurst exponent for SB. In theory it should equal 0.5; in practice it turns out otherwise. These results are not original: all of this has long been studied by the scientific community and is well known to it. Even Wikipedia gives a definition of Hurst that tells an attentive reader everything - the Hurst characteristic is a limiting (asymptotic) one. Therefore, for small intervals its values differ from what we would like to see. That is also why the procedure for determining it is so cumbersome (how else could we reach the asymptote?), and why its practical application has little effect. The Hurst plots that deviate from a straight line are also given on page 6, along with the interpretation of these results.

But these are all Hurst's problems. If you want a straight line, work with the variance of the increments. What does this have to do with self-similarity? Why cross out a huge phenomenon just because some curve there is not constant? And along with self-similarity you give up the theory of fractals too. Is that adequate?
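For reference, a bare-bones R/S estimate on simulated random-walk increments shows the effect described above: the asymptotic value is 0.5, but on short windows the rescaled range is biased upward, so the fitted slope comes out above 0.5. Window sizes and sample length below are arbitrary choices of mine.

```python
import numpy as np

def rs_hurst(increments, window_sizes):
    """Classical R/S estimate: average the rescaled range R/S over
    non-overlapping windows of each size, then fit log(R/S) against log(n).
    The fitted slope is the Hurst estimate."""
    x = np.asarray(increments, dtype=float)
    rs = []
    for n in window_sizes:
        vals = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            z = np.cumsum(w - w.mean())      # cumulative deviation from the mean
            r = z.max() - z.min()            # range
            s = w.std(ddof=1)                # sample standard deviation
            if s > 0:
                vals.append(r / s)
        rs.append(np.mean(vals))
    slope, _ = np.polyfit(np.log(window_sizes), np.log(rs), 1)
    return slope, rs

# For random-walk increments the asymptotic exponent is 0.5, but the short
# windows pull the fitted slope above 0.5 - the small-interval bias discussed above.
rng = np.random.default_rng(6)
steps = rng.standard_normal(2 ** 16)
sizes = [8, 16, 32, 64, 128, 256, 512, 1024]
h_est, rs_values = rs_hurst(steps, sizes)
print(round(float(h_est), 3))
```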

 
joo:

From all this it follows that Patterns need to be analysed simultaneously on different TFs. This is not the same as the Three Screens Method, which gives only discrete signals. The Method of Flowing Patterns (well, there is finally a name for my method) gives signals that are continuous in time (with the smallest discretization possible on the BP (time series) under study).



The general direction of the idea is not objectionable. But it is a very ambitious programme. It will not be easy to implement, because there is no formal definition of a pattern and, on top of that, identical patterns can consist of a different number of points.

Not using correlation as a measure of pattern similarity could be interesting if an alternative (and efficient) method is proposed. Without it, rejection of correlation could lead to a dead end.
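Purely as an illustration of the "different number of points" difficulty, and not as the method joo has in mind, one non-correlation similarity measure is to resample both segments to a common length, normalise them, and compare shapes pointwise. All names and parameters here are hypothetical.

```python
import numpy as np

def shape_distance(a, b, points=32):
    """One possible non-correlation similarity measure for two 'patterns'
    of different lengths: resample both to a common number of points,
    normalise location and scale, then take the largest pointwise gap."""
    def normalise(x):
        x = np.interp(np.linspace(0, 1, points),
                      np.linspace(0, 1, len(x)),
                      np.asarray(x, dtype=float))
        x = x - x.mean()
        scale = np.max(np.abs(x))
        return x / scale if scale > 0 else x
    return float(np.max(np.abs(normalise(a) - normalise(b))))

# A 'triangle' stays a triangle whether it is drawn with 11 or 51 points.
small = np.concatenate([np.linspace(0, 1, 6), np.linspace(1, 0, 6)[1:]])
large = np.concatenate([np.linspace(0, 3, 26), np.linspace(3, 0, 26)[1:]])
print(round(shape_distance(small, large), 3))   # ~0.0: same shape, different point count
```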

 
joo:

It seems to me that the term "pattern" should be understood in a broader sense. I will try to give my definition of a pattern:

A PATTERN is divided into a "Cause PATTERN" followed by an "Effect PATTERN". Segments of the BP may include different numbers of elementary (indivisible) time segments (bars/ticks) while forming the same Patterns. The shape of one and the same Pattern can vary widely. The closest analogy is geometric figures - polygons: no matter how the sides of a triangle change, it remains a triangle, excluding degenerate cases.

Each TF forms its own characteristic Patterns. This is not self-similarity or fractality. Patterns form all the time and are present in every indivisible segment of the BP.

Somewhat summary, but I have no other definition - only the principles I adhere to. In my opinion, Patterns as I have defined them cannot be investigated by correlation and other statistical methods, and in general it is impossible to derive formulas for characteristic Patterns analytically, because they appear and disappear continuously, flowing into each other; moreover, as I said, on each TF the patterns are different and do not depend on each other. Different combinations of PATTERNs on different TFs give different, moment-specific Effect PATTERNs. It is like a kaleidoscope or a snowflake: the patterns are infinitely many, yet the appearance of "impossible" patterns is excluded. That is, there exists some set distinct from the set of Patterns.

It follows from all this that Patterns need to be analysed simultaneously on different TFs. This is not the same as the Three Screens Method, which gives only discrete signals. The Method of Flowing Patterns (well, there is finally a name for my method) gives signals that are continuous in time (with the smallest discretization possible on the BP under study).


Maybe the leading specialists in this thread will find my considerations useful, or maybe they will point me in some useful direction. I follow the development of collective thought in this thread with interest, but in my opinion Hurst and similar estimation methods are a dead end - though that is just my IMHO.

Somewhat similar thoughts:

It does not really matter whether MathCAD, MQL or C++ is used - it has to be formalised somehow in the end. I have investigated patterns, and I have investigated ZZ in the past/future framework, to no avail: no connections. None at all. Hurst's 0.5 explains everything.

A joke is a joke, but a yogi I know well kept giggling at my war with windmills and demons - and I even won a bet with her. The terms of the bet are statistically unreliable, of course, but still a "fact". She curled up in her lotus (I can't do that yet) and used kinesiology testing to determine entries/exits - a variation called "muscle testing" of the subconscious. To explain in simple terms: a certain arm muscle, put into a trance state, is "calibrated" with a series of questions (addressed not to the brain), like "is your name Vasya?" - the right question/answer produces a reaction. First training/coaching, then testing.

 
Vita:
It seems that the process of coastline length measurement made a strong impression on you :). However, you raised a different question (although somewhat related) - about the R/S analysis procedure - and there we get a new average at each step; this is a new ruler size for a new series length.
 
Farnsworth:
The model under study:
M{ |x(t+delta) - x(t)|^2 } ~ |delta|^(2H(t))
H(t) - the power exponent, assumed to be time-dependent
This is the correct approach. For SB there is an exact equality, and H(t) = 1/2, and this is a theoretically proven result. It is much more logical to generalise it, rather than introducing some kind of scaling for which accuracy is only achieved in the limit.
 
joo:

I would like to hear comments on the Flowing Patterns Method.

I was hoping to get the opinions of Farnsworth, Candid, Yurixx and/or Avals.

I took it as an introduction. It is curious: some things resonate, others fall into a void (in the sense that no associations arise).

But the introduction should be followed by the main text :).

The very division of a pattern into cause and effect is entirely consistent with my views - only in that case do patterns deserve a separate name and separate consideration. Dissociating yourself from similarity, correlation and other "vivisectionist" tools rather suggests an early stage in the development of the idea, when, apart from the feeling that you have clearly grasped something and some very general imagery, there is almost nothing yet.

On the whole, I rather like the new world drawn in broad strokes, but I would like to understand what it has to do with reality.

 
joo:

I would like to hear comments on the Flowing Patterns Method.

I was hoping to get the opinions of Farnsworth, Candid, Yurixx and/or Avals.


Imho, a pattern or a combination of patterns on different timeframes makes sense only in a certain context - the market phase. A pattern is not the cause of a move, but only a probable sign of a transition. The context can be quite different: for example, a sanction as Neo described from the spider, or the economic cycle, like Al Weiss. His methods, by the way, are closer to your thinking about multi-level patterns and their combined analysis:

Although I use technical analysis to make trading decisions, there are a number of important differences between my method and the approaches of most other traders in this group. First, I don't think many technical traders go further back in their research than thirty years, let alone a hundred years or more. Second, I do not always interpret the same stereotypical figure in the same way. I also take into account which part of the long-term economic cycle we are in. This alone can lead to very significant differences between the conclusions I draw from the charts and those reached by traders who do not consider it. Finally, I treat the classic chart figures (head and shoulders, triangle, etc.) not only as independent formations; rather, I try to look for certain combinations of figures - in other words, figures within figures. These more complex multi-figure combinations can give signals for trades with a higher probability of success.

J. Schwager, "The New Market Wizards".

In any case, the causes and consequences lie outside the chart. These are real economic processes - the inflation and deflation of a speculative bubble, for example. A pattern can show the phase change in time and help you adapt to the process.