Market phenomena - page 28

 
Farnsworth:

The market model

...Its essence is very simple. There are a finite number of structures which describe the transformation of input into output. Each such structure implies some kind of model according to which the transformation takes place. The observed process is formed by a transition (switching) between structures...

It seems to me that you are thinking in the right direction. I would also add that prices over a given period of time are represented not simply as transitions between these structures, but as their weighted combinations. The important thing is to find these structures. What are they? Principal vectors? Sines and cosines, as in the Fourier transform? Wavelets? If anyone knows how to correctly identify these structures from a time series, please share your thoughts. There may be many choices here, but only one is correct. I would consider correct those structures (wavelets) which require the least information to describe the price. This comes from my experience in radio engineering. The digital information being transmitted (100110...) is passed through a digital filter / DAC modulator and is thus converted into an analogue signal with more values than the original information. The process of representing market prices as transitions between structures is essentially identical to the process of demodulating a radio signal (or of reducing the dimensionality of a stochastic process). To demodulate this signal correctly, we need to know which filters (structures) it was encoded with.
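The demodulation analogy above can be sketched in a few lines. This is a minimal illustration, not anything from the post: the pulse shape, bit pattern and function names are all made-up assumptions. The point it demonstrates is the one gpwr makes - recovery is trivial once the encoding filter (the "structure") is known.

```python
# Sketch: digital information passed through a known pulse-shaping filter
# can be recovered by correlating against that same filter (matched filter).
# Pulse shape and bits are illustrative assumptions, not from the thread.

def modulate(bits, pulse):
    """Encode each bit as +pulse or -pulse (a toy BPSK-style DAC)."""
    signal = []
    for b in bits:
        sign = 1.0 if b == 1 else -1.0
        signal.extend(sign * p for p in pulse)
    return signal

def demodulate(signal, pulse):
    """Correlate each symbol interval with the known pulse shape."""
    n = len(pulse)
    bits = []
    for i in range(0, len(signal), n):
        corr = sum(s * p for s, p in zip(signal[i:i + n], pulse))
        bits.append(1 if corr > 0 else 0)
    return bits

pulse = [0.3, 0.8, 1.0, 0.8, 0.3]   # the "structure" the signal is built from
bits = [1, 0, 0, 1, 1, 0]
recovered = demodulate(modulate(bits, pulse), pulse)
print(recovered)                    # matches the transmitted bits
```

Knowing `pulse` makes the inverse problem easy; the thread's open question is precisely what to do when it is unknown.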
 
Farnsworth:
Colleagues, I will be away from the forum for an extended period of time.
A pity, of course... (as for counting alphas and omegas for counting's sake - I think these are masking actions meant to hide the trend and trend reversals)
 

gpwr, the problem is that this decoding (or, roughly the same thing, switching between structures) is most likely non-linear.

Linear relationships between events (Pearson correlations) vanish even at small "distances" between events. By distance I mean the number of base timeframe (TF) units, i.e. the number of bars.

So far I have nothing more to add, as I am still groping in the dark myself.
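The decay of linear dependence with lag is easy to measure directly. A minimal sketch, with an assumed synthetic series standing in for real quotes (for independent Gaussian returns the true correlation is zero at every lag, which is roughly what the post reports for small "distances" in bars):

```python
# Sketch: Pearson correlation between a bar's return and the return
# `lag` bars later. The random series is a stand-in for real quotes.

import random

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def lag_correlation(returns, lag):
    """Correlation between each return and the return `lag` bars later."""
    return pearson(returns[:-lag], returns[lag:])

random.seed(1)
returns = [random.gauss(0.0, 1.0) for _ in range(5000)]
for lag in (1, 5, 20):
    print(lag, round(lag_correlation(returns, lag), 3))  # all near zero
```

Running the same function on real returns would show how quickly any linear structure disappears with the number of bars.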

 
gpwr:
...

To correctly demodulate this signal we must know which filters (structures) it was encoded with.

As a radio engineer (a former one, unfortunately) I support your idea of demodulation.

Synchronous detection suggests itself here - the main thing is to determine the reference signal and the type of filtering (most likely non-linear).
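Synchronous (coherent) detection can be sketched very compactly: mix the received signal with a reference carrier, then low-pass filter. This is a toy illustration of the standard technique, not the poster's method; the carrier frequency, amplitude and the averaging used as a "filter" are all assumptions.

```python
# Sketch of synchronous detection: multiply by the reference carrier and
# low-pass filter (here a plain average over whole carrier periods).

import math

def synchronous_detect(signal, reference):
    """Product detector: mix with the reference, then average."""
    mixed = [s * r for s, r in zip(signal, reference)]
    return sum(mixed) / len(mixed)

n, cycles = 1000, 50                 # samples, whole carrier periods
carrier = [math.cos(2 * math.pi * cycles * i / n) for i in range(n)]
message = 0.7                        # amplitude we want to recover
signal = [message * c for c in carrier]

# cos * cos averages to 1/2, so the detector output is message / 2.
print(round(2 * synchronous_detect(signal, carrier), 3))  # → 0.7
```

The hard part the post points at is exactly what this sketch takes for granted: in the market case, the reference signal is unknown.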

 
Mathemat:

...

So far I have nothing more to add, as I am still groping in the dark myself.

My interest in this topic is driven by a more practical application than predicting market prices. I am now more interested in developing fast speech-recognition systems. As we know, speech consists of phonemes (the same kind of structures), sets of which form words. For example, the Russian language has only 43 phonemes, which form 150-200 thousand words. These words form sentences and speech. Speech can be considered an analogue of a market price whose phonemes (structures) we do not know. That is why it looks like noise (imagine the speech of an alien). Speech phonemes are generated by the vocal cords, the tongue and so on - in short, by vocal filters whose input is noise in the form of exhaled air. Our perception of speech is likewise a process of filtering sounds through the filters of the inner ear, which are tuned to different phonemes. Simply put, a coded signal (speech) goes in through the ear, and a decoded signal (words) comes out in the cerebral cortex.

Price prediction then comes down to predicting future phonemes (structures). But that is not what interests me. I am interested in recognizing past and present phonemes (structures). To achieve this, one needs a dictionary of these phonemes and must correlate the speech with the known phonemes (in simplified terms, of course). If we know which language our interlocutor speaks, we can simply take the corresponding phoneme dictionary, decode the speech into text and then translate it with an ordinary dictionary. But what if we do not know the speaker's language? How do we determine the phonemes from the speech itself? Or, for that matter, how do we determine the structures from the price quotes? Note that the number of price structures should be of the same order as the number of speech phonemes (10-100).
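One standard way to build a small "phoneme dictionary" from a series alone is to slice it into fixed-length windows and cluster them, letting the cluster centres play the role of structures. The sketch below is a toy Lloyd's k-means with a naive initialisation; the series, window width and `k` are illustrative assumptions, not a method proposed in the thread.

```python
# Sketch: cluster fixed-width windows of a series so the cluster centres
# act as a small dictionary of recurring "phonemes" / structures.

def windows(series, width):
    """Non-overlapping windows of the given width."""
    return [series[i:i + width] for i in range(0, len(series) - width + 1, width)]

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, iters=20):
    centres = points[:k]                 # toy initialisation: first k windows
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda c: dist2(p, centres[c]))
            groups[j].append(p)
        centres = [[sum(col) / len(g) for col in zip(*g)] if g else centres[j]
                   for j, g in enumerate(groups)]
    return centres

# A series that alternates between two hidden "phonemes".
up, down = [0, 1, 2, 3], [3, 2, 1, 0]
series = (up + down) * 25
dictionary = kmeans(windows(series, 4), k=2)
print(dictionary)   # recovers shapes matching `up` and `down`
```

On real quotes the windows would overlap and need normalisation, and the right `k` (the 10-100 the post estimates) would itself have to be found, but the mechanism is the same.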

 

gpwr:

...

Note that the number of price structures should be of the same order as the number of speech phonemes (10-100).

This, in my opinion, is close to the subject of "market patterns" (a kind of phonemes, as you write) - in particular, defining and recognizing them, say, with a neural network (NN). After that a trading decision is made - either up or down. Something like that.
 
Roman.:
This, in my opinion, is close to the subject of "market patterns" (a kind of phonemes, as you write) - in particular, defining and recognizing them, for example with a neural network. After that a trading decision is made - either up or down. Something like that.


I agree. There are many different terms: phonemes, structures, patterns, wavelets, basis functions. I like the term "basis functions" best. The question that interests me is this: how can one automatically determine the basis functions knowing only a time series? Of course, one can examine the series visually and find triangles, flags and other nice-looking shapes. But no one has yet proved that these patterns are statistically significant and not just a product of the imagination. Remember the old joke:

A psychiatrist shows a patient various pictures, asking "What do you see in them?" The patient answers: "A man and a woman having sex." "You're some kind of lecher," says the doctor. "Well, it was you who showed me those lewd pictures," the patient replies.

Automatically identifying statistically significant basis functions is a complicated problem, and I don't think anyone has figured out how to do it properly, even with neural networks. Of course, we can simplify the task and assume in advance that the series decomposes into Haar wavelets, or into trigonometric functions as in a Fourier series, or into other basis functions often used in regression. And all these basis functions will successfully reproduce our series, whether it is a price series or a speech signal. But imagine decomposing speech into Haar wavelets - they have nothing to do with phonemes. It would be just as meaningless to decompose a price series into Haar wavelets or trigonometric functions. It is appropriate to mention compressive sensing here, the essence of which is to describe a signal with the smallest possible set of basis functions. But although there are many algorithms in this field, they all assume that the basis functions are already known. If you have any ideas about an algorithm for finding the basis functions from a price series, please share them.
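The sparse-decomposition side of compressive sensing that the post mentions can be illustrated with matching pursuit: greedily pick, from a *given* dictionary, the few atoms that best describe the signal. The dictionary (unit-norm cosines) and the test signal below are assumptions for the sake of the sketch; the whole point, as the post says, is that in practice the dictionary itself is unknown.

```python
# Sketch of matching pursuit: greedy sparse decomposition of a signal
# over a known dictionary of unit-norm basis functions ("atoms").

import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def matching_pursuit(signal, dictionary, n_atoms):
    """Pick the n_atoms atoms most correlated with the running residual."""
    residual = list(signal)
    picks = []
    for _ in range(n_atoms):
        j = max(range(len(dictionary)),
                key=lambda k: abs(dot(residual, dictionary[k])))
        coeff = dot(residual, dictionary[j])
        picks.append((j, coeff))
        residual = [r - coeff * a for r, a in zip(residual, dictionary[j])]
    return picks, residual

n = 64
# Dictionary of unit-norm cosines (a Fourier-like basis).
atoms = []
for f in range(1, 9):
    a = [math.cos(2 * math.pi * f * i / n) for i in range(n)]
    norm = math.sqrt(dot(a, a))
    atoms.append([x / norm for x in a])

# Signal built from atoms 2 and 5 only; pursuit should find exactly those.
signal = [3.0 * a + 1.5 * b for a, b in zip(atoms[2], atoms[5])]
picks, residual = matching_pursuit(signal, atoms, 2)
print(sorted(j for j, _ in picks))   # → [2, 5]
```

Swap the cosine dictionary for Haar wavelets and the algorithm still "works" in the same mechanical sense, which is exactly the post's objection: the decomposition is only meaningful if the dictionary matches the true structures.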

 
gpwr:


... In short, if anyone has any thoughts on an algorithm for finding basis functions from a price series, please share.

There is a universal pill - genetic algorithms. At least when nothing (or almost nothing) is known about the process but you still need to investigate it and get a result, GA is the first thing to try.
 
sergeyas:

As a radio technician (unfortunately already in the past) I support your idea of demodulation.

The idea of synchronous detection is suggested - the main thing is to determine the reference signal and the type of filtering (rather non-linear).


Now I like it... Sergey, what are the basic physical principles of radio (telegraphy, etc.)?
 
gpwr:


...

If you have any ideas about the algorithm for finding basis functions from the price series, please share them.

I tried to use formula (18) from [url=https://www.mql5.com/ru/articles/250]"Universal regression model for market price prediction"[/url] as a basis function. It satisfactorily describes dependencies constructed artificially from various functions in all possible combinations: sums, products, logarithms, power functions, exponentials, etc.