Discussing the article: "MQL5 Wizard Techniques you should know (Part 34): Price-Embedding with an Unconventional RBM"


Check out the new article: MQL5 Wizard Techniques you should know (Part 34): Price-Embedding with an Unconventional RBM.

Restricted Boltzmann Machines are a form of neural network developed in the mid-1980s, at a time when compute resources were prohibitively expensive. At their onset, they relied on Gibbs Sampling and Contrastive Divergence to reduce dimensionality or to capture the hidden probabilities/properties of input training data sets. We examine how backpropagation can perform a similar role when the RBM ‘embeds’ prices for a forecasting Multi-Layer Perceptron.

Price-embedding is used in the context of this article as a process very akin to word embedding, which, as some readers may know, is the prerequisite step to the transformer networks of large language models. Word embedding, which can be defined as the conversion of words into numeric vectors, helps, when paired with self-attention, to convert much of the written material available online into a format that neural networks can understand. We similarly take a leaf from this approach by presuming that, by default, security price data (even though numeric) cannot easily be ‘understood’ by neural networks off the bat. Our approach to making it more understandable is a backpropagation-trained RBM.

Now, the conversion of words to numbers is not simply about assigning a number to each word or letter; rather, it is an intricate process that involves self-attention, as already mentioned above. Parallels to this, I believe, can be drawn with RBMs when one considers their bipartite graph design.

While there are no direct neuron-to-neuron connections within a layer of an RBM, these connections, which could be key to capturing the self-attention component of any input data, are made indirectly through the hidden layer. Under this thesis, the hidden layer not only logs what each neuron could be redrawn as, but also the significance of its relationships to the other neurons.
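To make the bipartite design concrete, here is a minimal MQL5 sketch (not the article's listing) of such a forward pass, where the hidden activations serve as the ‘embedding’ of an input price vector. The sizes VISIBLE and HIDDEN, and the names W, Bh, Sigmoid and Embed, are illustrative assumptions:

#define VISIBLE 8   // e.g. a window of recent price changes (assumed size)
#define HIDDEN  4   // embedding size (assumed)

double W[HIDDEN][VISIBLE];  // visible-to-hidden weights; no intra-layer links
double Bh[HIDDEN];          // hidden biases (in practice W would be randomly initialized)

double Sigmoid(const double x) { return(1.0 / (1.0 + MathExp(-x))); }

// Map a visible price vector to its hidden 'embedding'.
// Each hidden node pools every visible node, so relationships between
// visible neurons are captured only through the hidden layer.
void Embed(const double &In[], double &Out[])
{
   ArrayResize(Out, HIDDEN);
   for(int h = 0; h < HIDDEN; h++)
   {
      double z = Bh[h];
      for(int v = 0; v < VISIBLE; v++)
         z += W[h][v] * In[v];
      Out[h] = Sigmoid(z);
   }
}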

As always, as far as traders are concerned, the proof is in the pudding, and so the benefits of this price-embedding can only be proven by trading results. We are going to get to the first part of this process; however, it is worth highlighting that the scale of rewards one gets from word-to-number embedding cannot be compared to those we are looking at in number-to-number embedding, because what we are doing here is not nearly as transformational. With that, let us now consider how we reconstruct an RBM with backpropagation, as sketched below.
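As a hedged preview of that reconstruction, the sketch below reuses the names from the sketch above and trains the same weights as a tiny tied-weight autoencoder: the visible vector is mapped to the hidden layer and back, and plain backpropagation on the squared reconstruction error replaces Gibbs Sampling and Contrastive Divergence. The tied-weight choice and all names are assumptions for illustration, not the article's exact code:

// One backpropagation training step on a single price vector In[].
// Sketch only: visible biases are omitted for brevity.
void TrainStep(const double &In[], const double rate)
{
   double hid[];
   double rec[VISIBLE], d_rec[VISIBLE];
   Embed(In, hid);                          // visible -> hidden
   // hidden -> reconstructed visible (tied weights, used transposed)
   for(int v = 0; v < VISIBLE; v++)
   {
      double z = 0.0;
      for(int h = 0; h < HIDDEN; h++)
         z += W[h][v] * hid[h];
      rec[v]   = Sigmoid(z);
      d_rec[v] = (rec[v] - In[v]) * rec[v] * (1.0 - rec[v]);  // dE/dz at the output
   }
   // backpropagate to the hidden layer, then apply both gradient contributions
   for(int h = 0; h < HIDDEN; h++)
   {
      double grad_h = 0.0;
      for(int v = 0; v < VISIBLE; v++)
         grad_h += d_rec[v] * W[h][v];                        // dE/d hid[h]
      double d_hid = grad_h * hid[h] * (1.0 - hid[h]);        // dE/dz at the hidden layer
      Bh[h] -= rate * d_hid;
      for(int v = 0; v < VISIBLE; v++)
         W[h][v] -= rate * (d_rec[v] * hid[h] + d_hid * In[v]);
   }
}

After training on a window of price vectors, calling Embed() on a fresh vector yields the hidden activations that would then feed the forecasting Multi-Layer Perceptron.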

MQL5 Wizard Techniques you should know (Part 34): Price-Embedding with an Unconventional RBM

Author: Stephen Njuki