Discussing the article: "MQL5 Wizard Techniques you should know (Part 33): Gaussian Process Kernels"

 

Check out the new article: MQL5 Wizard Techniques you should know (Part 33): Gaussian Process Kernels.

Gaussian Process Kernels are covariance functions, rooted in the normal distribution, that can play a role in forecasting. We explore this algorithm in a custom MQL5 signal class to see whether it can serve as a primary entry and exit signal.

Gaussian Process Kernels are covariance functions used in Gaussian processes to measure the relationships among data points, such as those in a time series. These kernels generate matrices that capture the intra-data relationships, allowing the Gaussian process to make projections or forecasts by assuming the data follows a normal distribution. As this series looks to explore new ideas while also examining how they can be exploited, Gaussian Process (GP) kernels serve as our subject in building a custom signal.
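To make the idea of a kernel-generated covariance matrix concrete, here is a minimal MQL5-style sketch (not the article's actual code) that computes the squared exponential (RBF) kernel and fills an n x n K matrix for a window of input points. The parameter names variance and length_scale are illustrative assumptions.

```mql5
//--- Illustrative sketch only: the squared exponential (RBF) kernel.
//--- 'variance' and 'length_scale' are hypothetical hyper-parameter names.
double RBF(double x1, double x2, double variance, double length_scale)
  {
   double d = x1 - x2;
   return(variance * MathExp(-0.5 * (d * d) / (length_scale * length_scale)));
  }

//--- Fill an n x n covariance matrix K (stored row-major in a 1-D array)
//--- from the input points X[]. K[i*n+j] measures how related X[i] and X[j] are.
void BuildK(const double &X[], double &K[], const int n,
            const double variance, const double length_scale)
  {
   ArrayResize(K, n * n);
   for(int i = 0; i < n; i++)
      for(int j = 0; j < n; j++)
         K[i * n + j] = RBF(X[i], X[j], variance, length_scale);
  }
```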

The past five articles in this series have focused heavily on machine learning, so for this one we 'take a break' and look at good old statistics. In developing trading systems the two are quite often married; however, in building this particular custom signal we will not be supplementing it with, or considering, any machine learning algorithms. GP kernels are noteworthy because of their flexibility.

They can model a wide variety of data patterns, ranging from periodicity to trends and even non-linear relationships. More significant than this, though, is that when predicting they do more than provide a single value: they provide an uncertainty estimate that includes the desired value together with an upper-bound and a lower-bound value. These bound ranges are often quoted with a confidence level, which further facilitates a trader's decision-making when presented with a forecast value. These confidence levels can also be insightful and help in better understanding traded securities when comparing forecast bands marked with disparate confidence levels.
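As a rough sketch of how those bounds come about (under the normal-distribution assumption, not the article's exact code): the GP forecast supplies a predictive mean and variance, and the band at a chosen confidence level is the mean plus or minus a z-multiple of the standard deviation. The function and parameter names below are hypothetical.

```mql5
//--- Illustrative sketch: turn a GP's predictive mean and variance into a
//--- forecast band. With z = 1.96 the band covers roughly 95% of outcomes
//--- under a normal distribution. Names are assumptions, not the article's API.
void ForecastBand(const double mean, const double variance, const double z,
                  double &upper, double &lower)
  {
   double sd = MathSqrt(MathMax(variance, 0.0)); // guard against tiny negative variances
   upper = mean + z * sd;
   lower = mean - z * sd;
  }
```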

In addition, they are good at handling noisy data, since a noise term can be added to the constructed K matrix (see the sketch below), they can incorporate prior knowledge, and they are very scalable. There are quite a number of different kernels to choose from. The list includes (but is not limited to): the Squared Exponential (RBF) kernel, Linear kernel, Periodic kernel, Rational Quadratic kernel, Matern kernel, Exponential kernel, Polynomial kernel, White Noise kernel, Dot Product kernel, Spectral Mixture kernel, Constant kernel, Cosine kernel, Neural Network (Arccosine) kernel, and Product & Sum kernels.
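The noise handling mentioned above usually amounts to adding a small noise variance to the diagonal of K (i.e. K + σ²I), which also helps stabilise the matrix inversion used in the forecast. A minimal sketch, with illustrative names:

```mql5
//--- Illustrative sketch: add observation-noise variance to the diagonal of K.
//--- K is the same row-major n x n matrix built earlier; 'noise_variance' is
//--- a hypothetical parameter name.
void AddNoise(double &K[], const int n, const double noise_variance)
  {
   for(int i = 0; i < n; i++)
      K[i * n + i] += noise_variance;
  }
```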

Gaussian Process Kernels

Author: Stephen Njuki