Machine learning in trading: theory, models, practice and algo-trading - page 2952

 
Evgeny Dyuka #:
So ban them already, it's high time.
You run a business here, and it annoys you that people write the wrong things instead of spending their effort on developing ML in the MQL5 environment.
But why you?
I wasn't arguing with you there. Don't take it personally.

I will delete this post (and my post above) later.
 
Evgeny Dyuka #:

If I may, a similar counter-question.
(this is not about your business, but specifically about the topic of ML)

Specifically for ML, the following has been created:

  • Metatrader 5 trading platform
  • MQL5 language
  • matrix maths in MQL5
  • integration of Python into the terminal, including communication library
  • ONNX models integration
  • OpenCL/DirectX for GPU usage
  • cloud network, including tester
  • www.mql5.com ecosystem in 11 languages

All of this is made for the public and is used massively around the world.

Do you want to compare this with a couple of copy-pasted scripts (copying being common among machine learning adepts)?

Be rational, and don't attack those who do the work and release it to the public.

 

I would like to add my two cents and to separate the flies from the cutlets: the cutlets, however good they may be, do not solve the problems of the flies.

In this thread, some participants have a firm understanding that the main problem of financial markets is their non-stationarity, and that the problem of non-stationarity has no definitive solution at the moment. All the talk about the duration of testing or the length of successful trading is empty and has been refuted by practice many times over, ruining even Nobel laureates who refused to acknowledge the non-stationarity problem. The existence of the non-stationarity problem is perfectly confirmed by the signals market on this very site: all signals have died, just some sooner and others much later.

We can distinguish two approaches to solving the problem of non-stationarity of financial markets:

1. Modelling the non-stationarity itself, which is attempted within the framework of GARCH models, of which there are already more than a hundred variants.

2. Searching for patterns in the non-stationary input stream, in the hope that these patterns will repeat in the future. This is what so-called "machine learning" attempts. For example, a RandomForest model finds a minimum of 50 patterns, and 150 patterns exhaust any time period. But the next step can change the set of patterns, and special effort is needed to prepare the input data so that these patterns, if they change, change only slightly.
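As a toy illustration of the second approach (the synthetic data and all parameter choices here are my own, not anything from this thread): fit a RandomForest on lagged returns of a deliberately non-stationary series and count how many distinct leaf combinations ("patterns") the forest actually uses.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic non-stationary "price" series: a random walk whose volatility drifts
prices = np.cumsum(rng.normal(0, 1 + np.linspace(0, 2, 2000)))
returns = np.diff(prices)

# Features: the last 4 returns; target: direction of the next return
lags = 4
X = np.column_stack([returns[i:len(returns) - lags + i] for i in range(lags)])
y = (returns[lags:] > 0).astype(int)

# Train on the first half, test on the (differently distributed) second half
half = len(X) // 2
clf = RandomForestClassifier(n_estimators=50, max_depth=5, random_state=0)
clf.fit(X[:half], y[:half])

# Each distinct combination of leaves a sample lands in is one learned "pattern"
leaves = clf.apply(X[:half])                    # (n_samples, n_trees) leaf ids
n_patterns = len({tuple(row) for row in leaves})
print("distinct leaf patterns: ", n_patterns)
print("in-sample accuracy:     ", clf.score(X[:half], y[:half]))
print("out-of-sample accuracy: ", clf.score(X[half:], y[half:]))
```

On a random walk like this, the in-sample accuracy looks respectable while the out-of-sample accuracy hovers near a coin flip, which is exactly the pattern-drift problem described above.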

Unfortunately, the thread has descended into a discussion of the models themselves, although, in my experience, there is no problem with using the models at all (the caret wrapper alone includes up to 200 models for every taste); the real problem is preparing the input data for those models. Let's not forget the main slogan of statistics: "Garbage in, garbage out".

 
СанСаныч Фоменко #:

For you personally, I am re-attaching a comprehensive text on formulas in a PDF file. This includes "dependencies and sources".

As for the nuances of the calculations, I do not go into them, because I know for certain that the formulas have NOTHING to do with programming; that is an independent problem, solved by other people with different training, in other, scientific circles.

So read the PDF.

Thanks, I'll have a look.

So far I found a direct answer to my question here - https://blog.paperspace.com/gradient-boosting-for-classification/

Gradient Boosting for Classification | Paperspace Blog (blog.paperspace.com)
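Since the linked article covers the general technique rather than trading, here is a minimal sketch of gradient boosting for classification on toy data (the dataset and hyperparameters are my own choices, not from the article): each boosting stage fits a small tree to the gradient of the log-loss of the ensemble built so far.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Toy binary-classification data: the label depends nonlinearly on the features
X = rng.normal(size=(1000, 5))
y = ((X[:, 0] * X[:, 1] + np.sin(X[:, 2])) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)

# 100 shallow trees, each one correcting the residual errors of the previous ones
clf = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1, max_depth=3,
                                 random_state=1)
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```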
 
The data structure reference for ONNX does not seem to be accurate. MT version 3602.
MQL5 Documentation: ONNX models / Data structures (www.mql5.com)
 
And there is nothing in the help about the flags for OnnxRun().
MQL5 Documentation: ONNX models / OnnxRun (www.mql5.com)
 
In the ONNX help there is no information about OnnxSetInputShape() and OnnxSetOutputShape() functions. It is not very clear what they should do.
 
Aleksey Nikolayev #:
In the ONNX help there is no information about OnnxSetInputShape() and OnnxSetOutputShape() functions. It is not very clear what they should do.


These methods set the dimensionality of the model's input and output data. We will add them to the help today.

//+------------------------------------------------------------------+
//|                                        ONNX.Price.Prediction.mq5 |
//|                                  Copyright 2023, MetaQuotes Ltd. |
//|                                             https://www.mql5.com |
//+------------------------------------------------------------------+
#property copyright "Copyright 2023, MetaQuotes Ltd."
#property link      "https://www.mql5.com"
#property version   "1.00"

const long  ExtOutputShape[] = {1,1};
const long  ExtInputShape [] = {1,10,4};
//+------------------------------------------------------------------+
//| Script program start function                                    |
//+------------------------------------------------------------------+
int OnStart(void)
  {
   matrix rates;
//--- get 10 bars
   if(!rates.CopyRates("EURUSD",PERIOD_H1,COPY_RATES_OHLC,2,10))
      return(-1);
//--- the model input must be a set of vertical OHLC vectors
   matrix x_norm=rates.Transpose();
   vector m=x_norm.Mean(0);               // normalize the prices
   vector s=x_norm.Std(0);
   matrix mm(10,4);
   matrix ms(10,4);

   for(int i=0; i<10; i++)
     {
      mm.Row(m,i);
      ms.Row(s,i);
     }

   x_norm-=mm;
   x_norm/=ms;
//--- create the model (the "model" uchar array with the ONNX file contents is assumed to be declared elsewhere, e.g. via #resource)
   long handle=OnnxCreateFromBuffer(model,ONNX_DEBUG_LOGS);

   if(!OnnxSetInputShape(handle,0,ExtInputShape))
     {
      Print("failed, OnnxSetInputShape error ",GetLastError());
      OnnxRelease(handle);
      return(-1);
     }

   if(!OnnxSetOutputShape(handle,0,ExtOutputShape))
     {
      Print("failed, OnnxSetOutputShape error ",GetLastError());
      OnnxRelease(handle);
      return(-1);
     }
//--- run the model
   matrixf x_normf;
   vectorf y_norm(1);

   x_normf.Assign(x_norm);
   if(!OnnxRun(handle,ONNX_DEBUG_LOGS | ONNX_NO_CONVERSION,x_normf,y_norm))
     {
      Print("failed, OnnxRun error ",GetLastError());
      OnnxRelease(handle);
      return(-1);
     }

   Print(y_norm);
//--- de-normalize the price back from the output value
   double y_pred=y_norm[0]*s[3]+m[3];

   Print("predicted ",y_pred);
//--- done
   OnnxRelease(handle);
   return(0);
  }
//+------------------------------------------------------------------+
 
mytarmailS #:
What do you mean?
On my computer I'm banned for 10 years, but from my phone I'm out of the ban)))

You probably have a "fake IP ban":

Forum on trading, automated trading systems and testing trading strategies

Question to the administration of the site mql5.com

Sergey Golubev, 2022.12.16 17:22

If you are banned and you can make posts here, it is a "fake IP ban".
You probably have a dynamic IP, and it accidentally "fell" on someone's banned IP.
When I "catch" such a ban, I just turn off my computer, turn off the router, then turn on the router and turn on my computer.
As a result, my IP changes (and I also have a dynamic IP), and the inscription about 10 years disappears.

...

 

Inference in MT5 from an ONNX model trained in LightGBM does not work: errors 5808 and 5805 when setting the shapes of the parameters. The problem seems to lie in the reported parameter dimensions: negative values come back (highlighted in the code). Maybe I just messed something up. In Python 3.10 everything seems to work fine.

MQL5 inference:

void OnStart()
  {
   long h = OnnxCreate("model.onnx", FILE_COMMON);
   
   //Print(OnnxGetInputCount(h));
   //Print(OnnxGetOutputCount(h));
   //Print(OnnxGetInputName(h, 0));
   //Print(OnnxGetOutputName(h, 0));
   OnnxTypeInfo otype;
   OnnxGetInputTypeInfo(h, 0, otype);
   ArrayPrint(otype.dimensions);                   // -1 8
   //Print(otype.element_type, " ", otype.type);
   OnnxGetOutputTypeInfo(h, 0, otype);
   ArrayPrint(otype.dimensions);                   // -1 1
   //Print(otype.element_type, " ", otype.type);
   
   matrix mx={{8.32520000e+00, 4.10000000e+01, 6.98412698e+00, 1.02380952e+00,
               3.22000000e+02, 2.55555556e+00, 3.78800000e+01,-1.22230000e+02},
              {8.30140000e+00, 2.10000000e+01, 6.23813708e+00, 9.71880492e-01,
               2.40100000e+03, 2.10984183e+00, 3.78600000e+01,-1.22220000e+02},
              {7.25740000e+00, 5.20000000e+01, 8.28813559e+00, 1.07344633e+00,
               4.96000000e+02, 2.80225989e+00, 3.78500000e+01,-1.22240000e+02},
              {5.64310000e+00, 5.20000000e+01, 5.81735160e+00, 1.07305936e+00,
               5.58000000e+02, 2.54794521e+00, 3.78500000e+01,-1.22250000e+02},
              {3.84620000e+00, 5.20000000e+01, 6.28185328e+00, 1.08108108e+00,
               5.65000000e+02, 2.18146718e+00, 3.78500000e+01,-1.22250000e+02}};
   matrix my={{0.0},{0.0},{0.0},{0.0},{0.0}};   
   
   const long  ExtInputShape [] = {1,5,8};
   const long  ExtOutputShape[] = {1,5};
   Print(OnnxSetInputShape(h,0,ExtInputShape));
   Print(GetLastError());                            // 5808
   ResetLastError();
   Print(OnnxSetOutputShape(h,0,ExtOutputShape));
   Print(GetLastError());                            // 5805
   
   OnnxRun(h, ONNX_DEBUG_LOGS | ONNX_NO_CONVERSION, mx, my);
   //Print(mx);
   //Print(my);
   OnnxRelease(h);
  }

Training in Python:

from lightgbm import LGBMRegressor
from sklearn.datasets import fetch_california_housing
import onnxmltools
from onnxconverter_common import *

housing = fetch_california_housing()
X, Y = housing.data, housing.target

model = LGBMRegressor()
model.fit(X, Y)
Yreal, Ypredict = Y[:5], model.predict(X[:5])
print(Yreal)
print(Ypredict)

initial_type = [('input', FloatTensorType([None, len(X[0])]))]
onnx_model = onnxmltools.convert_lightgbm(model, name='LightGBM', initial_types=initial_type)
onnxmltools.utils.save_model(onnx_model, 'model.onnx')

Inference in Python:

import numpy as np
import onnxruntime as ort
from sklearn.datasets import fetch_california_housing

housing = fetch_california_housing()
X, Y = housing.data, housing.target
Xtest, Yreal = X[:5], Y[:5]

sess = ort.InferenceSession("model.onnx", providers=ort.get_available_providers())
input_name = sess.get_inputs()[0].name
Ypredict = sess.run(None, {input_name: Xtest.astype(np.float32)})[0]

print(Xtest)
print(Yreal)
print(Ypredict)