Discussion of article "Backpropagation Neural Networks using MQL5 Matrices" - page 2

 
Stanislav Korotky #:

So, you want this:

I'll think about it.

On first glance it looks okay, yes. The calculation on the spot is faster than storage, I assume.

👍

 
Lorentzos Roussos #:

On first glance it looks okay, yes. The calculation on the spot is faster than storage, I assume.

I think I know the reason why it was originally coded via the output of the activation function. In all my previous NN libs, and in some other people's libs I used, the derivatives are calculated via outputs, because it's simpler and more efficient (during adaptation to the matrix API, I didn't pay attention to the difference). For example:

sigmoid' = sigmoid * (1 - sigmoid)
tanh' = 1 - tanh^2
softsign' = (1 - |softsign|)^2

This way we do not need to keep the pre-activation arguments (matrices) or re-calculate them during the backpropagation phase (as is done in the fix). I don't like either approach. Calculating a "self-derivative", so to speak, looks more elegant. Hence I'd prefer to find some references with formulae for self-derivatives of all (or many of) the supported activation functions, and return to my original approach.
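For illustration, a self-derivative helper on MQL5 matrices could look like this (SelfDerivative is a hypothetical name; AF_SIGMOID, AF_TANH, AF_SOFTSIGN and matrix::Ones are the built-in API, and * on two matrices is element-wise in MQL5):

// derivative computed purely from the activation OUTPUT matrix y,
// so no pre-activation values need to be stored or recomputed
matrix SelfDerivative(const matrix &y, const ENUM_ACTIVATION_FUNCTION af)
{
   matrix ones = matrix::Ones(y.Rows(), y.Cols());
   switch(af)
   {
      case AF_SIGMOID:  return y * (ones - y);   // sigmoid' = y * (1 - y)
      case AF_TANH:     return ones - y * y;     // tanh' = 1 - y^2
      case AF_SOFTSIGN:                          // softsign' = (1 - |y|)^2
      {
         matrix t = ones - MathAbs(y);
         return t * t;
      }
      default:          return ones;             // fallback: linear, f'(x) = 1
   }
}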

It's interesting that the self-derivative formula is not required to be strictly derived from the activation function - any function with an equivalent effect suffices.
 
Stanislav Korotky #:

I think I know the reason why it was originally coded via the output of the activation function. In all my previous NN libs, and in some other people's libs I used, the derivatives are calculated via outputs, because it's simpler and more efficient (during adaptation to the matrix API, I didn't pay attention to the difference). For example:

This way we do not need to keep the pre-activation arguments (matrices) or re-calculate them during the backpropagation phase (as is done in the fix). I don't like either approach. Calculating a "self-derivative", so to speak, looks more elegant. Hence I'd prefer to find some references with formulae for self-derivatives of all (or many of) the supported activation functions, and return to my original approach.

Yeah, but MQ has decided to do it this way, so it applies to all activation functions.

In simple words, instead of the .Derivative function "adapting" to the activation function (like the 3 you mentioned, which could receive the outputs), they have decided to have the functions receive the pre-activation values across the board. That is okay; the problem is that it is not in the documentation.
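In code the convention looks like this - the pre-activation matrix has to be kept around for the backward pass. A minimal sketch, assuming the built-in matrix Activation/Derivative methods (the sizes and names are illustrative):

void BackpropStep()
{
   matrix weights(3, 4);
   weights.Fill(0.1);
   matrix inputs(4, 1);
   inputs.Fill(1.0);

   // forward pass: keep the pre-activation matrix z,
   // because matrix::Derivative expects pre-activation values
   matrix z = weights.MatMul(inputs);   // z = W * x
   matrix y;
   z.Activation(y, AF_SIGMOID);         // y = f(z)

   // backward pass: Derivative is applied to z, not to y
   matrix d;
   z.Derivative(d, AF_SIGMOID);         // d = f'(z) - the built-in convention
   // under the "self-derivative" convention it would be d = y * (1 - y),
   // computed from the output alone, with no need to store z
}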

The default assumption anyone makes is that it adapts to the AF.

This is bad for someone new (like me, for example), as they "tackle" this before they even start. The thing that saved me was that I built an object-based network first.

(a comparison of object-based and matrix-based networks would also make a very interesting article and would help many coders who are not math-savvy)

Anyway, I posted it in the thread a moderator maintains for reporting documentation issues.

(off topic: you can use this TanH, it's faster and correct, I think)

// tanh(x) = (exp(2x) - 1) / (exp(2x) + 1), computed with a single MathExp call
double customTanH(double of){
  if(of > 20.0) return(1.0); // guard: for very large inputs MathExp overflows and the ratio becomes NaN; tanh is already 1.0 here to double precision
  double ex=MathExp(2*of);
  return((ex-1.0)/(ex+1.0));
}
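(The speedup over the textbook definition tanh(x) = (e^x - e^-x)/(e^x + e^-x) comes from evaluating one exponential instead of two; the result is mathematically identical after multiplying numerator and denominator by e^x.)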
It's interesting that the self-derivative formula is not required to be strictly derived from the activation function - any function with an equivalent effect suffices.

You mean like a "substitute"?

For instance, a node receives an error on its output, and you know the "fluctuation" of the output, so if you "shift" it to a simpler activation and differentiate that, it will work?

So in theory it'd be like "regularizing" the output, but without actually doing it, just multiplying by the derivative of the regularization before the derivative of the activation?

For instance:

tanh output: -1 -> +1
sigmoid output: 0 -> +1
tanh to sigmoid output = (tanh_out + 1) / 2.0

and you just multiply by the derivative of that, which is 0.5? (without touching the tanh outputs at all)
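Checking that mapping numerically (a sketch; MathTanh and PrintFormat are standard MQL5 functions), the constant would have to be 4 rather than 0.5, because s*(1-s) = ((1+t)/2)*((1-t)/2) = (1-t^2)/4 when s = (t+1)/2:

void CheckSubstitution(){
  for(double x = -2.0; x <= 2.0; x += 0.5){
    double t = MathTanh(x);
    double s = (t + 1.0) / 2.0;          // tanh output shifted into sigmoid range [0, 1]
    double viaSigmoid = s * (1.0 - s);   // sigmoid-style self-derivative of the shifted value
    double trueDeriv  = 1.0 - t * t;     // true tanh self-derivative
    // the ratio trueDeriv / viaSigmoid is constantly 4.0
    PrintFormat("x=%.2f  4*s*(1-s)=%f  1-t^2=%f", x, 4.0 * viaSigmoid, trueDeriv);
  }
}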
mql5 documentation errors, defaults or inconsistencies
www.mql5.com, 2023.04.07
 

An admin responded in the moderators' thread about this. You may be interested:

Forum on trading, automated trading systems and testing trading strategies

mql5 documentation errors, defaults or inconsistencies.

Rashid Umarov, 2023.04.18 11:36

Will be improved as soon as possible. For a while you can use this include file as a reference.

@Stanislav Korotky
 

Forum on trading, automated trading systems and testing trading strategies

Discussion of the article "Back propagation neural networks on MQL5 matrices"

Stanislav Korotky, 2024.04.16 17:34

To work on netting accounts, you need to specify the symbol explicitly in the ClosePosition function:

bool ClosePosition()
{
   // define empty struct
   MqlTradeRequest request = {};
   ...
   // fill in required fields
   request.action = TRADE_ACTION_DEAL;
   request.position = PositionGetInteger(POSITION_TICKET);
   request.symbol = _Symbol;
   // the opposite order type is what closes the position
   const ENUM_ORDER_TYPE type = (ENUM_ORDER_TYPE)(PositionGetInteger(POSITION_TYPE) ^ 1);
   request.type = type;
   // a closing BUY deal fills at ASK, a closing SELL deal at BID
   request.price = SymbolInfoDouble(_Symbol, type == ORDER_TYPE_BUY ? SYMBOL_ASK : SYMBOL_BID);
   request.volume = PositionGetDouble(POSITION_VOLUME);
   ...

   // send the request
   ...
}
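(A note on the type line above: ORDER_TYPE_BUY and POSITION_TYPE_BUY both equal 0, and ORDER_TYPE_SELL and POSITION_TYPE_SELL both equal 1, so XOR-ing the position type with 1 yields the opposite order type needed to close the position.)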
