Multiple Optimization MT5 - page 3

 
Enrique Enguix #:
In this thread interesting things were discussed on the subject; it is a long thread, but there is valuable information in it

Nice

 

With the simple Custom Criterion Max optimization I tried balancing the buy and sell trades and their frequency. All the trades were virtual, and it actually did not stop the optimization

(although there were no trades taken!)

I just used OnTester to pass my evaluation score to the optimizer.

This is really handy, kudos to MQ.

(I ran it on one asset; balancing multiple assets is a bit harder, will try that later.)
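
For reference, a minimal sketch of what I mean, with placeholder variables rather than my actual code:

//--- sketch: in "Custom max" mode the value returned by OnTester() is the score the optimizer ranks passes by
double virtual_profit      =0.0;  //placeholder: accumulated p/l of the virtual trades
int    total_buy_patterns  =0;    //placeholder: counter of virtual buy signals
int    total_sell_patterns =0;    //placeholder: counter of virtual sell signals

double OnTester()
  {
   double score=virtual_profit;
   if(total_buy_patterns+total_sell_patterns==0)
      score=0.0;                  //no signals at all -> neutral score
   return(score);                 //this is the value the genetic optimizer sees
  }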


 
Lorentzos Roussos # :

With the simple Custom Criterion Max optimization I tried balancing the buy and sell trades and their frequency. All the trades were virtual, and it actually did not stop the optimization

(although there were no trades taken!)

I just used OnTester to pass my evaluation score to the optimizer.

This is really handy, kudos to MQ.

(I ran it on one asset; balancing multiple assets is a bit harder, will try that later.)


Great!!! Nice work!
 
Enrique Enguix #:
Great!!! Nice work!

Tx

Although there is a potential issue.

The balance ratio affects the score by being multiplied into the p/l.

That means that a bad setup with a negative score will get its score shrunk toward zero, thereby ranking higher the less balanced it is.

           double balance_ratio=0.0;//if balance ratio is 1.0 it essentially carries the score as is to the next step
           //if the buys are more than the sells 
             if(total_buy_patterns>total_sell_patterns&&total_sell_patterns>0){
             balance_ratio=((double)total_sell_patterns)/((double)total_buy_patterns);
             }
           //if the sells are more than the buys 
             else if(total_sell_patterns>total_buy_patterns&&total_buy_patterns>0){
             balance_ratio=((double)total_buy_patterns)/((double)total_sell_patterns);
             }
           //if equal non zero
             else if(total_buy_patterns==total_sell_patterns&&total_buy_patterns>0){
             balance_ratio=1.0;
             }
         //adjust score 
           total_score*=MathPow(balance_ratio,2);
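
One way I could avoid that inversion, just a sketch and not tested yet, would be to make the adjustment sign-aware so that imbalance always penalizes the pass:

           //sketch of a sign-aware adjustment : imbalance should always hurt the pass
           double penalty=MathPow(balance_ratio,2);   //0.0 (unbalanced) .. 1.0 (balanced)
             if(total_score>=0.0){
             total_score*=penalty;                    //positive scores shrink when unbalanced
             }
             else{
             total_score*=(2.0-penalty);              //negative scores get more negative when unbalanced
             }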
 

I played around with frames a bit, as a noob. They are actually not difficult.

If you think about the solution you would need to build in order to access the results (and more statistics that are not displayed) from each optimization pass,

and then you read the frames documentation, that is exactly what it is.

Here is a little low-level example you can try by optimizing the only parameter from 1 to 10 in custom max mode.

#property version   "1.00"
input  int x=1;//X PARAMETER
int OnInit()
  {
  return(INIT_SUCCEEDED);
  }
double OnTester()
  {
//---
   double ret=0.0;
   /*
   Let's say you have a custom criterion for optimization which has subcomponents.
   You are running in custom max optimization.
   The optimization results will contain some default tester statistics but as far 
       as your custom value is concerned you can only see the result , i.e. the ret that 
       the gen. algo used for its optimization.
       In my case i wanted to consider various other custom metrics too and that 
       was impossible with the default optimization table , so mq provides frames for that.
   What would your custom solution be if it were not for frames regarding this need ?
       >you'd throw your custom stats into a file , alongside the inputs.
       >you would have to get the agent name that ran the test.
       >maintain custom agent counters to log the passes per agent.
       >at the end of the optimization you would then collect all the pass files from all the agents 
       >and compose a csv file (or whatever output suits you)
   This is exactly what the frames do actually .
       >Frame add ,takes your stats , whatever they are , that you send in the data array 
        and throws them in a temp file (i think it's temporary as i could not access it from outside the tester)
       >The name on the frame add is what you want to call that collection of optimization data in general
       >So is the id
   So let's say that i want to log all my statistics during each pass in a collection called MyFirstFrame with ID 1
       my custom stats would look like this , just an example :
   */
   double data[]={1*x,2*x,3*x,4*x}; 
   //and the custom criterion would just be the x for this test , which is the optimized parameter 
   ret=x;
   //so now i just say add to the collection with this name and this id , the custom value and my stats :
   FrameAdd("MyFirstFrame",1,ret,data);
   //and its done 
   
  return(ret);
  }

void OnTesterDeinit(){
/*
Now when the optimizer finishes all the frames of that collection can be read 
and i can add them to a file , or a custom structure etc.
*/
read_frames("MyFirstFrame",1);
}
//read frames : this is probably very low level but it gets the example across i hope
void read_frames(string _name,long _id){
ResetLastError();
if(FrameFirst()){//reset the filter and the pointer of reading frames
  ResetLastError();
  bool filtered=FrameFilter(_name,_id);//then select the collection with this name and this id!
  if(filtered){
    //draft collection of results 
    string msg="";
      //this counter is not filled by the reader but is custom 
      int frames_total=0;
      //this one is the opt pass 
      ulong opt_pass=0;
      //this returns the name of the "collection"
      string frame_collection_name="";
      //and the id of the collection 
      long frame_collection_id=0;
      //and the presence of these 2 returns above indicates you can read all the available frames probably
      double value=0.0;//the value of the custom max will be sent here
      double data[];//and our custom stats here
      while(FrameNext(opt_pass,frame_collection_name,frame_collection_id,value,data)){
      frames_total++;
      msg+="Frame["+IntegerToString(frames_total)+"]:Pass["+IntegerToString(opt_pass)+"]:["+frame_collection_name+"]["+IntegerToString(frame_collection_id)+"]::("+DoubleToString(value,2)+")";
      //custom stats 
        for(int j=0;j<ArraySize(data);j++){
        msg+="\n---------------------("+DoubleToString(data[j],2)+")";
        }
        msg+="\n";
        ArrayFree(data);
      }
    Alert(msg);
  }else{
  Print("Filtering failed #"+IntegerToString(GetLastError()));
  }
}else{
Print("Cannot reset the frame filter #"+IntegerToString(GetLastError()));
}
}

void OnDeinit(const int reason){}
void OnTick(){}
 
Lorentzos Roussos #:

I played around with frames a bit, as a noob. They are actually not difficult.

If you think about the solution you would need to build in order to access the results (and more statistics that are not displayed) from each optimization pass,

and then you read the frames documentation, that is exactly what it is.

Here is a little low-level example you can try by optimizing the only parameter from 1 to 10 in custom max mode.

Barbarian! What you call newbie tests would take me 100 hours of thinking and executing ;)


A great advance on your part. I'm looking closely at all of this. Really interesting

 
Enrique Enguix #:

Barbarian! What you call newbie tests would take me 100 hours of thinking and executing ;)


A great advance on your part. I'm looking closely at all of this. Really interesting

xD Nowhere near fxsaber's solution, I just wanted something you set up, fill up, and collect at the end.

In fact, MQ anticipated these needs and lets you start things in OnTesterInit and wrap them up in OnTesterDeinit, like a separate program that runs alongside the optimization. Hence the extra chart to the left of the stats per run: it can run commands, display tables, etc. Really handy.
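
A minimal skeleton of that idea (just a sketch, nothing from my include) would be:

//--- sketch : the optimization "controller" lives on the auxiliary chart the tester opens
void OnTesterInit()
  {
   Comment("Optimization running...");   //shown on that extra chart
  }
void OnTesterDeinit()
  {
   Comment("");                          //clean up when the optimization finishes
  }
//--- the usual handlers still have to exist so the EA can actually be tested
int  OnInit(){return(INIT_SUCCEEDED);}
void OnTick(){}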


 

Okay, here is my attempt.

So what on earth does it do?

The frames send inputs and a data array to the end of the optimization.

So, since the tester starts a "program" on OnTesterInit which ends on OnTesterDeinit, we create a structure

which is going to contain a list of our custom (or default, or both) statistics.

Then we make sure to add the stats in the same sequence in the OnTester function,

which means we will receive a nice package of named statistics and inputs at the end of the test.

This nice package can be saved in the normal MQL5\Files location and the idea is to then view it and

inspect it (and export set files too). (Both not implemented yet.)

Here is an example of usage, a very simple one.

I hope this is not confusing at all; the include is a little bloated because of some additions during testing. Not all functions are needed (dammit).

Cheers

  1. Set up a project in OnTesterInit
  2. In OnTester, create a replica of the statistics you maintain, adding their values too
  3. In OnTesterDeinit, collect all the optimization frames and inputs 
#property version   "1.00"
input int parameter_x=1;//parameter
input double parameter_y=7.2;//parameter y 
#include "OcelotFrameCat.mqh"

custom_optimization_project MyProject;

int OnInit(){return(INIT_SUCCEEDED);}

void OnDeinit(const int reason){}

void OnTick(){}

void OnTesterInit()
  {
  //SETUP THE PROJECT - This happens on Tester Init exclusively 
    //we can do a simple setup 
      MyProject.setup("MyFirstProject","FirstOptimization",1);
      //and then setup some statistics , they can be default statistics 
        MyProject.add_statistic(STAT_EQUITYMIN);
      //or they can be our statistics (that will be calculated by us in OnTester and we will pass them in)
        MyProject.add_statistic("BalanceOfEurusd");
      //or if we like we can add all the tester statistics 
        MyProject.add_all_default_stats();
        //now we have STAT_EQUITYMIN twice tho :P
    //So lets create a test project 
      MyProject.reset();
      //we have the ability to send an array of string 
      //                         or an array of tester statistics (of our choice)
      //                         or both 
      //and setup directly 
      string mystats[]={"xvalue","xMultiple","yMultiple"};
      MyProject.setup("MyFirstProject","FirstOptimization",1,mystats);
      //so let's run with this one
      //The setup is done , the next thing we need to do is fill up the stats on tester 
      //<!> NOTE IF YOU WANT TO INCLUDE THE RET return value of the tester in the stats 
      //    YOU WILL HAVE TO CREATE A SEPARATE STATISTIC , a custom one 
      //    IN THIS EXAMPLE IT IS THE xvalue 
        
  }
double OnTester()
  {
  double ret=parameter_x;
  
  //is the MyProject structure available ? let's see
    //lets calculate some of our custom stats 
      //xvalue is the ret
      //xMultiple let's set something
        double xm=3*ret;
      //yMultiple
        double ym=3*parameter_y*xm;
      //okay now we must set the values of the custom stats in the project
        /* OnTester cannot see MyProject because it essentially runs in a separate program
           so we use a transitory structure to which we add the stats 
           in the same order as in the setup of MyProject.
        */
        custom_optimization_project Transitory;
        Transitory.setup("","FirstOptimization",1);//we provide only the collection name      
        Transitory.add_statistic("xvalue",ret);
        Transitory.add_statistic("xMultiple",xm);
        Transitory.add_statistic("yMultiple",ym);
        Transitory.store_stats(ret);//and this essentially calls the frame add
        //and we are done
      //Then we move on to tester deinit 

  return(ret);
  }

void OnTesterPass()
  {
  }

void OnTesterDeinit()
  {
  /* at this point the testing has finished and we can collect all the frames 
     and inputs 
  */
  MyProject.assemble_passes(true,"yourOutputFolder","ocl");
  /*
  Test1 [x][pass] : Does it save in the output folder ? YES! it does not have to be in the common folder either
  Test2 [x][fail] : Alert the f*cking contents + verify we can access the structure ontester ? We cannot access the same structure
  Test3 [x][pass] : Use a transitory to send the frames in a more civilized manner at least : OKAY
  
  string msg="Total Passes : "+IntegerToString(ArraySize(MyProject.optimization_passes))+"\n";
  for(int i=0;i<ArraySize(MyProject.optimization_passes);i++){
     msg+="#"+IntegerToString(MyProject.optimization_passes[i].rank)+"("+IntegerToString(MyProject.optimization_passes[i].optimization_pass)+")::STATS::";
     //stats 
       for(int j=0;j<ArraySize(MyProject.optimization_passes[i].stats);j++)
         {
         msg+=MyProject.optimization_passes[i].stats[j].get_name()+"="+DoubleToString(MyProject.optimization_passes[i].stats[j].get_value(),2)+" ";
         }
     //inputs 
       msg+="\n:::::::::::::::::INPUTS::";
       for(int j=0;j<ArraySize(MyProject.optimization_passes[i].inputs);j++)
        {
        msg+=MyProject.optimization_passes[i].inputs[j].inputName()+"="+MyProject.optimization_passes[i].inputs[j].inputValueAsText()+"opt("+MyProject.optimization_passes[i].inputs[j].inputWasOptimized()+") ";
        }
     msg+="\n";
     }
  Alert(msg);
  */
  }


 

Lorentzos Roussos #:

Okay, here is my attempt.

So what on earth does it do?

The frames send inputs and a data array to the end of the optimization.

So, since the tester starts a "program" on OnTesterInit which ends on OnTesterDeinit, we create a structure

which is going to contain a list of our custom (or default, or both) statistics.

Then we make sure to add the stats in the same sequence in the OnTester function,

which means we will receive a nice package of named statistics and inputs at the end of the test.

This nice package can be saved in the normal MQL5\Files location and the idea is to then view it and

inspect it (and export set files too). (Both not implemented yet.)

Here is an example of usage, a very simple one.

I hope this is not confusing at all; the include is a little bloated because of some additions during testing. Not all functions are needed (dammit).

Cheers

  1. Set up a project in OnTesterInit
  2. In OnTester, create a replica of the statistics you maintain, adding their values too
  3. In OnTesterDeinit, collect all the optimization frames and inputs 


 

I think you've done a spectacular job with this. I'm out right now and can't try it, although I'm sure it's something really impressive. I think you should write an article or something similar; this is a very important step forward for the community, and I hope you realize its value. As soon as I'm back at work, I'll get to work integrating it.

 
Enrique Enguix #:

I think you've done a spectacular job with this. I'm out right now and can't try it, although I'm sure it's something really impressive. I think you should write an article or something similar; this is a very important step forward for the community, and I hope you realize its value. As soon as I'm back at work, I'll get to work integrating it.

Thanks. A library for these things already exists, actually, and it's probably superior. I'm just anal like that; the best way to understand something is to build it.

And it fits what I need right now, because I will try an approach with more than one custom factor, so I need to be able to see the additional custom stats as well, choose a solution, export the set file, then run it again, collect the patterns, move on to the next symbol's optimization, etc.
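
Roughly, such a multi-factor criterion could look like this, just a sketch with made-up weights and a placeholder balance metric, not the final thing:

//--- sketch : combine several factors into one criterion while keeping each factor visible through a frame
double OnTester()
  {
   double profit_factor=TesterStatistics(STAT_PROFIT_FACTOR);
   double trades       =TesterStatistics(STAT_TRADES);
   double balance_ratio=1.0;   //placeholder : buy/sell balance metric maintained during the run
   double score=profit_factor*balance_ratio*MathMin(trades/100.0,1.0);   //made-up weighting
   double data[]={profit_factor,trades,balance_ratio};
   FrameAdd("MultiFactor",1,score,data);   //the sub-factors stay available for OnTesterDeinit
   return(score);
  }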