Help with Fann2MQL!

 

Hi,

I am programming a neural network, but I have the following problem:

I run it once and get some results; then I run it again with the same data and get completely different results (in testing).

It apparently keeps training over and over and is not erasing the previous data (at least that is my impression). But I call the destroy function every time, so I don't know why this happens.

If anyone can help, thanks in advance.

   Print("DESSS " + f2M_destroy_all_anns());

   int i;
   double MSE;

   ArrayResize(trainingData, 1);

   ann = f2M_create_standard(nn_layer, nn_input, nn_hidden1, nn_hidden2, nn_output);
   debug("f2M_create_standard()", ann);

   f2M_set_act_function_hidden(ann, FANN_SIGMOID_SYMMETRIC_STEPWISE);
   f2M_set_act_function_output(ann, FANN_SIGMOID_SYMMETRIC_STEPWISE);

   f2M_randomize_weights(ann, -0.77, 0.77);

   printDataArray();

   for(int xi = 0; xi < 150; xi++)
     {
      // two consecutive bearish candles: train with target -1
      if(Open[xi] > (Close[xi] + 50) && Open[xi+1] > (Close[xi+1] + 50))
        {
         doTrain(mfi(6+xi)-mfi(7+xi), mfi(5+xi)-mfi(6+xi), mfi(4+xi)-mfi(5+xi), mfi(3+xi)-mfi(4+xi),
                 rsi(6+xi)-rsi(7+xi), rsi(5+xi)-rsi(6+xi), rsi(4+xi)-rsi(5+xi), rsi(3+xi)-rsi(4+xi),
                 ma(6+xi)*100-ma(7+xi)*100, ma(5+xi)*100-ma(6+xi)*100, ma(4+xi)*100-ma(5+xi)*100, ma(3+xi)*100-ma(4+xi)*100,
                 -1);
        }

      // two consecutive bullish candles: train with target 1
      if(Open[xi] < (Close[xi] - 50) && Open[xi+1] < (Close[xi+1] - 50))
        {
         doTrain(mfi(6+xi)-mfi(7+xi), mfi(5+xi)-mfi(6+xi), mfi(4+xi)-mfi(5+xi), mfi(3+xi)-mfi(4+xi),
                 rsi(6+xi)-rsi(7+xi), rsi(5+xi)-rsi(6+xi), rsi(4+xi)-rsi(5+xi), rsi(3+xi)-rsi(4+xi),
                 ma(6+xi)*100-ma(7+xi)*100, ma(5+xi)*100-ma(6+xi)*100, ma(4+xi)*100-ma(5+xi)*100, ma(3+xi)*100-ma(4+xi)*100,
                 1);
        }
     }

   for(i = 0; i < maxTraining; i++)
     {
      MSE = teach(); // each pass runs teach(); see the comments on that function for details
      if(MSE < targetMSE)
         break;      // target error reached, stop training early
     }
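(The last loop keeps training until the MSE falls below the target, or until maxTraining epochs have run. The same pattern sketched in plain Python, where teach() is a made-up stand-in for the real training step, not a Fann2MQL function:)

```python
# Stand-in for the real teach(): pretend each epoch halves the error.
def make_teacher():
    state = {"mse": 1.0}
    def teach():
        state["mse"] *= 0.5
        return state["mse"]
    return teach

def train(teach, max_training=100, target_mse=0.01):
    # Train until the error is low enough, or the epoch budget is spent.
    for epoch in range(max_training):
        mse = teach()
        if mse < target_mse:
            break
    return epoch + 1, mse

epochs, mse = train(make_teacher())
print(epochs, mse)  # 7 epochs: 1.0 halved 7 times is 0.0078125 < 0.01
```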
   
 

You destroy it fine, but you also start a new training run every time. Since the internal workings of a neural net are quite like a black box, you will most likely never get the same result if you train two different nets: each run starts from different random initial weights, so training converges to a different set of weights. The bigger the net, the lower the chance of ending up with the same one.
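To see the effect in miniature, here is a sketch in plain Python (not Fann2MQL; the tiny one-neuron "net" and the seeds are illustrative assumptions): different random initial weights give different outputs on the same input, while reusing the same seed makes the run repeatable.

```python
import random

def init_weights(seed, n=4):
    # Random initial weights, analogous to f2M_randomize_weights(ann, -0.77, 0.77).
    rng = random.Random(seed)
    return [rng.uniform(-0.77, 0.77) for _ in range(n)]

def predict(weights, inputs):
    # A single linear neuron, just to show the output depends on the weights.
    return sum(w * x for w, x in zip(weights, inputs))

inputs = [0.2, -0.5, 0.1, 0.9]

run1 = predict(init_weights(1), inputs)
run2 = predict(init_weights(2), inputs)  # different init -> different output
run3 = predict(init_weights(1), inputs)  # same seed -> identical output

print(run1 == run3)  # True
print(run1 == run2)  # almost surely False
```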

If you want to reproduce your net, you have to save its weights once you have a good working net, and load them the next time instead of retraining.

I have used the Fann2MQL library once, and I think there are f2m_save() and f2m_load() functions.
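The save/load idea itself can be sketched in plain Python (the JSON file format and helper names here are made up for illustration; in Fann2MQL you would use the library's own save/load routines mentioned above): persist the weights after training, then restore them instead of re-randomizing.

```python
import json, os, tempfile

def save_weights(weights, path):
    # Persist the trained weights so a later run can reuse them.
    with open(path, "w") as f:
        json.dump(weights, f)

def load_weights(path):
    # Restore the exact weights from the earlier run.
    with open(path) as f:
        return json.load(f)

trained = [0.12, -0.53, 0.77, 0.04]  # pretend these came out of training

path = os.path.join(tempfile.gettempdir(), "ann_weights.json")
save_weights(trained, path)
restored = load_weights(path)

print(restored == trained)  # True: identical net, identical test results
```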

Just browse through the library...

I hope it helps.

//z

 
zzuegg:



Thank you, I didn't know that concept. I thought that if I used the same input I would get the same output as well. It sounds amazing to me that it isn't like that. I am new to NNs.