Neural networks. Questions from the experts. - page 16

 
joo:

But apparently lasso is training his bio-network to work with the artificial ones in this way.

Funny :)))) Man, NSh is so addictive in this respect, sometimes it's extremely hard to tear yourself away.....
 
lasso:

Hello.


The classes are very mixed, even more so than in the fig.,


but still distinguishable.

I can separate them even with unsophisticated linear methods, but I can't get the NS to produce a better (or at least comparable in quality) result.

I hope for sensible answers that will be useful to other forum members too!

I've been deprived of your company for some time... Let's get on with it.

..............

The problem posed on page 14 is indeed easily solved by a neural network.

The screenshot shows one of the possible configurations. The number of neurons in the hidden layer can be painlessly reduced to 7, for example.



And what was pleasantly surprising is that the NS finds solutions of its own, unlike the obvious linear methods (for this simple problem),

and these solutions are puzzling at first, but after thorough analysis it turns out that the NS solutions are even slightly more efficient than the linear ones. O-o-o-o-o!

Moving on.... ;-)

 
lasso:

The problem posed on page 14 is indeed easily solved by a neural network.

The screenshot shows one of the possible configurations. The number of neurons in the hidden layer can be painlessly reduced to 7, for example.



And what was pleasantly surprising is that the NS finds solutions of its own, unlike the obvious linear methods (for this simple problem),

and these solutions are puzzling at first, but after thorough analysis it turns out that the NS solutions are even slightly more efficient than the linear ones. O-o-o-o-o!

Moving on.... ;-)


Is this the stochastic-crossing problem? I don't understand what you want to achieve by solving it; the sparse description of the situation obviously won't let the NS solve this problem in any "meaningful" way. Such a network will be extremely unstable and absolutely useless outside the training sample. But it will do as a training exercise, of course...

Where do we go from here? Maybe move a little closer to our real subject - time series prediction?

 
Figar0:

Is this the stochastic-crossing problem? I don't understand what you want to achieve by solving it; the sparse description of the situation obviously won't let the NS solve this problem in any "meaningful" way. Such a network will be extremely unstable and absolutely useless outside the training sample. But it will do as a training exercise, of course...

Where do we go from here? Maybe move a little closer to our real subject - time series prediction?

I want to learn how to apply the NS apparatus in the analytical blocks of my TS, and to do it visually and with examples, for the benefit of others.

..................................

So, the classification task in Statistica 6 is solved.

But that is a third-party program with no direct connection to trading.

Yes, it lets you build lots of graphs and reports, save the found network to a file, generate code in C and VB, etc. Great!

But here we face a new problem!

How do we correctly transfer all this wealth and make it work in MQL?

 
lasso:

Yes, it lets you build lots of graphs and reports, save the found network to a file, generate code in C and VB, etc. Great!

But here we face a new problem!

How do we correctly transfer all this wealth and make it work in MQL?



Well, attaching a dll to an Expert Advisor is no problem at all. The terminal itself has a good example in the Experts/Samples folder, and a forum search for the phrase "connect dll" will help you do it in no time. I think there was even an article on the subject... It's not a stumbling block when working with neural networks.

Much more interesting are the questions:

What should the neural network do? What should the inputs be? What is the best way to prepare them? How to choose the type and architecture of the NS, and so on. All of this is easier and more interesting to work through on a practical task, such as trading November (and if possible December) 2010 at a profit, as data new to the NS. Although it would probably be better to put that in a separate thread, something like "NS practice for beginners and beyond".
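For completeness, here is roughly what the dll side of such a link can look like. This is only a sketch with hypothetical names (RunNet, the averaging placeholder body); the real body would be the code of your trained network:

/* netdll.c - sketch of a Win32 DLL exporting one function to an MQL4 EA.
 * All names here are hypothetical. In the Expert Advisor you would declare:
 *   #import "netdll.dll"
 *     double RunNet(double &inputs[], int count);
 *   #import
 */
#define MT4_EXPFUNC __declspec(dllexport)

MT4_EXPFUNC double __stdcall RunNet( const double inputs[], const int count )
{
  /* Placeholder "network": the average of the inputs. Replace this body
   * with the code of the trained net (e.g. the C source Statistica
   * generates) and return its output. */
  int i;
  double s = 0.0;
  for ( i = 0; i < count; ++i )
    s += inputs[i];
  return ( count > 0 ) ? s / count : 0.0;
}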

 
Figar0:

Well, attaching a dll to an Expert Advisor is no problem at all. The terminal itself has a good example in the Experts/Samples folder, and a forum search for the phrase "connect dll" will help you do it in no time. I think there was even an article on the subject... It's not a stumbling block when working with neural networks.

Much more interesting are the questions:

What should the neural network do? What should the inputs be? What is the best way to prepare them? How to choose the type and architecture of the NS, and so on. All of this is easier and more interesting to work through on a practical task, such as trading November (and if possible December) 2010 at a profit, as data new to the NS. Although it would probably be better to put that in a separate thread, something like "NS practice for beginners and beyond".

I agree that the questions are interesting. But to answer them you need functionality that works in the MT environment.

......................

The question is not how to attach a dll, but where to get this dll from?

Does Statistica 6 generate a dll?

Or are you suggesting that a novice neural network researcher write the NS himself and build it into a DLL? I don't understand you....

.......................

There is the FANN library as one option.

Are there any other options?

 
lasso:

The question is not how to attach a dll, but where to get this dll from?

Does Statistica 6 generate a dll?

Or are you suggesting that a novice neural network researcher write the NS himself and build it into a DLL? I don't understand you....

.......................

There is the FANN library as one option.

Statistica, like many other NS programs (NeuroShell, NeuroSolutions), generates C code, as far as I remember. That is probably the easiest way out for a beginner. You can write nets directly in MQL, but that raises the question of training... I find FANN too cumbersome and not very convenient to use.
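Since FANN came up: for reference, a minimal sketch of its C API for a net like the one discussed above (1 input, a small hidden layer, 1 output). The file names and training parameters here are made up, and train.data must be in FANN's own text format.

/* fann_sketch.c - minimal FANN usage; build with: gcc fann_sketch.c -lfann */
#include <stdio.h>
#include "fann.h"

int main(void)
{
  /* 3 layers: 1 input neuron, 7 hidden, 1 output */
  struct fann *ann = fann_create_standard(3, 1, 7, 1);

  fann_set_activation_function_hidden(ann, FANN_SIGMOID_SYMMETRIC);
  fann_set_activation_function_output(ann, FANN_SIGMOID_SYMMETRIC);

  /* train.data: FANN's text format - a header line, then alternating
   * input and output lines */
  fann_train_on_file(ann, "train.data", 5000, 100, 0.001f);

  fann_save(ann, "net.net");  /* reload later with fann_create_from_file() */

  {
    fann_type input[1] = { 0.5f };
    fann_type *out = fann_run(ann, input);
    printf("net output: %f\n", out[0]);
  }

  fann_destroy(ann);
  return 0;
}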

 
Statistica generates the C source of a console application for the trained neural network (compile that source and you get an exe). The code can be ported to MQL4/5 with slight modifications - it's dead easy. That is how I started studying neural networks.
 
lasso:


There is the FANN library as one option.

Are there any other options?


SVM .... http://www.csie.ntu.edu.tw/~cjlin/libsvm/
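A minimal sketch of libsvm's C interface on a toy 1-D, two-class set, just to show the moving parts (svm_problem, svm_parameter, svm_train, svm_predict); the data and parameter values are made up:

/* svm_sketch.c - toy libsvm example; build together with libsvm's svm.cpp */
#include <stdio.h>
#include "svm.h"

int main(void)
{
  /* toy 1-D training set: x < 0 -> class -1, x > 0 -> class +1;
   * each sample is an svm_node array terminated by index -1 */
  double y[4] = { -1.0, -1.0, 1.0, 1.0 };
  struct svm_node x0[] = { {1, -2.0}, {-1, 0.0} };
  struct svm_node x1[] = { {1, -1.0}, {-1, 0.0} };
  struct svm_node x2[] = { {1,  1.0}, {-1, 0.0} };
  struct svm_node x3[] = { {1,  2.0}, {-1, 0.0} };
  struct svm_node *x[4] = { x0, x1, x2, x3 };

  struct svm_problem prob;
  prob.l = 4;               /* number of samples */
  prob.y = y;               /* class labels      */
  prob.x = x;               /* feature vectors   */

  struct svm_parameter param = { 0 };
  param.svm_type    = C_SVC;  /* classification  */
  param.kernel_type = RBF;
  param.gamma       = 1.0;
  param.C           = 1.0;
  param.eps         = 1e-3;
  param.cache_size  = 16;     /* MB */

  {
    struct svm_model *model = svm_train( &prob, &param );
    struct svm_node q[] = { {1, 1.5}, {-1, 0.0} };
    printf( "predicted class: %g\n", svm_predict( model, q ) );
    svm_free_and_destroy_model( &model );
  }

  svm_destroy_param( &param );
  return 0;
}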

 
Figar0:

Statistica, like many other NS programs (NeuroShell, NeuroSolutions), generates C code, as far as I remember. That is probably the easiest way out for a beginner. You can write nets directly in MQL, but that raises the question of training... I find FANN too cumbersome and not very convenient to use.


joo:
Statistica generates the C source of a console application for the trained neural network (compile that source and you get an exe). The code can be ported to MQL4/5 with slight modifications - it's dead easy. That is how I started studying neural networks.

Here's what my Statistica 6 generates ))

Where's the joy?

The only joy is being able to trace the signal from input to output in the debugger.

/* ------------------------------------------------------------------------- */


#include <stdio.h>
#include <math.h>
#include <string.h>
#include <stdlib.h>

#ifndef FALSE
#define FALSE 0
#define TRUE 1
#endif

#define MENUCODE -999


static double NNCode38Thresholds[] =
{

/* layer 1 */
-0.78576109762088242, -0.23216582173469763, -1.6708808507320108, -1.525614113040888,
1.4153558659332133, -0.77276960668316319, 2.3600992937381298, 2.473963708568014,
-0.43422405325901231, 0.68546943611132893, 0.19836417975077064, 0.26461366779934564,
-0.19131682804149783, 0.24687125804149584, -0.95588612620053504, 0.25329560565058901,
-1.0054817062488075, 1.3224622867600988, 0.88115523574528376, 0.32309684489223067,
0.52538428519764313,

/* layer 2 */
-1.8292886608617505

};

static double NNCode38Weights[] =
{

/* layer 1 */
1.8660729426318707,
1.3727568288578245,
3.1175074758006374,
3.356836518157698,
3.2574311486418068,
3.2774957848884769,
1.4284147042568165,
3.534875314491805,
2.4874577673065557,
2.1516346524000403,
1.9692127720516106,
4.3440737376517129,
2.7850179803408932,
-12.654434243399631,
2.4850018642785399,
2.1683631515554227,
1.77850226182071,
2.1342779960924272,
2.8753050022428206,
3.9464397902669828,
2.5227540467556553,

/* layer 2 */
-0.041641949353302246, -0.099151657230575702, 0.19915689162090328, -0.48586373846026099,
-0.091916813099494746, -0.16863091580772138, -0.11592356639654273, -0.55874391921850786,
0.12335845466035589, -0.022300206392803789, -0.083342117374385544, 1.550222748978116,
0.10305706982775611, 3.9280003726494575, 0.12771097131123971, -0.12144621860368633,
-0.40427171889553365, -0.072652508364580259, 0.20641498115269669, 0.1519896468808962,
0.69632055946019444

};

static double NNCode38Acts[46];

/* ---------------------------------------------------------- */
/*
  NNCode38Run - run neural network NNCode38

  Input and Output variables.
  Variable names are listed below in order, together with each
  variable's offset in the data set at the time code was
  generated (if the variable is then available).
  For nominal variables, the numeric code - class name
  conversion is shown indented below the variable name.
  To provide nominal inputs, use the corresponding numeric code.
  Input variables (Offset):
  stoch

  Output:
  res
    1=1
    2=-1

*/
/* ---------------------------------------------------------- */

void NNCode38Run( double inputs[], double outputs[], int outputType )
{
  int i, j, k, u;
  double *w = NNCode38Weights, *t = NNCode38Thresholds;

  /* Process inputs - apply pre-processing to each input in turn,
   * storing results in the neuron activations array.
   */

  /* Input 0: standard numeric pre-processing: linear shift and scale. */
  if ( inputs[0] == -9999 )
    NNCode38Acts[0] = 0.48882189239332069;
  else
    NNCode38Acts[0] = inputs[0] * 1.0204081632653061 + 0;

  /*
   * Process layer 1.
   */

  /* For each unit in turn */
  for ( u=0; u < 21; ++u )
  {
    /*
     * First, calculate post-synaptic potentials, storing
     * these in the NNCode38Acts array.
     */

    /* Initialise hidden unit activation to zero */
    NNCode38Acts[1+u] = 0.0;

    /* Accumulate weighted sum from inputs */
    for ( i=0; i < 1; ++i )
      NNCode38Acts[1+u] += *w++ * NNCode38Acts[0+i];

    /* Subtract threshold */
    NNCode38Acts[1+u] -= *t++;

    /* Now apply the logistic activation function, 1 / ( 1 + e^-x ).
     * Deal with overflow and underflow
     */
    if ( NNCode38Acts[1+u] > 100.0 )
       NNCode38Acts[1+u] = 1.0;
    else if ( NNCode38Acts[1+u] < -100.0 )
      NNCode38Acts[1+u] = 0.0;
    else
      NNCode38Acts[1+u] = 1.0 / ( 1.0 + exp( - NNCode38Acts[1+u] ) );
  }

  /*
   * Process layer 2.
   */

  /* For each unit in turn */
  for ( u=0; u < 1; ++u )
  {
    /*
     * First, calculate post-synaptic potentials, storing
     * these in the NNCode38Acts array.
     */

    /* Initialise hidden unit activation to zero */
    NNCode38Acts[22+u] = 0.0;

    /* Accumulate weighted sum from inputs */
    for ( i=0; i < 21; ++i )
      NNCode38Acts[22+u] += *w++ * NNCode38Acts[1+i];

    /* Subtract threshold */
    NNCode38Acts[22+u] -= *t++;

    /* Now calculate negative exponential of PSP
     */
    if ( NNCode38Acts[22+u] > 100.0 )
       NNCode38Acts[22+u] = 0.0;
    else
      NNCode38Acts[22+u] = exp( -NNCode38Acts[22+u] );
  }

  /* Type of output required - selected by outputType parameter */
  switch ( outputType )
  {
    /* The usual type is to generate the output variables */
    case 0:


      /* Post-process output 0, two-state nominal output */
      if ( NNCode38Acts[22] >= 0.05449452669633785 )
        outputs[0] = 2.0;
      else
        outputs[0] = 1.0;
      break;

    /* type 1 is activation of output neurons */
    case 1:
      for ( i=0; i < 1; ++i )
        outputs[i] = NNCode38Acts[22+i];
      break;

    /* type 2 is codebook vector of winning node (lowest actn) 1st hidden layer */
    case 2:
      {
        int winner=0;
        for ( i=1; i < 21; ++i )
          if ( NNCode38Acts[1+i] < NNCode38Acts[1+winner] )
            winner=i;

        for ( i=0; i < 1; ++i )
          outputs[i] = NNCode38Weights[1*winner+i];
      }
      break;

    /* type 3 indicates winning node (lowest actn) in 1st hidden layer */
    case 3:
      {
        int winner=0;
        for ( i=1; i < 21; ++i )
          if ( NNCode38Acts[1+i] < NNCode38Acts[1+winner] )
            winner=i;

        outputs[0] = winner;
      }
      break;
  }
}

Or am I generating it in the wrong place?
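For reference: to actually watch the signal travel from input to output in a debugger, the generated function only needs a tiny driver around it. A minimal sketch (the input value is made up), compiled together with the source above:

/* driver.c - minimal harness for the generated net; the input value is
 * arbitrary, just something to step through in the debugger */
#include <stdio.h>

void NNCode38Run( double inputs[], double outputs[], int outputType );

int main(void)
{
  double in[1]  = { 0.5 };    /* the single 'stoch' input       */
  double out[1] = { 0.0 };

  NNCode38Run( in, out, 0 );  /* type 0: nominal class, 1.0/2.0 */
  printf( "class: %.0f\n", out[0] );

  NNCode38Run( in, out, 1 );  /* type 1: raw output activation  */
  printf( "activation: %f\n", out[0] );

  return 0;
}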