Dear emsi,
Thanks a million for your kindness! I got the installer from the link you gave and installed it without problems.
Best regards
Please help:
Does anyone have an idea of how to add an MA indicator (or any other indicator) to the
Neuro EA presented here, in order to have a MACD+MA neural network?
Please help write this piece of code if you know how.
Thanks and regards
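One way this could be done, shown here only as a sketch: feed MA readings into the network's input vector alongside the MACD values. The array name InputVector[], the AnnInputs size and the index layout are assumptions about the EA's existing code, and the MA period and normalization are purely illustrative:

```mql4
// Sketch: fill the ANN input vector with both MACD and MA-based values.
// InputVector[] and AnnInputs are assumed to match the EA's existing layout.
void PrepareInputs(double &InputVector[], int AnnInputs)
{
   int half = AnnInputs / 2;
   for(int i = 0; i < half; i++)
   {
      // First half: MACD main line, as in the original NeuroMACD idea.
      InputVector[i] = iMACD(NULL, 0, 12, 26, 9, PRICE_CLOSE, MODE_MAIN, i);
      // Second half: distance of price from a 20-period EMA, so the
      // network also "sees" an MA-based trend component.
      InputVector[half + i] = Close[i] - iMA(NULL, 0, 20, 0, MODE_EMA, PRICE_CLOSE, i);
   }
}
```

Whether raw differences or rescaled values train better is something to experiment with; whatever scaling the original EA applies to its MACD inputs should be applied consistently to the MA inputs too.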
Is it possible to have this EA running with the OrderClose() strategy function, but with the ticket-error problem of the original NeuroMACD.mq4 fixed?
In other words, is it possible to have the EA open and close buy and sell orders correctly? Put another way, can we have
NeuroMACD-fixed.mq4 but with OrderClose() active?
I found that the OrderClose() function is necessary to improve the profit factor, compared with the version without OrderClose(),
where positions are only closed by hitting TP or SL.
Is it possible to rewrite this cool EA with OrderClose(), TP and SL, so that it trades both short and long positions, both neural-filtered and unfiltered?
I found that in the version with the OrderClose() function active, it won't open buy positions while neural-filtered.
Please help; all of this is confusing me.
Thank you.
The package is easy to install, and your instructions in this post are clear and understandable. I am going to work on a separate EA-based trainer and also an EA-based indicator (not self-trading) based on the ANN saved from the trainer. Once I have it up and running I will post it here for anyone who might be interested in using, editing or enhancing it.
I am uploading an instructive trainer MQL file which was written by Julien Loutre (http://www.forexcomm.com); it is helpful to read your info and his MQL together for a clearer understanding of how the structure works and can be implemented.
Another very informative document, written by Steffen Nissen, can be found at http://www.scribd.com/doc/16956544/Easy-Neural-Networks-with-FANN
It helps with perspective on the level of detail required from the ANN functions: if too many tests are done, the ANN becomes restricted to specific details, while if too few are done it searches at too broad a level.
My intention with the trainer is to give the user a simple way to supply the required pattern structure to be trained, with both correct and incorrect examples, and, once the chosen mean square error is reached, to save the file for use with the indicator.
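That kind of MSE-driven loop could be sketched roughly as below. The f2M_train, f2M_get_MSE and f2M_save names follow the f2M_* wrapper discussed in this thread and are assumed to mirror their FANN counterparts (fann_train, fann_get_MSE, fann_save); PatternCount and LoadPattern are hypothetical placeholders for however the examples are stored:

```mql4
// Rough sketch of an MSE-driven trainer, under the assumptions above.
bool TrainUntilMSE(int ann, double target_mse, int max_epochs)
{
   double in[30];    // one input pattern (size matching AnnInputs)
   double out[1];    // desired answer, e.g. 1.0 = correct setup, 0.0 = incorrect
   for(int epoch = 0; epoch < max_epochs; epoch++)
   {
      for(int p = 0; p < PatternCount; p++)
      {
         LoadPattern(p, in, out);        // hypothetical helper: fills one example
         f2M_train(ann, in, out);        // one training step on that example
      }
      if(f2M_get_MSE(ann) <= target_mse)
      {
         f2M_save(ann, "trained.net");   // file name is illustrative
         return(true);
      }
   }
   return(false);                        // target MSE not reached in max_epochs
}
```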
I will post further once I begin making some progress.
Blue Mental.
See also: FANN-EA.mq4 (8.0 Kb)
https://www.mql5.com/ru/code/9386 (Russian language only)
Hi, Mariusz,
First of all, I want to say thanks a lot for your great job. But I've found a bug in the library: at least "f2M_randomize_weights" does not work properly. I suppose it's not a bug in your code but in FANN itself.
The point of the bug is that if you use the same input data for two NNs, both NNs return the same answers even at the first training iteration. Of course, I got the same answers after ~10,000 training iterations as well... It means the weights were not randomized properly.
And one more thing... Do you plan to include the "activation_steepness" functions in your library? I think they would be helpful... Thanks.
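One way to narrow this down, as a sketch: create two identical networks, randomize each separately, and run both on the same input; if randomization works, the first-run outputs should differ. The f2M_randomize_weights, f2M_run and f2M_get_output signatures here follow the FANN originals (e.g. fann_randomize_weights(ann, min, max)) and are assumptions that may differ in your wrapper version:

```mql4
// Two networks with the same topology as in this thread.
int annA = f2M_create_standard(4, 30, 30, 16, 1);
int annB = f2M_create_standard(4, 30, 30, 16, 1);

// Randomize each network's weights independently.
f2M_randomize_weights(annA, -0.4, 0.4);
f2M_randomize_weights(annB, -0.4, 0.4);

// Build one arbitrary shared test input.
double in[30];
for(int i = 0; i < 30; i++) in[i] = MathRand() / 32767.0;

// If the weights really were randomized, these two outputs should differ.
f2M_run(annA, in);
f2M_run(annB, in);
Print("A=", f2M_get_output(annA, 0), "  B=", f2M_get_output(annB, 0));
```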
I've found something strange in a network profile (.net file). I created the ANN with ann = f2M_create_standard(4, AnnInputs, AnnInputs, AnnInputs / 2 + 1, 1); where AnnInputs = 30.
But in the network profile I found a line with this content:
layer_sizes=31 31 17 2
According to this record, every layer size was increased by one. The question is: what real sizes were used by f2M_create_standard?
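A likely explanation, offered as an assumption rather than something confirmed in this thread: FANN adds a bias neuron to each layer, and the layer_sizes line in the saved .net file counts those bias neurons, so each stored size is the requested size plus one:

```mql4
// Requested: f2M_create_standard(4, 30, 30, 16, 1)  // AnnInputs = 30, AnnInputs/2 + 1 = 16
// Stored:    layer_sizes=31 31 17 2
//
// 31 = 30 input neurons  + 1 bias
// 31 = 30 hidden neurons + 1 bias
// 17 = 16 hidden neurons + 1 bias
//  2 =  1 output neuron  + 1 bias (written to the file even though it drives nothing)
```

If that reading is right, the working sizes are still the 30/30/16/1 that were requested.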
Regarding "f2M_randomize_weights": I saved a freshly created ANN and realized that the randomization was in fact done successfully.
So the reason why ANNs with identical input vectors return the same answers is still unclear.