Discussion of article "Programming a Deep Neural Network from Scratch using MQL Language" - page 2

Hi Li,
I have updated the article with two demo files, one for MQL5 and one for MQL4. The DeepNeuralNetwork.mqh include can be used with both MQL5 and MQL4.
In any case, I'm attaching the files here to show how to use them.
Let me know if you have more questions.
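For readers who don't want to open the attachments, here is a minimal usage sketch. It is not the attached demo itself: the 4-4-5-3 topology and the array names are assumptions for illustration, and the class interface (constructor taking the four layer sizes, SetWeights, ComputeOutputs) is the one shown in the article.

```mql5
#include <DeepNeuralNetwork.mqh>

// A 4-4-5-3 topology is used here purely as an example.
int    numInput=4, numHiddenA=4, numHiddenB=5, numOutput=3;
double weight[];      // weight vector, e.g. produced by the optimizer
double xValues[4];    // inputs computed from the latest bar
double yValues[3];    // outputs, e.g. buy/sell/hold scores

DeepNeuralNetwork dnn(numInput,numHiddenA,numHiddenB,numOutput);

int OnInit()
  {
   // ... fill weight[] with the trained/optimized values ...
   dnn.SetWeights(weight);               // load the weights once
   return(INIT_SUCCEEDED);
  }

void OnTick()
  {
   // ... fill xValues[] from the latest bar ...
   dnn.ComputeOutputs(xValues,yValues);  // forward pass
   // ... trade on the largest of yValues[0..2] ...
  }
```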
Thanks for the code you shared. I tried to understand your approach.
I have a concern about yValues[0], yValues[1] and yValues[2]: they never change and are always 0.33333, while _xValues[1,2,3] do change with each new bar. So when I trade based on yValues I see NO trades at all, while trades do occur when the conditions are based on _xValues.
Is this my mistake, or a bug in your original code?
Update the following function to return `bool` instead of `void` and you will see that a wrong number of weights was being passed in.
Note that you also need to update the weights at the top of the file; it's not enough to update them only when you initialize the network :P
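For reference, a sketch of what that change might look like, assuming the class stores the four layer sizes as fields (the expected count is just the weights plus biases of a fully connected two-hidden-layer topology). Incidentally, this diagnosis also explains the constant 0.33333: if the weights are never actually loaded they stay at their zero defaults, the three raw outputs are equal, and the softmax of three equal values is exactly 1/3.

```mql5
// Sketch only: report the mismatch instead of silently mis-reading the array.
bool DeepNeuralNetwork::SetWeights(double &weights[])
  {
   // total weights + biases for a fully connected
   // numInput-numHiddenA-numHiddenB-numOutput network
   int numWeights=(numInput*numHiddenA)+numHiddenA
                 +(numHiddenA*numHiddenB)+numHiddenB
                 +(numHiddenB*numOutput)+numOutput;
   if(ArraySize(weights)!=numWeights)
     {
      Print("SetWeights: got ",ArraySize(weights),
            " weights, expected ",numWeights);
      return(false);
     }
   // ... copy into the weight/bias matrices exactly as the original code does ...
   return(true);
  }
```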
Many thanks.
Nice animation.
A two-layer neural network is a "shallow" neural network, not a deep one. Deep neural networks are usually taken to be networks with more than three hidden layers, and it is the peculiarities of training such networks that led to the development of deep learning methods.
As an example of programming in MQL, the article is probably useful. As an introduction to the topic of MLPs, it is certainly necessary. As an example of applying neural networks, it is incomplete and far behind the current state of the field.
As a rule, a neural network does not give satisfactory quality without hyperparameter optimisation.
I just don't understand why reinvent the wheel from improvised means when there is a sea of ready-made software on this topic.
Correction: there is a definition of a deep network in the article; I simply hadn't noticed it.
Optimising the weights of a neural network with a genetic algorithm is not literally "learning"; it is optimisation, after all. Training proper uses completely different methods, although this way of using a neural network is also practiced, and quite successfully.
To understand how a neural network works, it is important to understand how it is trained by backpropagation of error. Well, now I'm just picking on you :)
Good luck
Is there any way to incorporate error backpropagation into this network?
For some reason there is an opinion that such a network would be more flexible and map inputs to outputs better...
A network with error backpropagation is not a completely different network, is it?
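To make the question concrete, here is a minimal, purely illustrative sketch of the core backpropagation step, written for a single sigmoid neuron rather than the article's class (every name here is an assumption, not the article's code). A full implementation would repeat the same delta computation layer by layer, from the outputs back to the inputs:

```mql5
double Sigmoid(const double x) { return 1.0/(1.0+MathExp(-x)); }

// One gradient-descent step for a single sigmoid neuron,
// minimising the squared error E = 0.5*(y-target)^2.
void TrainStep(double &w[],double &b,const double &x[],
               const double target,const double lr)
  {
   int n=ArraySize(x);
   double z=b;
   for(int i=0;i<n;i++) z+=w[i]*x[i];        // weighted sum
   double y=Sigmoid(z);                      // forward pass
   double delta=(y-target)*y*(1.0-y);        // dE/dz via the chain rule
   for(int i=0;i<n;i++) w[i]-=lr*delta*x[i]; // gradient step on each weight
   b-=lr*delta;                              // gradient step on the bias
  }
```

In that sense it is the same network: the feed-forward architecture does not change, only the way the weights are found does, a gradient step on an error signal instead of a genetic search.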