Discussion of article "Neural networks made easy (Part 24): Improving the tool for Transfer Learning"
Neural networks are simple (Part 500988939928177231827361823461827631827361827361827361827361284762834762834762). By the time you read the last part, you will most likely be 89 years old and neural networks will no longer be relevant.
And seriously, "neural networks are simple" is when there are at most two articles of this size. Not when it feels like MT5 has hung and I have been receiving notifications about "Neural networks are simple" articles for a year already.
The idea of "Neural Networks are Simple" is to show that the technology is accessible to everyone. Yes, the series is quite long. But practical use is available to the reader from the second article onward, and each article presents new possibilities that you can include in your own developments right after reading it. Whether to use them or not is a personal choice; there is no need to wait for the next article. As for the size of the topic, science keeps developing and new algorithms appear every day. It is quite possible that applying them can bear fruit in trading.
The same can be said about each of the articles in the series. The more you dive into the topic, the more you realise the depth and value of these articles. The main thing for beginners is not to give up when you don't see beautiful balance-growth charts in an article; the author leads up to that gradually. That said, a working copy is attached to several of the articles, so you can take it and use it.
Dmitry thank you.

New article Neural networks made easy (Part 24): Improving the tool for Transfer Learning has been published:
In the previous article, we created a tool for creating and editing the architecture of neural networks. Today we will continue working on this tool and try to make it more user friendly. This may seem to be a step away from our topic. But don't you think that a well-organized workspace plays an important role in achieving results?
In the previous article in this series, we created a tool to take advantage of the Transfer Learning technology. As a result of the work done, we got a tool that allows the editing of already trained models. With this tool, we can take any number of neural layers from a pre-trained model. Of course, there are limiting conditions: we take only consecutive layers, starting from the initial data layer. The reason for this approach lies in the nature of neural networks. They work well only if the initial data is similar to that used when training the model.
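The "only consecutive layers from the input" rule can be illustrated with a minimal sketch. This is plain Python with hypothetical data structures (the actual tool is written in MQL5); it only shows the idea of reusing an unbroken prefix of a donor model's layers and then appending fresh, untrained ones.

```python
def take_prefix(donor_layers, n):
    """Return copies of the first n layers of a trained (donor) model.

    Taking an unbroken prefix -- rather than arbitrary layers -- preserves
    the chain of transformations the donor's weights were trained on.
    """
    if not 0 < n <= len(donor_layers):
        raise ValueError("n must select a non-empty prefix of the donor model")
    return [dict(layer) for layer in donor_layers[:n]]

# Hypothetical pre-trained model: each layer is a small description dict.
donor = [
    {"type": "input",  "size": 10},
    {"type": "dense",  "size": 40, "trained": True},
    {"type": "dense",  "size": 40, "trained": True},
    {"type": "output", "size": 2,  "trained": True},
]

# Reuse the input layer plus one trained hidden layer...
new_model = take_prefix(donor, 2)
# ...then add fresh layers for the new task.
new_model += [
    {"type": "dense",  "size": 20, "trained": False},
    {"type": "output", "size": 3,  "trained": False},
]
```

The resulting `new_model` keeps the donor's input and first hidden layer, so the reused weights still see data in the form they were trained on, while the new tail is trained from scratch.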
Furthermore, the created tool allows not only editing trained models, but also creating completely new ones. This lets us avoid describing the model architecture in the program code. We only need to describe a model using the tool; then we train and use the model by loading the created neural network from a file. This enables experimenting with different architectures without changing the program code. It does not even require recompiling the program; you simply change the model file.
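The file-based workflow described above can be sketched as follows. This is a hedged illustration in plain Python: the function names (`save_model`, `load_model`) and the JSON format are assumptions for the sketch, not the tool's actual file format, but the idea is the same — the program stays unchanged while the architecture lives in a file.

```python
import json
import os
import tempfile

def save_model(path, layers):
    """Write a model description (list of layer dicts) to a file."""
    with open(path, "w") as f:
        json.dump(layers, f)

def load_model(path):
    """Read a model description back; the program code never changes."""
    with open(path) as f:
        return json.load(f)

# Describe an architecture once, externally to the training program.
layers = [
    {"type": "input", "size": 10},
    {"type": "dense", "size": 40},
    {"type": "output", "size": 2},
]
path = os.path.join(tempfile.gettempdir(), "model_sketch.json")
save_model(path, layers)

# The training program just loads whatever file it is given.
restored = load_model(path)
```

To try a different architecture, you would edit the file (or save a new one) and rerun the same program with no recompilation.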
Such a useful tool should also be as user friendly as possible. Thus, in this article, we will try to improve its usability.
Author: Dmitriy Gizlyk