I don't want this thread to become just one more statistic among the neural network topics here. What I propose is to share experience and problems in working with and training non-standard neural network architectures.
Here's a first link for the theory:
http://cgm.computergraphics.ru/content/view/62
or as a PDF file.
On neural networks, I grew up on Haykin. I'll tell you right away: the book is not easy. You need to be very good at maths, on first-name terms with it.
I'd also add that even this book, from a reputable publisher, contains plenty of errors. Whether they got there by accident or not is another question.
Neural networks are better read in the originals. And that means English...
Strange: either almost nobody here is interested, or everyone is content with what they've got and nobody wants to share.
More likely the second. :)
I invite you to share your experience and problems in working with and training non-standard neural network architectures...
Isn't it time to start sharing experiences?
Gentlemen, I train my perceptrons with the backpropagation algorithm. It works, but the probability of finding the global extremum is only 50-70% (with about 100 neurons). Recently I finished writing a genetic algorithm for XOR and was happy. But once a medium-sized perceptron started multiplying and mating, I realized that without parallel computation I'd be sitting there for a month! Has anyone overcome this limitation?
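To make the question concrete, here is a minimal sketch of what I mean, in plain Python/NumPy (not my actual code; the 2-3-1 network size, uniform crossover, and mutation rate are illustrative assumptions): a genetic algorithm evolving the weights of a tiny perceptron on XOR, with the fitness of each genome scored in parallel through a multiprocessing pool, since the evaluations are independent of one another.

```python
import numpy as np
from multiprocessing import Pool

# XOR training set
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([0, 1, 1, 0], dtype=float)

N_HIDDEN = 3  # assumed: a tiny 2-3-1 perceptron
GENOME_LEN = 2 * N_HIDDEN + N_HIDDEN + N_HIDDEN + 1  # weights + biases

def decode(genome):
    """Unpack a flat genome into layer weights and biases."""
    i = 0
    w1 = genome[i:i + 2 * N_HIDDEN].reshape(2, N_HIDDEN)
    i += 2 * N_HIDDEN
    b1 = genome[i:i + N_HIDDEN]
    i += N_HIDDEN
    w2 = genome[i:i + N_HIDDEN]
    b2 = genome[i + N_HIDDEN]
    return w1, b1, w2, b2

def fitness(genome):
    """Negative mean squared error on XOR (higher is better)."""
    w1, b1, w2, b2 = decode(genome)
    h = np.tanh(X @ w1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))  # sigmoid output
    return -np.mean((out - Y) ** 2)

def evolve(pop, scores, rng, elite=2, sigma=0.3):
    """Keep the elite, refill by uniform crossover + Gaussian mutation."""
    pop = pop[np.argsort(scores)[::-1]]          # sort fittest first
    children = [pop[k].copy() for k in range(elite)]
    while len(children) < len(pop):
        a, b = pop[rng.integers(0, elite + 5, size=2)]  # parents from the top
        mask = rng.random(GENOME_LEN) < 0.5             # uniform crossover
        children.append(np.where(mask, a, b) + rng.normal(0, sigma, GENOME_LEN))
    return np.array(children)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pop = rng.normal(0, 1, size=(60, GENOME_LEN))
    # Each genome's fitness is independent, so the population is
    # scored concurrently across worker processes.
    with Pool() as pool:
        for gen in range(200):
            scores = np.array(pool.map(fitness, list(pop)))
            if scores.max() > -1e-3:  # MSE below 1e-3: good enough
                break
            pop = evolve(pop, scores, rng)
    print(f"generation {gen}, best MSE = {-scores.max():.5f}")
```

On XOR the parallel speedup is of course negligible, but the `pool.map` over the population is exactly the part that eats the month once the perceptron is large and the fitness function is expensive; that map is the natural place to throw cores at.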
Which limitation exactly are you talking about?