Machine learning in trading: theory, models, practice and algo-trading - page 513

 

I wrote my first serious program in assembly language on a Radio 86RK computer. )

 
Grigoriy Chaunin:

I wrote my first serious program in assembly language on a Radio 86RK computer. )


We wrote in assembler on the UMK training kits... when we were doing our labs. After those lessons our brains were pounding...

 

Lucky you. I started "writing" something about two years ago and went straight into MQL. I liked it, but I'm badly handicapped by the lack of basic knowledge. I don't even know how binary numbers differ from hexadecimal ones, or what a number with an "e" in it means.
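As an aside, the things mentioned above are just different notations for the same values; a minimal Python sketch, with arbitrarily chosen example numbers:

```python
# The same integer written in binary, hexadecimal and decimal notation,
# plus a float in scientific notation (the 'e' marks a power of ten).
n = 0b101010          # binary literal
m = 0x2A              # hexadecimal literal
print(n, m, n == m)   # 42 42 True -- same value, different notation

x = 1.5e-3            # scientific notation: 1.5 * 10**-3
print(x)              # 0.0015
```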

 

Intel is going to release a processor specifically for neural networks https://newsroom.intel.com/editorials/intel-pioneers-new-technologies-advance-artificial-intelligence/

Intel Pioneers New Technologies to Advance Artificial Intelligence
  • newsroom.intel.com
Today I spoke at the WSJDLive global technology conference about cognitive and artificial intelligence (AI) technology, two nascent areas that I believe will be transformative to the industry and world. These systems also offer tremendous market opportunity and are on a trajectory to reach $46 billion in industry revenue by 2020. At Intel...
 
elibrarius:

Intel is going to release a processor specifically for neural networks https://newsroom.intel.com/editorials/intel-pioneers-new-technologies-advance-artificial-intelligence/


IMHO

I'm sure neural networks will come into our lives, but I'm equally sure they won't be generally accessible for another 20-25 years.

I'd guess that right now we're at the level of a dog howling at the moon.

And at this stage of development, the big corporations won't share any worthwhile information about their networks.

And most likely it won't be just a processor, but something else entirely.

I may be talking nonsense, of course, but that's my opinion.

 
Vladimir Gribachev:

IMHO

I'm sure neural networks will come into our lives, but I'm equally sure they won't be generally accessible for another 20-25 years.

I'd guess that right now we're at the level of a dog howling at the moon.

And at this stage of development, the big corporations won't share any worthwhile information about their networks.

And most likely it won't be just a processor, but something else entirely.

I may be talking nonsense, of course, but that's my opinion.


All the information is in the public domain; it has already become firmly embedded in every field :) It's just that we Russian speakers are in the lagging echelon, still arguing about which features to pick and which model to use, because we haven't accumulated the experience yet.

 

Look what happened to Google Translate. It now runs on neural networks. But decent information on neural networks is hard to find, even in English. It's hard to find in the public domain; there is a lot of paid material, especially books. And I don't know English. I've tried to learn it, all to no avail.

 
Grigoriy Chaunin:

Look what happened to Google Translate. It now runs on neural networks. But decent information on neural networks is hard to find, even in English. It's hard to find in the public domain; there is a lot of paid material, especially books. And I don't know English. I've tried to learn it, all to no avail.


Well, yes, especially considering that Google gave its TensorFlow framework away for free :) What do you mean there's no information? There's plenty of it. Lots of videos on YouTube in Russian, too.
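For what it's worth, getting started with the free framework really does take only a few lines. A minimal sketch using TensorFlow's Keras API; the layer sizes and the random data are placeholders, not anything from this thread:

```python
# A tiny feed-forward network in TensorFlow's Keras API.
# Architecture and data are purely illustrative.
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),                       # 4 input features
    tf.keras.layers.Dense(8, activation="relu"),      # one hidden layer
    tf.keras.layers.Dense(1, activation="sigmoid"),   # binary output
])
model.compile(optimizer="adam", loss="binary_crossentropy")

X = np.random.rand(100, 4)            # dummy features
y = (X.sum(axis=1) > 2).astype(int)   # dummy labels
model.fit(X, y, epochs=5, verbose=0)
```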

 

There are a lot of videos, but they don't explain the subtleties.

 
Grigoriy Chaunin:

There are a lot of videos, but they don't explain the subtleties.

+1

There are thousands of articles on how to train neural networks with backpropagation and gradient descent, full of complicated formulas. But only by some miracle do you stumble on the rare article that explains the process in plain language, in terms of basic math and derivatives, which is exactly what you need to __understand__ how learning actually happens. Once you understand that, you can easily build a neural network in any convenient programming language, with any layers and activation functions.
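To make that concrete, here is a minimal backpropagation sketch in plain NumPy. The XOR data, the layer sizes and the learning rate are arbitrary illustrative choices; the point is that the whole "learning process" is just the chain rule applied layer by layer:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_deriv(y):
    # derivative of the sigmoid expressed through its output y = sigmoid(x)
    return y * (1.0 - y)

rng = np.random.default_rng(0)

# XOR as a toy training set (purely illustrative)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# one hidden layer of 3 neurons; sizes and learning rate are arbitrary
W1 = rng.normal(0.0, 1.0, (2, 3)); b1 = np.zeros((1, 3))
W2 = rng.normal(0.0, 1.0, (3, 1)); b2 = np.zeros((1, 1))
lr = 0.5

for _ in range(10000):
    # forward pass
    h   = sigmoid(X @ W1 + b1)    # hidden activations
    out = sigmoid(h @ W2 + b2)    # network output

    # backward pass: the chain rule, layer by layer
    d_out = (out - y) * sigmoid_deriv(out)       # gradient at the output layer
    d_h   = (d_out @ W2.T) * sigmoid_deriv(h)    # pushed back through W2

    # gradient-descent updates
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(out.round(2))  # should end up close to [[0], [1], [1], [0]]
```

Swapping in other layer sizes or activation functions only changes the forward lines and the corresponding derivative; the structure of the backward pass stays the same.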