Machine learning in trading: theory, models, practice and algo-trading - page 2147

 
RStudio AI Blog: FNN-VAE for noisy time series forecasting
  • 2020.07.31
  • blogs.rstudio.com
This post did not end up quite the way I’d imagined. A quick follow-up on the recent Time series prediction with FNN-LSTM, it was supposed to demonstrate how noisy time series (so common in practice) could profit from a change in architecture: Instead of FNN-LSTM, an LSTM autoencoder regularized by false nearest neighbors (FNN) loss, use...
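For reference, a minimal sketch of the LSTM-autoencoder base the post builds on (assuming TensorFlow/Keras). The actual FNN loss from the post is not reproduced here; a plain L1 activity penalty on the latent code stands in for it, purely to mark where the regularizer would attach:

```python
# LSTM autoencoder sketch; the L1 penalty is a placeholder for the FNN loss
import numpy as np
from tensorflow.keras import layers, regularizers, Model

n_timesteps, n_features, latent_dim = 32, 1, 10

inputs = layers.Input(shape=(n_timesteps, n_features))
# Encoder: compress the window into a latent code
code = layers.LSTM(latent_dim,
                   activity_regularizer=regularizers.l1(1e-3))(inputs)
# Decoder: repeat the code and reconstruct the window
x = layers.RepeatVector(n_timesteps)(code)
x = layers.LSTM(32, return_sequences=True)(x)
outputs = layers.TimeDistributed(layers.Dense(n_features))(x)

autoencoder = Model(inputs, outputs)
autoencoder.compile(optimizer="adam", loss="mse")

# Toy data: overlapping windows of a noisy sine wave
t = np.linspace(0, 100, 2000)
series = np.sin(t) + 0.3 * np.random.randn(len(t))
windows = np.stack([series[i:i + n_timesteps]
                    for i in range(len(series) - n_timesteps)])[..., None]
autoencoder.fit(windows, windows, epochs=3, batch_size=64, verbose=0)
```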
 
mytarmailS:

I haven't seen it, or maybe I saw it but didn't really listen...


Context space, how we think, and how it could be implemented in code...

VAE and CNN do the same thing, but as special cases of creating secondary features from primary ones; he explains how to work with any kind of data.
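To illustrate that reading of "secondary features from primary" (my own sketch, assuming TensorFlow/Keras, not anything from his videos): cut a CNN at an inner layer and use that layer's activations as a new feature set for any downstream model:

```python
# Hypothetical example: a CNN's inner layer as a secondary-feature extractor
import numpy as np
from tensorflow.keras import layers, Model, Sequential

cnn = Sequential([
    layers.Input(shape=(28, 28, 1)),
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(32, activation="relu", name="secondary_features"),
    layers.Dense(10, activation="softmax"),
])

# Secondary features out of primary pixels: cut the net at the named layer
extractor = Model(cnn.input, cnn.get_layer("secondary_features").output)
primary = np.random.rand(5, 28, 28, 1).astype("float32")
secondary = extractor.predict(primary, verbose=0)
print(secondary.shape)  # (5, 32): the new feature space
```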

Context spaces are implemented through recurrent networks and the like. What did he invent?

 
Maxim Dmitrievsky:

It's mostly filler; I didn't see any links to code.

Well, the whole topic runs for 12 hours ) He just wants to model the whole brain, not only the computing part but also the memory.

 
mytarmailS:

Well, the whole topic runs for 12 hours ) He just wants to model the whole brain, not only the computing part but also the memory.

Memory is already modeled by the latent states in recurrent networks.
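A tiny numpy sketch of what that means: the hidden (latent) state of a recurrent cell folds in every input it has seen, so it acts as the memory:

```python
# Minimal recurrent step: h_t = tanh(W_x * x_t + W_h @ h_{t-1})
import numpy as np

rng = np.random.default_rng(0)
W_x = rng.normal(size=(4, 1))        # input weights
W_h = rng.normal(size=(4, 4)) * 0.5  # recurrent weights

h = np.zeros((4, 1))                 # latent state = the "memory"
for x_t in [1.0, -0.5, 2.0, 0.0]:    # an input sequence
    h = np.tanh(W_x * x_t + W_h @ h) # each step folds x_t into the state
print(h.ravel())  # depends on the whole history, not just the last input
```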

I think he's just a windbag.

 
Maxim Dmitrievsky:

Context spaces are implemented through recurrent networks and the like. What did he invent?

I won't try to explain it, I'm not sure I understand it myself...

To put it simply: various transformations are applied to the features, billions of transformations (hello, MGUA a.k.a. GMDH), and if some transformation gives the "needed" response on the features, then we have found a "context". There are thousands of such contexts, and we effectively move into a new feature space (the space of contexts)...

And now we don't even need to look at the primary features; recognition runs on the contexts...
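One possible way to read that in code (my interpretation, not his implementation): generate masses of random transformations of the primary features, keep the ones whose output lines up with the target (gives the "needed" response), and let the survivors form the new feature space:

```python
# Hypothetical "context search": brute-force random feature transformations
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 10))                    # primary features
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)  # target

def random_transform(rng, n_features):
    """A random nonlinear projection of the primary features."""
    w = rng.normal(size=n_features)
    f = [np.sin, np.tanh, np.abs][rng.integers(3)]
    return lambda X: f(X @ w)

candidates = [random_transform(rng, X.shape[1]) for _ in range(5000)]
# Score each transformation by how strongly it responds to the target
scores = [abs(np.corrcoef(t(X), y)[0, 1]) for t in candidates]
keep = np.argsort(scores)[-100:]                  # the found "contexts"
contexts = np.column_stack([candidates[i](X) for i in keep])
print(contexts.shape)  # (500, 100): recognition can now run on contexts
```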


Why this is so, and how it's 1000 times better than current technology, takes the video to explain, but believe me, it's worth watching; the future belongs to this approach.

He has a lot of videos

 
mytarmailS:

I won't try to explain it, I'm not sure I understand it myself...

To put it simply: various transformations are applied to the features, billions of transformations (hello, MGUA a.k.a. GMDH), and if some transformation gives the "needed" response on the features, then we have found a "context". There are thousands of such contexts, and we effectively move into a new feature space (the space of contexts)...

And now we don't even need to look at the primary features; recognition runs on the contexts...


Why this is so, and how it's 1000 times better than current technology, takes the video to explain, but believe me, it's worth watching; the future belongs to this approach.

He has a lot of videos

I flipped through his videos... when I got to some rubbish about pyramids, I turned it off ))

As far as I understand, he subscribes to some kind of wave theory of the brain. Esoterics, in short.

I got the impression he's a babbler.

 
Maxim Dmitrievsky:

I flipped through his videos... when I got to some rubbish about pyramids, I turned it off ))

I got the impression he's a babbler.

So don't just flip through it, actually watch it, though not the pyramids, of course. Is it bad that the man is versatile?

He's a cool guy, an AI researcher...



I mean, if you train a network on cats and show it a picture of a scratched-up couch, it won't cope, because that wasn't among its training classes.

But with contexts, it recognizes that a cat was there. And it explains how and why, and we think in exactly the same way, through contexts.

 
mytarmailS:

So don't just flip through it, actually watch it, though not the pyramids, of course. Is it bad that the man is versatile?

He's a cool guy, an AI researcher...

What is there to watch? Where's the NN code? Listening to an hour of verbal diarrhea isn't interesting.

 
Maxim Dmitrievsky:

What is there to watch? Where's the NN code? Listening to an hour of verbal diarrhea isn't interesting.

I don't know, I watched everything with my mouth open...

Even the pyramids....

 
mytarmailS:

I don't know, I watched everything with my mouth open...

Even the pyramids....

It reminded me of the ramblings of a lunatic )