Machine learning in trading: theory, models, practice and algo-trading - page 3105

 
Aleksey Nikolayev #:

It's about time we all moved to the bright side - to mathematical statistics!)

The dark side, as always, opposes it) Dark in the sense that it always tries to reduce everything to the murky and unclear - in the extreme case, to some kind of "gut feeling").

What does mathematical statistics have to do with it?

The man is practising on STATIONARY series, and we are discussing his video in all seriousness! It has nothing to do with us at all, together with his null hypotheses.

 
Aleksey Nikolayev #:

It's about time we all moved to the bright side - to mathematical statistics!)

Or to reproducible examples in the form of code.

 
SanSanych Fomenko #:

What does mathematical statistics have to do with it?

The video is a successful attempt to explain, at a simple but meaningful level, things that are important for understanding mathematical statistics.

SanSanych Fomenko #:

The man is practising on STATIONARY series, and we are discussing the video in all seriousness! It has nothing to do with us at all, together with his null hypotheses.

You, I recall, were practising on GARCH models, which are usually stationary too) And the "null hypothesis" is just basic statistical terminology that you simply have to know and understand.

 
Aleksey Nikolayev #:


The video is a successful attempt to explain, at a simple but meaningful level, things that are important for understanding mathematical statistics.

From a general educational point of view - sure. But it is much more important to discuss only what is applicable to financial time series.


You, I recall, were practising on GARCH models, which are usually stationary too)

Since when are GARCH models stationary?

The premise of GARCH is that the original series is NOT stationary; moreover, even the differenced time series is NOT stationary. GARCH is an attempt to model that non-stationarity of the original series. Look at rugarch: there, in the model specification itself, three features of the pre-differenced series are modelled - features which classify the series as non-stationary.

 
It feels (and it is not just a feeling) that negative professional deformation has reached such proportions that material is no longer perceived "as is", but takes a convoluted path through the adhesions of former neural victories, and this enriched "truth" is ejected back out through the mouth under pressure.
 
SanSanych Fomenko #:

Since when are GARCH models stationary?

It has always been stationary: a GARCH(p,q) process is covariance-stationary provided that the sum of all its p+q coefficients is less than one.
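This condition can be checked numerically. Below is a minimal GARCH(1,1) sketch in plain NumPy (my own illustration, not rugarch's implementation): when alpha + beta < 1, the process is covariance-stationary with unconditional variance omega / (1 - alpha - beta), even though the conditional variance wanders over time.

```python
import numpy as np

def simulate_garch11(omega, alpha, beta, n, seed=0):
    """Simulate a GARCH(1,1) return series (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    eps = np.empty(n)
    # Start at the unconditional variance, which exists when alpha + beta < 1.
    sigma2 = omega / (1.0 - alpha - beta)
    for t in range(n):
        eps[t] = np.sqrt(sigma2) * rng.standard_normal()
        # The conditional variance changes every step (volatility clustering),
        # but the unconditional moments of the process stay constant.
        sigma2 = omega + alpha * eps[t] ** 2 + beta * sigma2
    return eps

# alpha + beta = 0.9 < 1, so the unconditional variance is 0.1 / (1 - 0.9) = 1.0
returns = simulate_garch11(omega=0.1, alpha=0.1, beta=0.8, n=100_000)
print(np.var(returns))  # sample variance settles near 1.0
```

The volatility clusters are exactly what the model captures, but the unconditional moments are constant - that is the sense in which a GARCH process is stationary.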

 
What would be the problem with taking some other test, one meant for non-stationary series, and doing the same with it? Would that change the point?
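The core idea behind such unit-root tests (e.g. the Dickey-Fuller family) can be sketched in a few lines: regress x[t] on x[t-1]; an estimated coefficient near 1 signals a unit root (non-stationarity), while a coefficient well below 1 signals mean reversion. A toy illustration in Python (the helper name is my own, not a library function):

```python
import numpy as np

def ar1_coefficient(x):
    """OLS estimate of phi in x[t] = phi * x[t-1] + noise (no intercept)."""
    x_lag, x_now = x[:-1], x[1:]
    return float(np.dot(x_lag, x_now) / np.dot(x_lag, x_lag))

rng = np.random.default_rng(42)
shocks = rng.standard_normal(50_000)
random_walk = np.cumsum(shocks)   # price-like, non-stationary: phi ~ 1
white_noise = shocks              # stationary: phi ~ 0

print(ar1_coefficient(random_walk))  # close to 1
print(ar1_coefficient(white_noise))  # close to 0
```

A real test (such as statsmodels' `adfuller` or R's `ur.df`) adds lagged differences and proper critical values on top of this regression, but the diagnostic quantity is the same.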
 
Maxim Dmitrievsky #:
It feels (and it is not just a feeling) that negative professional deformation has reached such proportions that material is no longer perceived "as is", but takes a convoluted path through the adhesions of former neural victories, and this enriched "truth" is ejected back through the mouth under pressure.

So true) And it shows with frightening clarity that intellectually most of us may well be replaced by AI)

 
Aleksey Nikolayev #:

So true) And it shows with frightening clarity that intellectually most of us may well be replaced by AI)

Yes, the limits are already being felt; it seems to me such developments are not far off :)
 
Aleksey Nikolayev #:

that intellectually most of us could very well be replaced by AI)

Yes....

But we still have a few years, or months, to go.))


For now, there are two obstacles to launching a strong AI:

1. Architectures that are too resource-hungry

2. Hardware that is too weak

These are essentially two sides of the same coin...

But work is underway to solve both the first problem and the second...


No one is in a hurry to change the architecture (neural networks are our everything), though they will have to; with fast hardware (quantum computers), on the other hand, things are moving much more actively.