Machine learning in trading: theory, models, practice and algo-trading - page 3502

 
Aleksey Nikolayev #:

Within the Mahabharata, the Bhagavad Gita is the important part, where Krishna explains karma yoga to Arjuna. "Do what must be done, and be what will be" suits trading very well, for example)

Yes, it's probably the most famous and powerful text :)

 
Aleksey Vyazmikin #:

Well, this philosophy teaches humility... but it's not good for development.

Development is usually (very often) confused with a change of scenery. Especially when memory is bad.

 
Maxim Dmitrievsky #:

Hmm... try explaining how your approach differs from clustering/mode detection. What are the advantages.

Well, the approaches are different: my method distills the data, which is especially relevant under strong class imbalance. Along with the sifting process, the data is explored at each iteration. Primarily it is an exploratory method for selecting predictors and quant tables. An additional goal is to make training other classifiers easier by removing from the sample the examples that fall into predictor ranges that are difficult to classify.

The process can be visualised as a tree like this


The tree has gone through two iterations and is entering a third. In the oval is the data that we will no longer explore when building the current model - class "0".

And then I ran 100 such iterations and looked at the probability of making a split that would be just as effective at separating zeros from ones on new data.
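The sifting idea described above could be sketched roughly like this (a hypothetical Python/scikit-learn illustration, not the author's actual code; the depth-1 tree, the 0.95 confidence threshold, and the synthetic data are all my assumptions):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)

# Imbalanced synthetic data: class "1" is the rare one.
X0 = rng.normal(0.0, 1.0, size=(900, 2))   # class "0", majority
X1 = rng.normal(2.5, 0.5, size=(100, 2))   # class "1", minority
X = np.vstack([X0, X1])
y = np.array([0] * 900 + [1] * 100)

# Iterative sifting: at each iteration fit a shallow tree and drop the
# class-"0" examples that land in leaves the tree already classifies as
# "0" with high confidence -- those regions are not explored further.
for iteration in range(3):
    stump = DecisionTreeClassifier(max_depth=1, random_state=0).fit(X, y)
    proba0 = stump.predict_proba(X)[:, 0]   # P(class 0) per example
    sift = (y == 0) & (proba0 > 0.95)       # confidently-"0" examples
    if not sift.any():
        break
    X, y = X[~sift], y[~sift]
    print(f"iteration {iteration + 1}: {len(y)} examples left")
```

Each pass removes the "solved" part of the majority class, so the remaining sample becomes progressively more balanced for whatever classifier is trained afterwards.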

 
Aleksey Vyazmikin #:

Well, the approaches are different: my method distills the data, which is especially relevant under strong class imbalance. Along with the sifting process, the data is explored at each iteration. Primarily it is an exploratory method for selecting predictors and quant tables. An additional goal is to make training other classifiers easier by removing from the sample the examples that fall into predictor ranges that are difficult to classify.

The process can be visualised as a tree like this


The tree has gone through two iterations and is entering a third. In the oval is the data that we will no longer explore when building the current model - class "0".

And then I ran 100 such iterations and looked at the probability of making a split that would be just as effective at separating zeros from ones on new data.

How is it different from clustering?)
 
Aleksey Nikolayev #:
Human hubris does not lead to anything good. Especially when multiplied by Dunning-Kruger.)

My message was that radicals drive progress. Humility belongs to the stage of waiting for death: accepting the inevitable, with no possibility of any other outcome.

 
Maxim Dmitrievsky #:

Usually (very often) development is confused with a change of scenery. Especially when memory is bad.

Maybe. Much depends on the context. It is not very convenient to discuss such topics on the forum - you need to feel that the other person is on the same wavelength and follows the weaving of words.

 
Maxim Dmitrievsky #:
How is it different from clustering?)

You'd better tell me what similarities you see. Except that the goal is similar - to improve learning.

In my case, learning is supervised (with a teacher); clustering is unsupervised (without a teacher).
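The distinction can be shown in a toy sketch (hypothetical Python/scikit-learn code, invented purely to illustrate supervised vs. unsupervised, not anyone's trading pipeline):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)

# Two well-separated groups with known labels.
X = np.vstack([rng.normal(0, 1, size=(50, 2)),
               rng.normal(4, 1, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

# Without a teacher: clustering never sees y, it only groups X.
clusters = KMeans(n_clusters=2, n_init=10, random_state=1).fit_predict(X)

# With a teacher: the classifier is trained against the labels y.
preds = DecisionTreeClassifier(random_state=1).fit(X, y).predict(X)
```

The clusterer receives only `X` and invents its own group indices; the classifier's output is anchored to the target `y` from the start.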

 
Aleksey Vyazmikin #:

You'd better tell me what similarities you see. Except that the goal is similar - to improve learning.

In my case, learning is supervised (with a teacher); clustering is unsupervised (without a teacher).

The similarity is dividing into groups and analysing those groups afterwards.

 
Aleksey Vyazmikin #:

Well, different approaches, in my method there is distillation of data, especially relevant.........

This is again not clear to anyone...

1. Make a reproducible code example in the format: it was like this, we did this, it became like this.

2. The code should be as compact and understandable as possible.

3. No unnecessary actions in the code.

4. Real gigabyte-sized data is not needed; it is better to make a small but sufficient synthetic dataset.

5. Provide the code.

6. Check three more times that there is nothing unnecessary in the code that could hinder understanding, complicate reproducing it, or just make it unreasonably complicated.

7. Set the random seeds.
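As a hypothetical illustration of the checklist above (a minimal Python/scikit-learn sketch; the dataset, model, and all names are invented for the example, not taken from the thread):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# 7. Set the seeds.
SEED = 0
rng = np.random.default_rng(SEED)

# 4. Small but sufficient synthetic dataset ("it was like this").
X = rng.normal(size=(200, 3))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

# "We did this": train one compact model, nothing extra.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=SEED)
model = DecisionTreeClassifier(max_depth=3, random_state=SEED).fit(X_tr, y_tr)

# "It became like this": a single reproducible result.
acc = accuracy_score(y_te, model.predict(X_te))
print(f"test accuracy: {acc:.2f}")
```

With the seeds fixed, anyone running this gets the same dataset, the same split, and the same accuracy figure - which is the whole point of the checklist.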
 
Aleksey Vyazmikin #:

My message was that radicals drive progress. Humility belongs to the stage of waiting for death: accepting the inevitable, with no possibility of any other outcome.

Radicals make a lot of noise; they are heard well, and so a lot is attributed to them. Real progress is made in silence. The latest leap in AI is good proof of this - the hype arrived just when real progress had been put on hold.