Machine learning in trading: theory, models, practice and algo-trading - page 2743
in addition to Maxim Dmitrievsky.
That's what I'm talking about:
For classification you shouldn't just label on up/down direction; first run at least a cluster analysis to determine how many classes to formally allocate (what to call them is a matter of subjective taste)...
and only then determine the discriminant functions on the basis of which samples are assigned to one class or another... then classification will come with a normal percentage of accuracy, because we know that the classes we divide into really exist...
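The two-step recipe above (cluster first to find how many classes exist, then fit discriminant functions) could be sketched roughly as follows. The synthetic data, k-means with a silhouette criterion, and LDA as the discriminant step are all illustrative assumptions; the post names no specific algorithms.

```python
# Hypothetical sketch of the two-step recipe: cluster analysis to choose the
# number of classes, then discriminant functions to assign samples to them.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
# three well-separated blobs standing in for "real" regimes in the data
X = np.vstack([rng.normal(loc=m, scale=0.5, size=(100, 4)) for m in (0.0, 3.0, 6.0)])

# step 1: cluster analysis -- pick the number of classes by silhouette score
# instead of assuming two (up/down)
scores = {}
for k in range(2, 7):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    scores[k] = silhouette_score(X, labels)
best_k = max(scores, key=scores.get)

# step 2: fit discriminant functions on the discovered classes
labels = KMeans(n_clusters=best_k, n_init=10, random_state=0).fit_predict(X)
lda = LinearDiscriminantAnalysis().fit(X, labels)
print(best_k, lda.score(X, labels))
```

On real market features the clusters are far less separable than in this toy, so the silhouette curve is usually much flatter and the choice of k much less clear-cut.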
PCA is only one variant of factor analysis for selecting orthogonal features, but it explains ALL the variance, without selecting the main factors,
because the principal components are just the raw data transformed by the eigenvector coefficients ("loadings"): multiplying the raw data by the loadings gives the pc_scores... (something like that; I haven't gone through the algorithm in a long time)
but in the end PCA explains ALL the variance, without feature selection... In contrast, principal factor analysis uses "only the variation of a variable that is common to other variables as well"... (I don't insist that this is the best feature selection -- but there are nuances everywhere)
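The mechanics described above can be checked directly in numpy on synthetic data: the pc_scores are just the (centered) raw data multiplied by the eigenvector coefficients ("loadings"), and with all components kept PCA accounts for all the variance.

```python
# Verify: pc_scores = centered data x loadings, and with every component kept
# the explained-variance ratios sum to 1 (PCA "explains ALL the variance").
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 4))

pca = PCA().fit(X)             # keep all 4 components
loadings = pca.components_.T   # eigenvectors of the covariance matrix, as columns
pc_scores = (X - X.mean(axis=0)) @ loadings

print(np.allclose(pc_scores, pca.transform(X)))   # True: scores = data x loadings
print(pca.explained_variance_ratio_.sum())        # 1.0: all variance accounted for
```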
In general nobody can do feature selection correctly, and then they blame the library...
PCA combined with cutting off everything beyond +/-3 standard deviations can help to remove outliers, but only for a normal distribution, and you still have to prove (also statistically!) that your general population obeys the normal law; otherwise PCA will show you "oil on Repin" (and not the other way round)...
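A sketch of that outlier step, under illustrative assumptions (synthetic data, two components, D'Agostino's normality test): test normality first rather than assume it, then drop samples whose PCA scores fall outside +/-3 standard deviations.

```python
# Hypothetical sketch: normality check, then +/-3 sigma cutoff on PCA scores.
import numpy as np
from scipy import stats
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 5))
X[:3] += 15.0                  # plant three gross outliers

scores = PCA(n_components=2).fit(X).transform(X)

# the +/-3 sigma rule presumes normality, so test it rather than assume it
# (here the planted outliers themselves will make the test reject)
for j in range(scores.shape[1]):
    _, p = stats.normaltest(scores[:, j])
    print(f"component {j}: normality p-value = {p:.3g}")

z = np.abs((scores - scores.mean(axis=0)) / scores.std(axis=0))
mask = (z < 3).all(axis=1)     # keep rows within 3 sigma on every component
X_clean = X[mask]
print(f"kept {mask.sum()} of {len(X)} samples")
```

Note the point made above applies here too: fat-tailed financial returns typically fail the normality test, in which case the 3-sigma cutoff throws away legitimate tail observations rather than "outliers".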
== I see a statistically adequate way to build a model roughly like this...
===
and the library is the tenth concern (even if moderators call names without understanding what we are talking about; in normal DataScience, losers always dream of a banquet and blame the whole world). Those who really want to understand realised long ago that it is not the programmer's language that matters but the algorithms behind the entities, even when implemented in someone else's library; the name of the library does not change the essence of the cause-and-effect relations.
P.S.
While the moderators are at the banquet, others are working; take an example from them, and don't spread misinformation.
PCA is only one variant of factor analysis for selecting orthogonal features, but it explains ALL the variance, without selecting the main factors,
There is PCA that takes the target into account; it will isolate components that characterise the target, but the sad part is that the target is a subjective variable, and it will "float" as soon as training is over... and how is that different from ordinary supervised learning?
Said by whom? By a moron who can't string three words together without breaking logic )) ahaha, seriously...
You don't understand turns of phrase, you don't understand when something is written for brevity, you don't understand definitions; that is, nothing.
You're just phoning it in off-topic. That's the hallmark of a ptushnik (vocational-school kid).
Nobody's accusing you of that, people are different. Just don't wade in where you're clueless, don't get involved :D
Genius )) we write any rubbish, and if someone who thinks calls you out, you tell them: you don't understand speech patterns, ptushnik.
What do you have against ptushniks? Aren't they people? Or is your ex from there?
...
Write in proper Russian, it's impossible to read you. And with the examples it's even funnier. Because of this I usually don't read most of your posts at all.
To summarise Sanych's theory (since he himself failed to formalise it properly and give examples):
After Sanych's explanations I've somewhat stopped understanding what significant predictors mean in the end. According to his explanation, they occur frequently and their magnitude correlates with the result. But these are apparently general properties of the series over the entire training period. I can't quite match this to anything in a model of the series. It turns out these are predictors that work always (to put it very simply), or at least most often. In general, it is clear that using settings that work most often will give a more positive result than using settings that work only on a certain segment...
I don't get the picture of what is ultimately being searched for, and why.
Fine, let me be the ptushnik, blame everything on me; and you cool down and stay more on the matter, as that would be better, with arguments, and if with jokes, then without the childish teasing ))))