Not the Grail, just a regular one - Bablokos!!! - page 427

 
Aleksandr Volotko:

Come on, then some other thing will come along that you will definitely have to beat, and then!... off to the factory, if you don't beat that next thing of course, and then the next one, etc., etc.

Another 20-30 years of suffering to test it all, and then definitely off to the factory.


What are some possible ways to bypass/minimize the effects of phase variability?

  • traditional: over-optimization in the hope that the parameters keep some inertia for at least part of the optimization length
  • predictive: an attempt to determine the drift of the optimum and get ahead of it in advance
  • oscillatory: the idea of oscillating phases and betting on the antiphase in advance
  • statistical: determine the frequencies of optimums across zones and bet on the zones with the higher frequency
  • dynamical-statistical: the same, but taking previous optimum values into account (a Bayesian scheme; see the sketch after this list)
  • fundamental: assess future market sentiment and select the flat/breakout phase manually
  • occult-magical: indescribable godlike rituals and sacrifices
  • autoregressive: an autoregressive model plus additional factors
  • non-parametric: use of data-mining/AI methods to estimate the future phase
  • labour: stop speculative trading and go to the factory
  • timing: phase-length analysis and starting the trading cycle only after some interval of duration
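
A minimal sketch of how the dynamical-statistical item could be tried (the zone edges, decay factor and list of past optimums below are made-up assumptions, not anyone's actual system): split the parameter range into zones, count which zone the optimum landed in on past re-optimization windows, fade the old counts so recent optimums weigh more, and bet on the zone with the highest estimated frequency.

```python
# A crude "Bayesian" take on optimum-frequency-by-zone: counts per zone act as
# pseudo-observations and are decayed so that recent optimums dominate.
import numpy as np

zone_edges = np.array([0, 20, 40, 60, 80, 100])  # hypothetical parameter range split into 5 zones
decay = 0.9                                       # how quickly old optimums lose influence

# optimums found on past re-optimization windows (hypothetical data)
past_optimums = [12, 35, 38, 41, 77, 39, 44]

counts = np.ones(len(zone_edges) - 1)             # flat prior: one pseudo-count per zone
for opt in past_optimums:
    counts *= decay                               # fade old evidence
    zone = np.searchsorted(zone_edges, opt, side="right") - 1
    counts[zone] += 1.0                           # record the zone of the new optimum

posterior = counts / counts.sum()
print("zone probabilities:", np.round(posterior, 3))
print("bet on zone:", int(np.argmax(posterior)))  # the zone with the highest estimated frequency
```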

What is the problem with statistical approaches: even a countable number of parameters can already be a problem. With models that are too simple you may not capture the significant factors of phase differentiation, but every new variable added makes the Cartesian space grow dramatically, not to mention the calculation speed, and the MT5 tester is frankly not the fastest, because it does many advanced things there: modelling ticks, delays and so on. For a portfolio this is done for all symbols, and many times over, so you can even sacrifice accuracy and precompute point values for the history intervals in advance, which may speed up the work. I heard in passing that there are people who spent a lot of money on cloud optimization and denied themselves food and clothes, hehe... A separate task is optimizing the code with pre-caching of all the necessary data, which should also noticeably improve test speed...
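
A minimal sketch of that pre-caching idea (in Python rather than MQL5, with a synthetic price series and invented names, purely for illustration): load each symbol's history once, keep derived series in memory, and let every optimization pass reuse the cache instead of rebuilding the same data.

```python
# Pre-caching sketch: the expensive data preparation happens once, every
# subsequent optimization pass only recomputes the parameter-dependent part.
import numpy as np
import pandas as pd

class HistoryCache:
    """Loads each symbol's history once and caches derived series across passes."""

    def __init__(self):
        self._bars = {}      # symbol -> bar DataFrame, built once
        self._derived = {}   # (symbol, name) -> precomputed Series

    def bars(self, symbol: str) -> pd.DataFrame:
        if symbol not in self._bars:
            # stand-in for the expensive one-time load (disk, tester export, etc.)
            prices = 1.10 + np.cumsum(np.random.normal(0, 1e-4, 100_000))
            self._bars[symbol] = pd.DataFrame({"close": prices})
        return self._bars[symbol]

    def derived(self, symbol: str, name: str, builder) -> pd.Series:
        key = (symbol, name)
        if key not in self._derived:
            self._derived[key] = builder(self.bars(symbol))
        return self._derived[key]

cache = HistoryCache()

def run_pass(symbol: str, fast: int, slow: int) -> float:
    """One optimization pass: only the parameter-dependent part is recomputed."""
    close = cache.derived(symbol, "close", lambda df: df["close"])
    signal = close.rolling(fast).mean() - close.rolling(slow).mean()
    return float(signal.iloc[-1])   # placeholder "result" of a pass

# every combination after the first reuses the cached history instead of reloading it
for fast, slow in [(10, 50), (20, 100), (30, 150)]:
    print(fast, slow, run_pass("EURUSD", fast, slow))
```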

 
You could have just written: everything is fading.
 

17 synthetics have started to move through their 3-week channels in the right direction


 
sbmill:

17 synthetics have started to move through their 3-week channels in the right direction


great with a retest even

 
transcendreamer:

What is the problem with statistical approaches: even a countable number of parameters can already be a problem. With models that are too simple you may not capture the significant factors of phase differentiation, but every new variable added makes the Cartesian space grow dramatically, not to mention the calculation speed

Yes, as an option for testing some ideas: once the necessary statistics have been exported, you can use MS EXCEL to optimise the parameters. Or the more advanced functionality of statistical packages, though of course you need to understand why you are using it and what to look for. EXCEL is quite enough, even for dynamic plots. However...

Even in EXCEL, if you run an optimisation over the parameters it can take a very long time (sometimes you start thinking about a quantum computer), so sometimes you have to find a balance between the number of optimised parameters and the data processing time.
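
To put a number on that balance, a toy illustration (parameter names, step counts and per-pass cost are all invented): every added parameter multiplies the size of the full grid, which is why a head-on optimisation over several parameters drags on for so long, in EXCEL or anywhere else.

```python
# How the full parameter grid blows up: the run count is the Cartesian product
# of every parameter's value list, so each new parameter multiplies it.
from itertools import product

param_grid = {
    "period_fast":  range(5, 55, 5),      # 10 values
    "period_slow":  range(20, 220, 20),   # 10 values
    "stop_loss":    range(10, 110, 10),   # 10 values
    "take_profit":  range(10, 110, 10),   # 10 values
}

runs = 1
for name, values in param_grid.items():
    runs *= len(values)
    print(f"after adding {name:12s}: {runs:>8,} combinations")

seconds_per_run = 0.5                      # assumed cost of one tester/EXCEL pass
print(f"full grid at {seconds_per_run}s per pass: ~{runs * seconds_per_run / 3600:.1f} hours")

# the full Cartesian product itself, if you really want to walk the whole grid
grid = product(*param_grid.values())       # lazy iterator over all 10,000 combinations
```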

 
Ilmir Galiev:

Yes, as an option for testing some ideas: once the necessary statistics have been exported, you can use MS EXCEL to optimise the parameters. Or the more advanced functionality of statistical packages, though of course you need to understand why you are using it and what to look for. EXCEL is quite enough, even for dynamic plots. However...

Even in EXCEL, if you run an optimisation over the parameters it can take a very long time (sometimes you start thinking about a quantum computer), so sometimes you have to find a balance between the number of optimised parameters and the data processing time.

You could check it in R, if you can manage the language.

 
transcendreamer:

great with a retest even

Yeah, and stick a poker (1-2-3, Ross hook) on it and it would be just great :)
 
Aleksander:
Yeah, and stick a poker (1-2-3, Ross hook) on it and it would be just great :)

it's a pattern, how do you stick it on?

whatever will be, will be ;)

 
transcendreamer:


What are some options for bypassing/minimising the effects of phase variability?

  • traditional: over-optimization in the hope that the parameters keep some inertia for at least part of the optimization length
holy crap, not an option, the parameters get lost much faster than you think; relying on inertia is a dead end

  • predictive: attempt to determine the drift of the optimum and get ahead of it in advance
even more of a dead end than the previous one, 50/50.

  • oscillatory: the idea of oscillatory phases and betting on an antiphase in advance
futility arising from the previous futility, also 50/50 - you will either meet a dinosaur in the street or you won't

  • statistical: determine frequencies of optimums in zones and bet on zones with higher frequency
statistical crap

  • Dynamical-statistical: the same, but taking into account previous values of optimums (Bayesian scheme)
dynamic-statistical crap

  • fundamental: assess future market sentiment and select the flat/breakout phase manually
general fog, manual fiddling

  • occult-magical: indescribable godlike rituals and sacrifices
artistic futility

  • autoregressive: autoregressive model + additional factors
again...

  • non-parametric: use of data-mining/AI methods to estimate the future phase
and again futility

  • labour: stop speculative trading and go to the factory
it would seem not to be a washout, but it is a washout all the same, for there is no profit there - only misery, and a complete lack of understanding on one's deathbed as to why one lived such a life, in fact...

  • timing: phase-length analysis and starting the trading cycle only after some interval of duration
A washout, hello from the dinosaurs above.
---
you forgot market analysis on drugs and assorted mushrooms, but that too is a washout, and you know it yourself, so why didn't you say so?

What is the problem with statistical approaches: even a countable number of parameters can already be a problem. With models that are too simple you may not capture the significant factors of phase differentiation, but every new variable added makes the Cartesian space grow dramatically, not to mention the calculation speed, and the MT5 tester is frankly not the fastest, because it does many advanced things there: modelling ticks, delays and so on. For a portfolio this is done for all symbols, and many times over, so you can even sacrifice accuracy and precompute point values for the history intervals in advance, which may speed up the work. I heard in passing that there are people who spent a lot of money on cloud optimization and denied themselves food and clothes, hehe... A separate task is optimizing the code with pre-caching of all the necessary data, which should also noticeably improve test speed...

Speeding up the optimizations is also useless, because there is no profit there either; no matter how fast you optimize, uselessness will still be uselessness at the output, a genuine one...

 
Aleksandr Volotko:
Holy crap, not an option, the parameters get lost much faster than you think; relying on inertia is a dead end

even more of a dead end than the previous one, 50/50

futility arising from the previous futility is also 50/50 - you may or may not encounter a dinosaur in the street.

statistical crap

dynamic-statistical crap

general fog, manual fiddling

artistic futility

again...

and again futility

it would seem not to be a washout, but it is a washout all the same, for there is no profit there - only misery, and a complete lack of understanding on one's deathbed as to why one lived such a life, in fact...

"A cakewalk, hello from the dinosaurs above.
---
you forgot market analysis on drugs and assorted mushrooms, but that too is a washout, and you know it yourself, so why didn't you say so?

Speeding up the optimizations is also useless, because there is no profit there either; no matter how fast you optimize, uselessness will still be uselessness at the output, a genuine one...

And what if you do it in combination? Say, eat mushrooms, train a neural network and optimize on statistics.