Machine learning in trading: theory, models, practice and algo-trading - page 160

 
Dmitry:

10% is the deposit load.

If you have a deposit of $1,000, you load it up by 10% - you open a trade for $100.

Now, note: depending on the leverage your broker/dealing center provides, that $100 of margin controls lots of different sizes - $10,000 (1:100), $5,000 (1:50), $20,000 (1:200).
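To sanity-check the arithmetic, here is a minimal Python sketch (the function name and numbers just reproduce the figures quoted above):

```python
def notional(margin: float, leverage: int) -> float:
    """Notional position size that a given margin controls at a given leverage."""
    return margin * leverage

deposit = 1_000
margin = deposit * 0.10              # 10% deposit load -> a $100 trade
for lev in (50, 100, 200):
    print(f"1:{lev} -> ${notional(margin, lev):,.0f}")
# 1:50 -> $5,000   1:100 -> $10,000   1:200 -> $20,000
```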

P.S. fuckerbaby........


Mmmm, I sense a "breakthrough" coming, and it will all boil down to "bet the whole deposit and double up when you lose", hee-hee...

Looking at real brokers, you can't always see what is inside the portfolio or how the capital is deployed - arbitrage may run more leverage, trend following less - but hardly anyone who isn't a sucker will risk more than 2-3% of capital on a single trade in a portfolio. A portfolio may hold hundreds of positions and be more than two-thirds loaded, but even in theory it should never happen that one piece of news suddenly takes more than 10% of the capital. That is nonsense - propaganda from bucket-shop "kitchens" about "accelerating the deposit" and similar rubbish.
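For reference, the 2-3% risk-per-trade rule mentioned above is usually applied by sizing the position off the stop distance; a minimal Python sketch (names and numbers are illustrative, not from the post):

```python
def position_size(capital: float, risk_pct: float, entry: float, stop: float) -> float:
    """Units to trade so that hitting the stop loses at most risk_pct of capital."""
    risk_amount = capital * risk_pct      # e.g. 2% of $10,000 = $200
    risk_per_unit = abs(entry - stop)     # loss per unit if the stop is hit
    return risk_amount / risk_per_unit

# Example: $10,000 account, 2% risk, buy at 1.1000 with a stop at 1.0950
units = position_size(10_000, 0.02, 1.1000, 1.0950)
print(f"{units:,.0f} units")              # 40,000 units: a 50-pip stop costs $200
```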

 
mytarmailS:

Thanks a lot, but the script doesn't work the way I thought it would - the levels come out even lower than with the first method...

I understand that I should not tie myself to high-low prices but do something like this:

just round the price scale - right now the minimum move is 1 pip; make the minimum move, say, 20 pips, but each 20-pip step carries the sum of the volume that passed within those 20 pips... I'd better draw it, I can't make sense of what I've written myself.

Here's the link to the figure: http://prntscr.com/ct8kgg

I tried it 10 times

I think it's impossible to do this for a non-tick chart, isn't it?
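What is being described sounds like binning prices onto a coarser grid and summing the traded volume inside each bin (a volume-by-price profile). A minimal pandas sketch of that idea - the column names, pip size, and bin width are assumptions, not from the post:

```python
import pandas as pd

PIP = 0.0001
BIN = 20 * PIP  # coarsen the price scale from 1-pip to 20-pip steps

def volume_by_price_bin(ticks: pd.DataFrame, bin_size: float = BIN) -> pd.Series:
    """Round each tick price down to its 20-pip bin and sum the volume per bin."""
    bins = (ticks["price"] // bin_size) * bin_size
    return ticks.groupby(bins)["volume"].sum()

ticks = pd.DataFrame({
    "price":  [1.10005, 1.10120, 1.10190, 1.10210, 1.10350],
    "volume": [3, 5, 2, 7, 1],
})
print(volume_by_price_bin(ticks))
# bin 1.1000 -> 10 (three ticks inside 1.1000-1.1020), bin 1.1020 -> 8
```

And yes - this needs tick (or at least volume-at-price) data; a plain bar chart has already thrown away where inside the bar the volume traded, which is why it is hard to reconstruct from a non-tick chart.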
 

sibirqk:

I think SanSanych was asking about the balance between training and overfitting. At the link, the guy talks about interesting ideas on how to handle it based on Bayesian probabilities.

https://postnauka.ru/video/55303

Building Complex Probabilistic Models - postnauka.ru
Mathematician Dmitry Vetrov on Bayes' theorem, the goals of machine learning, and complex probabilistic models
 
sibirqk:

I think SanSanych was asking about the balance between training and overfitting. At the link, the guy talks about interesting ideas on how to handle it based on Bayesian probabilities.

https://postnauka.ru/video/55303

Thanks, I read it.

I think the author is too optimistic.

The problem of overfitting is not solvable in principle. The point is that "overfitting" is a methodological problem of science as such. All of science aims at finding generalizing laws that, on the one hand, describe an individual phenomenon well with some accuracy and, on the other hand, cover a sufficiently wide range of similar phenomena.

Take Newton's law of gravitation.

At the everyday level it works quite accurately for a steel ball, as well as for all other compact bodies made of materials with high specific gravity. But for poplar fluff it does not work at all.

Where is the boundary of this law?

For machine learning models applied to financial markets, I formulated such a boundary here in this thread: use only predictors that "have a relation" to the target variable. Can a Bayesian approach be applied to "having a relation"? I don't know.

But let me note that my formulation is by no means a revelation. In statistics the basic rule is: garbage in, garbage out. The problem is that when defining "has a relation", statistics relies on the notion of "correlation", which always has some value - there is no such thing as "no correlation". That is why I write "has a relation", which must necessarily admit a value of "no relation", and then some qualitative gradation.
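One possible way to give "has a relation" a value that can genuinely be zero - unlike correlation - is mutual information, which is zero exactly for independent variables. A minimal scikit-learn sketch (the data, the threshold, and the choice of mutual information itself are my assumptions, not the author's method):

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 3))                      # three candidate predictors
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)   # only column 0 is related to the target

mi = mutual_info_regression(X, y, random_state=0)
keep = mi > 0.05                                 # crude cutoff for "no relation"
print(mi.round(3), keep)  # clearly positive MI for column 0, near zero for the noise columns
```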

Among the most common methods of dealing with overfitting in machine modeling is the principle of coarsening, which is most clearly explained with the following example.

We take a polynomial and, by increasing its degree, reduce the fitting error - to, say, 5%. Then, by discarding the last term of the polynomial (the one with the highest power), we coarsen the model and increase the error, but this polynomial can now be applied to a much wider range of cases.
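A minimal numpy sketch of this trade-off, comparing a high-degree fit with a coarsened, lower-degree one (the data and degrees are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 30)
y = np.sin(np.pi * x) + 0.2 * rng.normal(size=x.size)  # noisy observations
x_new = np.linspace(-1, 1, 300)                        # "a wider range of cases"
y_new = np.sin(np.pi * x_new)

for degree in (12, 5):  # high-degree fit vs. the coarsened model
    coefs = np.polyfit(x, y, degree)
    fit_err = np.sqrt(np.mean((np.polyval(coefs, x) - y) ** 2))
    new_err = np.sqrt(np.mean((np.polyval(coefs, x_new) - y_new) ** 2))
    print(f"degree {degree:2d}: fit error {fit_err:.3f}, error on new cases {new_err:.3f}")
# Typically the degree-12 polynomial fits the sample better, while the coarser
# degree-5 polynomial holds up better on data it was not fitted to.
```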

To my mind, if the input predictors are not first cleaned of noise ones, i.e. those "irrelevant" to the target variable, then the "coarsening" method does not work, and neither do other methods that rely on the notion of predictor "importance". Particularly bad are algorithms that compute "importance" from how often a predictor is used during model fitting.

I do not know what place the method proposed there occupies in solving the problem I have described.

 
SanSanych Fomenko:

...

Take Newton's law of gravitation.

At the everyday level it works quite accurately for a steel ball, as well as for all other compact bodies made of materials with high specific gravity. But for poplar fluff it does not work at all.

Where is the boundary of this law?

...

Here sits a science writer on the forum.

He utters scientific-sounding phrases. He almost believes himself.

And all because he "writes" on a forum where no one can contradict him.

Wrong forum, though. Not a scientific one. And Newton will not answer...

And so such a "writer" gives birth to phrases like: "At the everyday level it works quite accurately for a steel ball, as well as for all other compact bodies made of materials with high specific gravity. But for poplar fluff it does not work at all."

One word - econometrician...


 
Vladimir Sus:

One word - econometrician...

Absolutely.

Newton's law doesn't work for poplar fluff... it's a real pain.

 

Hello. Newton has stepped out; I'll answer for him.

Andrey Dik:

Newton's law doesn't work for poplar fluff... it's a real pain.

Read it carefully, please:

SanSanych Fomenko:

... At the everyday level... for poplar fluff it does not work at all...

You see, a vacuum is not the everyday level. Well, unless you live somewhere out in space - then, of course, it is everyday for you.
 
Vladimir Sus:

One word - econometrician...

Do you have something against econometrics? Look at the average salaries in this profession - in the U.S. an econometrician can easily make six figures a year.
 
Dr. Trader:

Hello. Newton has stepped out; I'll answer for him.

Read it carefully, please:

You see, a vacuum is not the everyday level. Well, unless you live somewhere out in space - then, of course, it is everyday for you.

The law works, fluff included. But when you look at things "at the everyday level", you get what you get...
 
Andrey Dik:
The law works, fluff included. But when you look at things "at the everyday level", you get what you get...

What if I told you that there is no such thing as "Newton's law" in nature? It is just a formula derived to simplify calculations. And the phrase "Newton's law works / does not work" means only that the formula can be used to calculate some process - or, conversely, cannot be applied because of the complexity of the problem and the chaotic nature of the world.

Suppose there is a steel ball. Knowing its mass, you can determine quite accurately how fast it will fall, when it will reach the ground, and so on. With a piece of fluff, however, so many things act on it that applying Newton's laws will not help you calculate where and when it will land. Even if you shut yourself in a windless room at the bottom of the ocean, any seismic activity will change things and the fluff will not land where you calculated. Such an elaborate experiment is already far beyond the everyday, and it is still not accurate enough.
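For illustration only, a Python sketch of that point - integrating Newton's second law with quadratic air drag (all masses, areas, and drag coefficients are made-up round numbers). For the dense ball the drag term barely matters, so the simple law predicts well; the fluff is drag-dominated, so any unmodeled draft swamps the calculation:

```python
# Fall from h = 10 m with quadratic air drag: m*dv/dt = m*g - 0.5*rho*Cd*A*v^2
RHO, G, DT = 1.2, 9.81, 1e-4   # air density (kg/m^3), gravity (m/s^2), time step (s)

def fall_time(m: float, cd: float, area: float, h: float = 10.0) -> float:
    """Euler-integrate the fall and return the time to reach the ground."""
    v = y = t = 0.0
    while y < h:
        a = G - 0.5 * RHO * cd * area * v * v / m   # net acceleration
        v += a * DT
        y += v * DT
        t += DT
    return t

print(f"steel ball: {fall_time(m=0.5,  cd=0.47, area=2e-3):.2f} s")  # ~1.43 s, as in a vacuum
print(f"fluff:      {fall_time(m=2e-6, cd=1.0,  area=1e-4):.1f} s")  # ~18 s, set by drag alone
```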

The fluff's behavior is an analogy for the behavior of a trading symbol in Forex. You can write an Expert Advisor with thousands of formulas, but they will describe only the phenomena you have already observed in the process. You will never fully grasp the underlying processes, so however accurate the formulas are, they will work only under ideal conditions, describing what has been seen before. Then something unexpected happens, the market goes against all your formulas and takes out all your stops.