What to feed to the input of the neural network? Your ideas... - page 55

 
Ivan Butko #:

We have come to the fundamental paradigms of trading:

1) Patterns for buy and sell are the same, just mirrored

2) Patterns for buy and sell are different


Yes, indeed, as long as there is no evidence, we can rely on some beliefs or facts.

In this case, as I said above, I rely on the fact that in all known trading systems (TSs) the conditions for BUY and for SELL are the same, just mirrored.

This applies both to losing TSs (99.9...% of them) and to successful ones. I emphasise - successful ones. They all have mirrored rules.


In my opinion, the fact that any discrimination against one of the deal types hurts the forward and backward tests also speaks against the second position.

For example, my trick with the range: if it is not mirrored but split (i.e. -1 to 0 and 0 to 1 become completely different areas with different weights), then the optimisation and training picture looks scary and cramped on the test period, and just as scary and cramped on the forward and backward periods.

And if it is mirrored, smooth transitions are more likely to occur. IMHO, subjective.
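To make the point concrete, here is a rough sketch (my own illustration, all names made up, not anyone's actual code) of what a mirrored range means in practice: every training sample gets a sign-flipped twin, so -1..0 and 0..1 are learnt as one symmetric area rather than two unrelated ones.

```python
# Illustrative sketch only: mirror augmentation for buy/sell symmetry.
# Feature and function names are hypothetical.
import numpy as np

def mirror_augment(X, y):
    """X: (n, k) sign-symmetric features (e.g. normalised returns),
    y: (n,) targets in [-1, 1], where >0 means BUY, <0 means SELL."""
    X_aug = np.vstack([X, -X])        # mirrored copy of every sample
    y_aug = np.concatenate([y, -y])   # every BUY pattern becomes the SELL pattern
    return X_aug, y_aug

# Example: 100 samples, 5 features
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = rng.uniform(-1, 1, size=100)
X_m, y_m = mirror_augment(X, y)
print(X_m.shape, y_m.shape)  # (200, 5) (200,)
```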


Also against the second option is the fact, mentioned above, that if you train on 2020, the system loses in 2021. And these are two years opposite in direction, starting right from the New Year.
So an NN without mirroring, or a separate NN for BUY trained on its own, is guaranteed to lose on every optimisation set in 2021. You run them one after another - they have all peaked. They have all learnt BUY, and in 2021 they open BUY wherever they can, don't know how to get out of it, open only a little SELL, and then - who knows where.


But I don't reject this variant and I test everything that comes to hand ))
Because something new shows up every day.

Any theory has a right to exist, of course).

Take an index or a stock: they have been going up for years. There are clearly different buy and sell patterns there, I think. Forex symbols, of course, are mostly flat, but they also show long movements. In short, everything should be checked - as always).

 
Andrey Dik #:

Any theory has a right to exist, of course).

Take an index or a stock: they have been going up for years. There are clearly different buy and sell patterns there, I think. Forex symbols, of course, are mostly flat, but they also show long movements. In short, everything should be checked - as always).

It is important to understand: "different" and "absent" are not the same thing.

A mirrored SELL pattern may simply be absent in an uptrend, and appear on a correction - at the right time, in the right place.

For example, the pattern:

1) the fast MA crosses the slow MA upwards - open BUY
2) the fast MA crosses the slow MA downwards - open SELL

The SELL pattern will not appear on a smooth uptrend without deep corrections. But when the conditions come, it will appear. And the conditions are the same, just mirrored.
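For illustration only (an assumed sketch, not code from the thread), the mirrored crossover rule is a single condition with the sign flipped; on a smooth uptrend the SELL branch simply never fires, which is exactly the "absent, not different" point:

```python
# Illustrative sketch of the mirrored MA-cross pattern described above.
import numpy as np

def sma(prices, period):
    """Simple moving average; NaN until enough bars accumulate."""
    out = np.full(len(prices), np.nan)
    for i in range(period - 1, len(prices)):
        out[i] = prices[i - period + 1:i + 1].mean()
    return out

def cross_signal(prices, fast=10, slow=50):
    """+1 = fast MA crossed above slow MA (open BUY),
       -1 = fast MA crossed below slow MA (open SELL), 0 = no signal."""
    diff = sma(prices, fast) - sma(prices, slow)
    sig = np.zeros(len(prices))
    for i in range(1, len(prices)):
        if np.isnan(diff[i - 1]) or np.isnan(diff[i]):
            continue
        if diff[i - 1] <= 0 < diff[i]:
            sig[i] = 1    # the BUY rule...
        elif diff[i - 1] >= 0 > diff[i]:
            sig[i] = -1   # ...and the same rule mirrored for SELL
    return sig
```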

And an NN trained separately for BUY will look for BUY even in corrections.
And on a deep downtrend it will lose money trying to open BUY, because it has "learnt its fill" and has never seen a downtrend.

Another variant is training on all the trends of the whole period. It would seem logical, but the quirk is that any NN - approximation, as it is properly called? - learns best the trades that make the most money, and fits its weights to them.

At least the simple architectures do, including RNN, LSTM, CNN and the like. If they have few neurons, layers and filters, they do exactly the same thing as an MLP, sometimes even worse. If there are a lot of neurons (Python, third-party software), they turn into a lagging moving average that predicts the value of the previous bar.
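That claim is easy to check, by the way: compare the model's error with the naive "previous bar" forecast. A tiny sketch (the function names are mine; `model_pred` stands for whatever the network outputs):

```python
# If the model barely beats the naive lag-1 forecast, it has effectively
# learnt a delayed copy of the series. Illustrative helper, not library code.
import numpy as np

def mae(a, b):
    return np.mean(np.abs(a - b))

def beats_persistence(y_true, model_pred):
    naive = y_true[:-1]   # "next bar = current bar" forecast
    return mae(y_true[1:], model_pred[1:]) < mae(y_true[1:], naive)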

 
Ivan Butko #:

It is important to understand: "different" and "absent" are not the same thing.

That's what I'm talking about.

 

Feed the network an SMA shifted by half a period - but as a teacher. Don't train the neural net on trades or zigzags pulled out of thin air.

If it can predict the future average with sufficient probability and deviations, it will be the grail.
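If I read the suggestion correctly (my interpretation, a made-up sketch, not the poster's code), the teacher would be the SMA re-aligned to the middle of its window, i.e. a centred average whose latest values are only known half a period later:

```python
# Illustrative sketch: centred SMA as the training target (teacher).
import numpy as np

def centred_sma_target(close, period=20):
    """Target[i] = mean of close over a window centred on bar i,
    i.e. the SMA shifted back by half its period."""
    half = period // 2
    target = np.full(len(close), np.nan)
    for i in range(half, len(close) - half):
        target[i] = close[i - half:i + half].mean()
    return target  # the last `half` values are NaN: that is the forecasting task
```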

 

EMA (exponential moving average) has an interesting property.

When the bar/candle close is above/below the EMA, the EMA turns upwards/downwards accordingly.
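This follows directly from the EMA recursion (with C_t the close of the current bar, the comparison being against the previous EMA value):

```latex
EMA_t = EMA_{t-1} + \alpha\,(C_t - EMA_{t-1}), \qquad 0 < \alpha \le 1
\quad\Rightarrow\quad
EMA_t - EMA_{t-1} = \alpha\,(C_t - EMA_{t-1})
```

So the EMA rises exactly when the close is above the previous EMA value, and falls when it is below.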

 
What is "training"?

 

The standard Bollinger and the one built by a neural network.

The channel width was fed to the neural network input.
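As an illustration of that input (a guess at the setup, not Sergey Pavlov's code), the relative channel width could be computed like this:

```python
# Illustrative sketch: Bollinger channel width as a network input feature.
import numpy as np

def bollinger_width(close, period=20, dev=2.0):
    """Relative width (upper - lower) / middle for each bar; NaN until warmed up."""
    width = np.full(len(close), np.nan)
    for i in range(period - 1, len(close)):
        window = close[i - period + 1:i + 1]
        mid, sd = window.mean(), window.std()
        width[i] = 2.0 * dev * sd / mid
    return width
```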
 
Sergey Pavlov #:

The standard Bollinger and the one built by a neural network.

The channel width was fed to the neural network input.

Amusing. Is the point of replacing the standard BB to get the values earlier?

What does "training" mean to you?

 
Andrey Dik #:
What is "training"?

Wonderful question

If it is fully unpacked in the context of a virtual environment (an information field), then presumably one can move in the right direction instead of just rebroadcasting academic knowledge and textbooks.

Every time I picked at architectures, I asked myself: "Why like this? Why? Why did they decide to do it this way?" No - just take and port the architecture that some clever mathematicians once wrote down.


I even asked the chat why the LSTM block has the form it has. In reply - boilerplate from ML textbooks: it is long short-term memory, blah blah blah, adapted to learning on classification tasks, and blah blah blah.

I ask "so why exactly so?", the answer is like "mathematicians decided so". No theory, no information theory, no information processing theory, no definitions of learning, learning theories, etc. Stupid postulates.


Only on the third attempt did the chat start talking about vanishing and exploding gradients. LSTM solves these problems. Well, OK, how does it solve them? - With gates!
What gates!?!? What gates? -

a key element that stores information through the entire sequence.

What information???? The numbers at the input? But you convert those incoming numbers into some distorted, incomprehensible mush that turns the incoming numbers - RGB colours, say - into something unreadable, a black box.

Fine, let's say it converts some numbers into others, but where is the learning in that? Memorisation? Then call it memorisation! And how does that differ from learning?


In the end, something unclear is being applied to something doubly unclear - a non-stationary market.



In general, the question is a great one, and it was asked long ago. Unpacking it would be extremely interesting.
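For what it's worth, the gates the chat was going on about fit in a few lines (standard textbook form); whether this deserves the word "learning" is exactly the question:

```latex
\begin{aligned}
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) && \text{forget gate}\\
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) && \text{input gate}\\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) && \text{output gate}\\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) && \text{candidate cell state}\\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t && \text{cell state update}\\
h_t &= o_t \odot \tanh(c_t) && \text{hidden output}
\end{aligned}
```

The usual answer to the vanishing-gradient question is the cell-state line: c_t is updated almost additively, scaled elementwise by the forget gate, instead of being squashed through the same weight matrix at every step. Whether that helps on a non-stationary market is another matter.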

 