Machine learning in trading: theory, models, practice and algo-trading - page 2882
Happy New Year!!!
"it's important to carefully consider the trade-offs between accuracy and efficiency" - that part is from the AI.
In general this strikes me as an axiomatic rule of life. There is no measurement precision without error, and the error depends on the time of measurement and on averaging... so to speak...
Well, they've wiped it and banned it. The main thing they erased was how to make a Dutch SIM for a couple of bucks))))
and thank God.)
Are there any money-management scripts for MT5?
Something like:
1) when you enter a trade, a stop-loss is set automatically;
2) if the account takes a loss, it does not allow you to open any more trades, etc.
It's funny, just yesterday I was thinking: "What if MO (machine learning) is used not for the direction of market entry, but for money management, in particular to manage moving to breakeven and setting stops and take-profits."
There should be a lot of such things in the code base - look for them.
Alexey, I think you already know this, but maybe not, so I will show you how algorithms that take variable-length lists/vectors as input actually work.
We have a list of variable-length vectors, one vector == one observation.
What does the model, the algorithm "under the bonnet" do when it takes this data? It turns it into a matrix.
But since for real data the matrix is huge, the algorithm turns the data into a memory-efficient sparse matrix.
So it's still a matrix under the bonnet, just a sparse one. (Careful with the data.)
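For illustration only (Python assumed, this is not from the thread), a minimal sketch of what "under the bonnet" means: the variable-length observations get spread into one big sparse matrix with a column per possible item.

```python
# Minimal sketch: variable-length observations become one sparse matrix.
from scipy.sparse import csr_matrix

# three observations of different length, items encoded as integer ids
observations = [[0, 3], [1], [0, 2, 4]]
n_items = 5  # total number of distinct items ("goods in the shop")

rows, cols, vals = [], [], []
for i, obs in enumerate(observations):
    for item in obs:
        rows.append(i)
        cols.append(item)
        vals.append(1)

X = csr_matrix((vals, (rows, cols)), shape=(len(observations), n_items))
print(X.toarray())
# [[1 0 0 1 0]
#  [0 1 0 0 0]
#  [1 0 1 0 1]]
```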
All baskets of goods are naturally reduced to vectors of known fixed size, equal to the number of items in the shop.
Our case looks quite different. For simplicity, let prices be a sequence of renko bars, each labelled 1 or -1. For each bar at location number N, the feature vector is all previous bars - a vector of 1's and -1's of length N-1. There are no a priori constraints on the length of the feature vector. Using a given (by us) fixed number of bars for the features is a forced measure. We want to move away from this restriction and build algorithms that can handle vectors of arbitrary length.
I see recursive functions as the source mathematical material for such algorithms. They take a vector of any size as input, but they are defined through functions with a fixed number of arguments. The simplest example is the exponential mean.
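A minimal sketch (Python assumed, not the poster's code) of why the exponential mean fits: the update rule has a fixed number of arguments, yet folding it over a sequence handles a vector of any length.

```python
# The exponential mean as a recursion: fixed-arity update, arbitrary-length input.
def ema_update(prev, x, alpha=0.1):
    """One step of the recursion: fixed arity (prev, x)."""
    return alpha * x + (1.0 - alpha) * prev

def ema(xs, alpha=0.1):
    """Fold the fixed-arity update over a sequence of any length."""
    state = xs[0]
    for x in xs[1:]:
        state = ema_update(state, x, alpha)
    return state

print(ema([1, -1, 1, 1, -1]))   # works on a vector of length 5
print(ema([1, -1] * 500))       # and on a vector of length 1000
```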
What exactly do you want to search for and in what way?
For example, we have a pattern, three peaks, or whatever (rule, event, pattern, cluster).
Anything can happen between them, we take it as noise and don't consider it.
We take a noisy vector/matrix as input and check whether there is a pattern or not....
Are you considering this concept or something else?
========================================================
I see it as a sequence of events that must happen, and they are described by logical rules...
event == logical rule.
For example: if event1 happened and there was no event2, then we wait for event3, etc....
So there are two kinds of rules/events: "go" events, when the search continues, and "stop" events, when everything is cancelled.
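A rough sketch of the go/stop idea (Python assumed, all names hypothetical): walk through the events, advance on the next expected "go" rule, abort on any "stop" rule, and treat everything else as noise.

```python
# Advance on "go" rules in order, cancel on any "stop" rule, ignore noise.
def match_sequence(events, go_rules, stop_rules):
    """Return True if all go_rules fire in order before any stop_rule."""
    i = 0  # index of the next expected "go" rule
    for e in events:
        if any(stop(e) for stop in stop_rules):
            return False                # a stop event cancels everything
        if i < len(go_rules) and go_rules[i](e):
            i += 1                      # the expected event happened
            if i == len(go_rules):
                return True             # the whole pattern completed
    return False

# toy usage: events are numbers, rules are simple predicates
events = [0.1, 1.2, -0.3, 2.5, 0.0, 3.1]
go_rules = [lambda e: e > 1, lambda e: e > 2, lambda e: e > 3]
stop_rules = [lambda e: e < -1]
print(match_sequence(events, go_rules, stop_rules))  # True
```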
The architecture is as follows:
1) rules are generated by a grammar;
2) a genetic algorithm searches for and improves rules using fitness functions.
=========================
Here is an example of a simple grammar for multidimensional data, in this case OHLC,
and the rules that the grammar generates.
Such a block of rules works like a single rule with a bunch of conditions.
"X" is the matrix of attributes; the loop index "i" walks over it and picks whatever it likes - a very flexible system.
In principle everything is already implemented; if there is interest I can send it to you.
PS: there is no restriction on the size of the attribute matrix, each instance can be of any size; the main thing is that all the rules fire in order - there is no binding to time.
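Since the actual grammar and generated rules were shown as attachments that are not reproduced here, the following is only a hypothetical illustration (Python, names invented) of rules drawn from a tiny grammar over an OHLC attribute matrix X, with an index i walking over it and a fitness value a genetic algorithm could optimise.

```python
# Hypothetical illustration, not the original poster's code.
import random
import numpy as np

COLS = {"open": 0, "high": 1, "low": 2, "close": 3}

def random_rule(max_lag=5):
    """Grammar: <rule> ::= X[i, <col>] <op> X[i - <lag>, <col>]"""
    a = random.choice(list(COLS.values()))
    b = random.choice(list(COLS.values()))
    op = random.choice([">", "<"])
    lag = random.randint(1, max_lag)
    return f"X[i, {a}] {op} X[i - {lag}, {b}]"

def fitness(rule, X, max_lag=5):
    """Fraction of rows where the rule holds -- a stand-in fitness value."""
    hits = 0
    rows = range(max_lag, len(X))
    for i in rows:
        hits += eval(rule)          # the rule text references X and i directly
    return hits / len(rows)

X = np.random.rand(50, 4)                    # toy OHLC attribute matrix, 50 bars
block = [random_rule() for _ in range(3)]    # "a block of rules"
for r in block:
    print(r, "->", round(fitness(r, X), 2))
```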
My concept is as broad as possible, as there are no special restrictions imposed, and many things fit it. Your example probably fits into it too. Especially important is the fact that there is no rigidly defined pattern length.
In any case, for me the point is that on SB (a random walk) the probability of 1 or -1 is always 0.5, and you should look for places where the probability (frequency) deviates strongly from this value. In your pattern this could be, for example, the slopes of the third peak.
I suppose the rule "we are on the left slope of the third peak" can be expressed through recursive functions. But I don't really believe that these functions can be easily written out explicitly, so you need MO algorithms to construct them.
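A toy sketch of that idea (Python assumed, not from the thread): keep a recursively updated state over the renko signs and compare the conditional frequency of the next +1 against the unconditional 0.5; on pure SB every bucket stays near 0.5.

```python
# Bucket a recursive state over past signs, then check the conditional
# frequency of the next +1 against the 0.5 baseline.
import random
from collections import defaultdict

random.seed(0)
signs = [random.choice([1, -1]) for _ in range(100_000)]   # pure SB baseline

def update_state(state, x, alpha=0.3):
    """Fixed-arity recursive state: an exponential mean of past signs."""
    return alpha * x + (1 - alpha) * state

counts = defaultdict(lambda: [0, 0])   # state bucket -> [ups, total]
state = 0.0
for t in range(len(signs) - 1):
    state = update_state(state, signs[t])
    bucket = round(state, 1)           # coarse bucket of the state
    counts[bucket][1] += 1
    if signs[t + 1] == 1:
        counts[bucket][0] += 1

for bucket in sorted(counts):
    ups, total = counts[bucket]
    if total > 1000:
        print(bucket, round(ups / total, 3))   # on SB everything stays near 0.5
```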
Well, I have proposed an algorithm that suits your requirements:
1) no binding to time, since we write what we need ourselves;
2) any logic for searching for regularities, since we write what we need ourselves;
3) any choice of how the regularity is described, either by logical rules or by functions, because we write what we need ourselves.
So, in the concept I proposed,
these patterns will be equivalent, and the patterns themselves can be of any complexity,
and no AMO (machine-learning algorithm) can do that.
And there are "stop" rules, and no AMO can do that either.
I mean a general-purpose AMO with tabular data as input.
What exactly do you want to look for and in what way?
For example, we have a pattern, three peaks, or whatever (rule, event, pattern, cluster).
It's three zigzag peaks with anything in between.