Machine learning in trading: theory, models, practice and algo-trading - page 585
It's hard to evaluate the trading ones this way, because trade duration and stop-loss levels have to be added on top of everything else, and it also needs to be retrained periodically... what a mess :)
I saw it a long time ago. It's fine in itself, but the cloud-based part is not great for my TS.
You could sell signals :)) with access through the API, if the model is good.
Sitting here reading a PDF of the ML monograph. Quote:
It turns out there's no need to fret either - an NN seems to be the best option.
And I read Haykin and watched it at the same time.
The movie is atmospheric... what will win in the end - protein-based life or artificial life, or will something in between emerge? :)
By the way, some sources say probabilistic NNs are in fashion these days. A friend whispered it to me... and he knows them very well, he takes part in Kaggle competitions.
Yesterday I found convolutional NNs - usually used for image recognition. Naturally, all the utilities are there - training, etc. Made for use in Python.
There are also recurrent ones, etc., but that's not very interesting yet.
Since a convolutional network is not fully connected, the number of neurons can be increased a lot without loss of performance. But I still need to understand the details - I haven't dug into it yet.
Popular description - https://geektimes.ru/post/74326/
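Just to make it concrete for myself - a minimal sketch of that kind of 1D convolutional classifier, assuming TensorFlow/Keras (my own toy sizes and labels, nothing from the linked article):
```python
# Minimal sketch of a 1D convolutional classifier (assumes tensorflow/keras).
# Input: windows of 64 bars with 1 feature each; output: 3 classes (e.g. buy/sell/hold).
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 1)),
    # each filter only sees 5 consecutive inputs, shifted along the series
    tf.keras.layers.Conv1D(filters=16, kernel_size=5, activation="relu"),
    tf.keras.layers.MaxPooling1D(pool_size=2),
    tf.keras.layers.Conv1D(filters=32, kernel_size=5, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(3, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# dummy data just to show the training call
X = np.random.randn(200, 64, 1).astype("float32")
y = np.random.randint(0, 3, size=200)
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```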
Well, that's deep territory; they are mainly used for images and computer vision. You need a lot of examples and layers to make it work. The architecture itself copies the visual system.
Try a PNN in Python, it makes more sense for time series prediction.
https://habrahabr.ru/post/276355/
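If I understand it right, a PNN is basically a Parzen-window classifier, so a toy numpy sketch might look like this (my own rough version, not the code from the habrahabr article):
```python
# Toy probabilistic neural network (Parzen-window classifier) in plain numpy.
import numpy as np

class PNN:
    def __init__(self, sigma=0.5):
        self.sigma = sigma  # smoothing width of the Gaussian kernel

    def fit(self, X, y):
        self.X, self.y = np.asarray(X, float), np.asarray(y)
        self.classes = np.unique(self.y)
        return self

    def predict(self, X):
        out = []
        for x in np.asarray(X, float):
            # pattern layer: Gaussian kernel activation for every training sample
            d2 = np.sum((self.X - x) ** 2, axis=1)
            k = np.exp(-d2 / (2.0 * self.sigma ** 2))
            # summation layer: average activation per class, then pick the winner
            scores = [k[self.y == c].mean() for c in self.classes]
            out.append(self.classes[int(np.argmax(scores))])
        return np.array(out)

# usage: classify the sign of a toy target from the last 3 returns
rng = np.random.default_rng(0)
X_train = rng.normal(size=(100, 3))
y_train = (X_train.sum(axis=1) > 0).astype(int)
print(PNN(sigma=0.7).fit(X_train, y_train).predict(X_train[:5]))
```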
Once again, I'm not predicting anything. I only have a classification.
I've been looking for a partially connected network for a long time. MLP is all well and good, but every neuron gets all the inputs at once. And this is exactly what's needed - only 5-6 inputs with a shift go into each neuron, and that is the convolutional NN.
There's nothing complicated here, and only 100-150 neurons are needed, so the structure is simple, and the speed will be like an MLP with 60 neurons, due to the smaller number of inputs per neuron.
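A rough back-of-the-envelope weight count (my own assumed sizes: 100 inputs, 5-input windows per neuron) shows why the partially connected layer stays fast even with more neurons:
```python
# Rough weight-count comparison: fully connected layer vs a "partially connected"
# (convolution-style) layer where each neuron sees only a 5-input window.
n_inputs = 100

# MLP layer: 60 neurons, each wired to every input
mlp_neurons = 60
mlp_weights = mlp_neurons * (n_inputs + 1)        # +1 for bias -> 6060

# partially connected layer: 150 neurons, each sees a shifted window of 5 inputs
conv_neurons = 150
window = 5
conv_weights = conv_neurons * (window + 1)        # -> 900

print(f"fully connected    : {mlp_weights} weights")
print(f"partially connected: {conv_weights} weights")
```
With true convolutional weight sharing the count would be smaller still; the numbers above assume every neuron keeps its own window weights.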
Well, there's a classifier, and what prevents you from looking for a partially connected one. I just like this scheme, for example:
At this rate I'll end up making screenshots of the whole book :)
The idea of using convolutional layers has been simmering for a long time. I think they can give good results.
But don't throw away the multilayer perceptron. Convolutional networks don't learn anything by themselves, they just give a compact representation of the input information.
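Along those lines, a hedged sketch (again assuming Keras, with made-up layer sizes) of keeping the MLP and using the convolutional part only as a compact feature extractor:
```python
# Sketch: convolutional layers as a compact feature extractor in front of an MLP head
# (assumes tensorflow/keras; all sizes are made up for illustration).
import tensorflow as tf

inputs = tf.keras.Input(shape=(64, 1))
x = tf.keras.layers.Conv1D(16, 5, activation="relu")(inputs)   # local features
x = tf.keras.layers.MaxPooling1D(2)(x)
features = tf.keras.layers.Flatten()(x)                        # compact representation of the input

# ordinary multilayer perceptron on top of the extracted features
h = tf.keras.layers.Dense(32, activation="relu")(features)
outputs = tf.keras.layers.Dense(3, activation="softmax")(h)

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# a separate model exposing only the convolutional features,
# if you want to inspect or reuse them on their own
extractor = tf.keras.Model(inputs, features)
model.summary()
```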