Machine learning in trading: theory, models, practice and algo-trading - page 2454

 
mytarmailS #:

It's only for MetaTrader 5; it's a new package, hence the very name, mt5R.

Yes, I understand, I was just looking for multi-objective optimization.

My simple fitness function just looks for the index of the point in the vector that the algorithm considers the minimum.

So ideally the algorithm should output two indexes, and those two indexes would point to the minimum values in the vector.

I thought there was no difference between looking for two minima in one vector and one minimum each in two vectors.
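A minimal sketch of such a two-index fitness (my own illustration of the idea, not the poster's code; fit2 is an assumed name):

set.seed(123)
x <- cumsum(rnorm(100))

# a candidate solution is a pair of indexes into the one vector x;
# each index is scored by the value of x at that point, both to be minimized
fit2 <- function(i) c(x[floor(i[1])], x[floor(i[2])])

fit2(c(10.7, 42.2))   # the two objective values for this pair of indexes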

My simple fitness is not a model of my real problem; I just wanted the simplest and clearest possible comparison of the algorithms for myself.

And what does your fitness function do? The code itself seems clear, I follow all of it, but I can't grasp the essence)

This is your example and your fitness function. There are plenty of other methods for finding extrema in vectors. Please formulate your task clearly. Then the solution will come.

I just showed that your problem cannot be solved with these packages.

Good luck

 
Vladimir Perervenko #:

This is your example and your fitness function. There are plenty of other methods for finding extrema in vectors. Please formulate your task clearly. Then the solution will come.

I just showed that your problem cannot be solved with these packages.

Good luck

The task is to compare two kinds of multi-criteria optimization: quickly, simply, and clearly...

Of course, you can find the extremum by other means, for example by calling min() on the vector, but that's not what this is about.

I thought that finding the extremum of a function (finding the minimum in a vector) was the best way to do that; to be honest, I still think we're talking past each other somewhere...

============

Look: the minimum can be found without any problems with an ordinary genetic algorithm.

library(GA)

set.seed(123)
x <- cumsum(rnorm(100))

# negate the value because ga() maximizes by default
fit <- function(i) -x[floor(i)]

GA <- ga(type = "real-valued",
         fitness = fit,
         lower = 1, upper = length(x),
         popSize = 50, maxiter = 100)

id <- floor(GA@solution[1, 1])   # index of the found minimum

plot(x, t = "l")
points(id, x[id], col = 2, lwd = 5)


==============================

What prevents us from doing the same with multi-criteria optimization? We just look for several points instead of one.

Besides "mco" (genetics) was good at it, while "GPareto" (Gaussian optimetric) was not "hello" at all, though it was supposed to be the most intellectual one...

 

It's just amazing how the genetic algorithm finds a solution even with the most crippled settings:

a population of 10 individuals,

10 iterations,

and 1 million data points.

The algorithm gets only about 100 evaluations (10*10) of the data, and it still finds a good solution; see the sketch below.

It's amazing.
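A sketch reproducing those settings (my own reconstruction of the described experiment, not the poster's code):

library(GA)

set.seed(123)
x <- cumsum(rnorm(1e6))          # 1 million points

fit <- function(i) -x[floor(i)]  # negate because ga() maximizes

GA <- ga(type = "real-valued", fitness = fit,
         lower = 1, upper = length(x),
         popSize = 10, maxiter = 10)   # roughly 100 fitness calls in total

id <- floor(GA@solution[1, 1])
c(found = x[id], true = min(x))  # compare the found value with the true minimum

How close it gets varies from run to run, but the evaluation budget really is only on the order of popSize times maxiter.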

 
mytarmailS #:

The task is to compare two kinds of multi-criteria optimization: quickly, simply, and clearly...

Take Adam or SGD from any machine-learning package.
 
Maxim Dmitrievsky #:
Take Adam or SGD from any machine-learning package.

I wanted to compare those two specifically; they are meant for multi-criteria optimization.

Adam and SGD, even "from any machine-learning package", are definitely not multi-criteria.

I don't have a problem with choosing algorithms))) Quite the contrary: this isn't Python, you know)))

 
mytarmailS #:

The task is to compare two kinds of multi-criteria optimization: quickly, simply, and clearly...

Of course, you can find the extremum by other means, for example by calling min() on the vector, but that's not what this is about.

I thought that finding the extremum of a function (finding the minimum in a vector) was the best way to do that; to be honest, I still think we're talking past each other somewhere...

============

Look: the minimum can be found without any problems with an ordinary genetic algorithm.


==============================

What prevents us from doing the same with multi-criteria optimization? We just look for several points instead of one.

Besides "mco" (genetics) didn't handle it badly, but "GPareto" (Gaussian optimum) didn't do it at all, though it is supposed to be the most intellectual one...

You probably misunderstand the term MULTI-CRITERIA OPTIMIZATION. It means optimizing by several criteria at the same time. For example: we have a balance curve produced by a neural network. We can optimize it for maximum balance, or for minimum drawdown. But if we want to optimize for balance and drawdown simultaneously, that is multi-criteria optimization. You have a single criterion, the minimum of a function. Find all the minima of that function and select the ones you need.

Good luck
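To make the balance/drawdown example concrete, here is a toy sketch with mco's nsga2() (my own illustration, not Vladimir's code; the simulated returns and the single leverage parameter w are assumptions):

library(mco)

set.seed(123)
r <- rnorm(500, 0.001, 0.01)     # hypothetical trade returns

fit <- function(w) {
  eq <- cumsum(w * r)            # equity (balance) curve for leverage w
  c(-tail(eq, 1),                # criterion 1: maximize final balance
    max(cummax(eq) - eq))        # criterion 2: minimize maximum drawdown
}

res <- nsga2(fit, idim = 1, odim = 2,
             lower.bounds = 0, upper.bounds = 10,
             popsize = 52, generations = 50)

# each Pareto-front row is one balance/drawdown trade-off
head(cbind(balance = -res$value[, 1], drawdown = res$value[, 2]))

Instead of a single best answer, NSGA-II returns a set of non-dominated solutions; choosing among those balance/drawdown trade-offs is up to the user.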

 
Vladimir Perervenko #:

You probably misunderstand the term MULTI-CRITERIA OPTIMIZATION. It means optimizing by several criteria at the same time. For example: we have a balance curve produced by a neural network. We can optimize it for maximum balance, or for minimum drawdown. But if we want to optimize for balance and drawdown simultaneously, that is multi-criteria optimization. You have a single criterion, the minimum of a function. Find all the minima of that function and select the ones you need.

Good luck

I see your point; we were misunderstanding each other, but thank you for the clarification...

 
mytarmailS #:

I see your point; we were misunderstanding each other, but thank you for the clarification...

I agree. You're welcome.

 
Andrey Dik #:

Is the average absolute value of a neural network's weights an indicator of how well it has been trained?

Suppose there are two identical networks trained on the same data: one ends up with a value of 0.87 and the other 0.23. Which one is trained better?

The closer the average response is to 1, the better; that's from practice. I can't explain why, but for me it's one of the main signs of a good model.
 
I've noticed that such a model is somehow more universal: with the same results on the test sample, the model with the higher average response works better in real trading.
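For what it's worth, both statistics mentioned here are easy to compute. A minimal sketch with the nnet package (my choice; the posters don't name one): two identically configured networks on the same data, differing only in their random starting weights.

library(nnet)

set.seed(123)
X <- matrix(rnorm(200 * 5), ncol = 5)
y <- as.numeric(X[, 1] + rnorm(200) > 0)

# same configuration, different random initial weights
set.seed(1); nn1 <- nnet(X, y, size = 3, trace = FALSE)
set.seed(2); nn2 <- nnet(X, y, size = 3, trace = FALSE)

# the statistic from Andrey's question: average absolute weight
mean(abs(nn1$wts))
mean(abs(nn2$wts))

# the statistic from the reply: average response of the model
mean(predict(nn1, X))
mean(predict(nn2, X))

The actual numbers vary from run to run; the sketch only shows where each quantity comes from, not which one indicates a better model.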