Machine learning in trading: theory, models, practice and algo-trading - page 3394

 

Requiem for RL and an ode to the causal transformer

*any RL algorithm can be thought of as a global optimiser

https://ai.plainenglish.io/reinforcement-learning-is-dead-long-live-the-transformer-228835689841
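
A minimal sketch of that remark, on a made-up toy task (the environment, features and numbers below are purely illustrative): the policy parameters are found by a generic black-box global optimiser, here the cross-entropy method, which simply maximises episode return with no value functions or policy gradients involved.

```python
# Toy illustration only: a tanh policy for a made-up 1-D "chase the drifting
# target" task, optimised by the cross-entropy method (CEM), i.e. by a plain
# black-box global optimiser instead of TD-learning or policy gradients.

import numpy as np

rng = np.random.default_rng(1)

def episode_return(theta, steps=50):
    """Run one episode with policy a = tanh(theta . s); reward is the
    negative distance to a slowly drifting target."""
    pos, target, total = 0.0, 1.0, 0.0
    for _ in range(steps):
        s = np.array([pos, target, target - pos])   # simple state features
        a = np.tanh(theta @ s)                       # action in [-1, 1]
        pos += 0.1 * a
        target += 0.01                               # target drifts away
        total += -abs(target - pos)                  # stay close = reward
    return total

# Cross-entropy method: sample policies, keep the elite, refit the sampler.
mu, sigma = np.zeros(3), np.ones(3)
for _ in range(50):
    samples = rng.normal(mu, sigma, size=(64, 3))
    returns = np.array([episode_return(th) for th in samples])
    elite = samples[np.argsort(returns)[-10:]]       # top-10 policies
    mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-3

print("best return found:", episode_return(mu))
```

Any other global optimiser (GA, random search, annealing) could stand in for the CEM loop without changing the rest.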

 
Maxim Dmitrievsky #:

Requiem for RL and an ode to the causal transformer

*any RL algorithm can be thought of as a global optimiser

https://ai.plainenglish.io/reinforcement-learning-is-dead-long-live-the-transformer-228835689841

Unfortunately... that one didn't work out either

https://www.mql5.com/ru/articles/13712

Neural Networks Made Easy (Part 63): Unsupervised Pretraining of the Decision Transformer (PDT)
  • www.mql5.com
We continue our review of the Decision Transformer family of methods. From previous work we have already seen that training the transformer underlying these methods is quite a difficult task and requires a large amount of labelled training data. In this article we look at an algorithm that uses unlabelled trajectories to pre-train models.
 

LLMs are probably linguists' favourite toys right now :)


 
Maxim Dmitrievsky #:
Are you saying you're cooler than me?

since a mate said it's the grail,

please give me an objective assessment.

The info is attached,

along with the link:

GitHub - alfiyandyhr/nn_ga_benchmark: NN+GA: A Surrogate-Based Optimization Framework Using Neural Network and Genetic Algorithm
  • alfiyandyhr
  • github.com
 
Renat Akhtyamov #:

since a mate said it's the grail,

please give me an objective assessment.

The info is attached,

along with the link:

GitHub - alfiyandyhr/nn_ga_benchmark: NN+GA: A Surrogate-Based Optimization Framework Using Neural Network and Genetic Algorithm

It's impossible to say anything when you don't know what is being optimised and why. The method itself is fine, but it can be slow, like stochastic gradient descent. That is, it can take a long time to converge.

 
Maxim Dmitrievsky #:

It's impossible to say anything when you don't know what is being optimised and why. The method itself is fine, but it can be slow, like stochastic gradient descent. That is, it can take a long time to converge.

Genetic algorithms + neural networks = the best of both worlds (skine.ru)
Genetic algorithms + neural networks = the best of both worlds
  • skine.ru
Learn how you can speed up neural network training with genetic algorithms!
 
Maxim Dmitrievsky #:

Yeah, it's rubbish

most likely

the mate had a signal that went to livantos, with predictable results.

 
Renat Akhtyamov #:

most likely

the mate had a signal that went to livantos, with predictable results.

First it says it will speed up training. Then it says it needs a lot of computational resources. Info right in your style.


Usually GA is applied to hyperparameter optimisation over a grid; NN+GA is different: there the network weights themselves are selected via GA rather than by a standard solver like Adam.

The article in the link is confusing.
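
A minimal sketch of what "picking the weights via GA instead of Adam" can look like (a hypothetical toy, not code from the linked repo): a tiny feed-forward net whose flat weight vector is evolved by a simple genetic algorithm.

```python
# Hypothetical sketch (not from the linked repo): a 1-16-1 tanh network whose
# flat weight vector is evolved by a simple GA (tournament selection,
# uniform crossover, Gaussian mutation) instead of being trained with Adam.

import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: fit y = sin(x).
X = np.linspace(-3, 3, 64).reshape(-1, 1)
y = np.sin(X)

N_HIDDEN = 16
N_WEIGHTS = N_HIDDEN + N_HIDDEN + N_HIDDEN + 1   # W1, b1, W2, b2

def forward(w, x):
    """Unpack the flat weight vector and run the 1-16-1 tanh network."""
    i = 0
    W1 = w[i:i + N_HIDDEN].reshape(1, N_HIDDEN); i += N_HIDDEN
    b1 = w[i:i + N_HIDDEN];                      i += N_HIDDEN
    W2 = w[i:i + N_HIDDEN].reshape(N_HIDDEN, 1); i += N_HIDDEN
    b2 = w[i:i + 1]
    return np.tanh(x @ W1 + b1) @ W2 + b2

def fitness(w):
    """Negative MSE, so that higher is better for the GA."""
    return -np.mean((forward(w, X) - y) ** 2)

POP, GENS, MUT = 200, 300, 0.05
pop = rng.normal(0.0, 1.0, size=(POP, N_WEIGHTS))

for _ in range(GENS):
    scores = np.array([fitness(ind) for ind in pop])
    # Tournament selection: keep the better of two random individuals.
    a, b = rng.integers(0, POP, size=(2, POP))
    parents = np.where((scores[a] > scores[b])[:, None], pop[a], pop[b])
    # Uniform crossover between consecutive parents.
    mask = rng.random((POP, N_WEIGHTS)) < 0.5
    children = np.where(mask, parents, np.roll(parents, 1, axis=0))
    # Gaussian mutation plus elitism (slot 0 keeps the current best).
    children += MUT * rng.normal(size=children.shape)
    children[0] = pop[np.argmax(scores)]
    pop = children

scores = np.array([fitness(ind) for ind in pop])
print("final MSE:", -scores.max())
```

With enough generations this fits the toy target reasonably well, but it evaluates the whole dataset for every individual in every generation, which is exactly where the computational-cost complaint above comes from.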

 
Maxim Dmitrievsky #:
First it says it will speed up training. Then it says it needs a lot of computational resources. Info right in your style.


Usually GA is applied to hyperparameter optimisation over a grid; NN+GA is different: there the network weights themselves are selected via GA rather than by a standard solver like Adam.

The article in the link is confusing.

not at all

my grail runs for a few seconds a day, and for the second year now it has been earning more and more.

It's fully automatic.

The computer switches on and off by itself.
