Discussing the article: "Population optimization algorithms: Simulated Annealing (SA) algorithm. Part I"

 

Check out the new article: Population optimization algorithms: Simulated Annealing (SA) algorithm. Part I.

The Simulated Annealing algorithm is a metaheuristic inspired by the metal annealing process. In the article, we will conduct a thorough analysis of the algorithm and debunk a number of common beliefs and myths surrounding this widely known optimization method. The second part of the article will consider the custom Simulated Isotropic Annealing (SIA) algorithm.

The Simulated Annealing algorithm was developed by Scott Kirkpatrick, C. Daniel Gelatt and Mario Vecchi in 1983. While studying the properties of liquids and solids at high temperatures, they found that a metal passes into a liquid state in which its particles are distributed randomly, and that the state of minimum energy is reached only if the initial temperature is sufficiently high and the cooling is sufficiently slow. If this condition is not met, the material ends up in a metastable state with non-minimum energy - this is what happens during hardening, i.e. sharp cooling of the material. In this case, the atomic structure has no symmetry (an anisotropic state, with uneven material properties across the crystal lattice).

During the slow annealing process the material also solidifies, but with ordered, symmetric atoms, so it was proposed to use this process as the basis for an optimization algorithm capable of finding a global optimum in complex problems. The algorithm was also proposed as a method for solving combinatorial optimization problems.

Thus, the main idea of the algorithm is based on a mathematical analogue of the metal annealing process. During the annealing process, in order to evenly distribute its internal energy, the metal is heated to a high temperature and then slowly cooled, allowing the metal molecules to move and order into more stable states, while internal stresses in the metal are relieved and intercrystalline defects are removed. The term "annealing" is also associated with thermodynamic free energy, which is an attribute of the material and depends on its state.

The simulated annealing optimization algorithm uses a similar process. It applies operations analogous to heating and cooling the material. The algorithm begins with an initial solution, which can be random or carried over from previous iterations. It then applies operations that change the state of the solution, either randomly or in a controlled way, to obtain a new state, even if it is worse than the current one. The probability of accepting a worse solution is governed by a "cooling" function that reduces this probability over time, allowing the algorithm to temporarily "jump out" of local optima and search for better solutions elsewhere in the search space.
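As an illustration of this acceptance rule, here is a minimal MQL5-style sketch of the Metropolis criterion described above, assuming a minimization problem; the function name and the scaling of MathRand() are illustrative and are not taken from the article's source code:

// Minimal sketch of the SA acceptance step (Metropolis criterion) for minimization.
bool AcceptCandidate(double currentCost, double candidateCost, double temperature)
  {
   if(candidateCost <= currentCost)
      return true;                                  // a better solution is always accepted
   double delta = candidateCost - currentCost;      // how much worse the candidate is
   double p     = MathExp(-delta / temperature);    // acceptance probability shrinks as T falls
   return (MathRand() / 32767.0) < p;               // MathRand() returns an integer in [0..32767]
  }

The cooling schedule then lowers the temperature after each iteration (for example, multiplying it by a constant factor below 1), so the chance of accepting worse solutions gradually vanishes.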

Author: Andrey Dik

 
A good reference book on optimisation algorithms, thanks!
 
fxsaber #:
A good reference book on optimisation algorithms, thank you!

Thank you.

 
Bro phenomenal content, I love how you express the algorithm in such a compact manner that's easy to read at the same time.

Quick question related to the test objective function: how can we create an objective function that returns the historical profit or loss of our expert advisor at its current settings, so that we optimise the expert's parameters for profit? I hope I expressed the question clearly.
 
Gamuchirai Zororo Ndawana #:
Bro phenomenal content, I love how you express the algorithm in such a compact manner that's easy to read at the same time.

Quick question related to the test objective function: how can we create an objective function that returns the historical profit or loss of our expert advisor at its current settings, so that we optimise the expert's parameters for profit? I hope I expressed the question clearly.

If you don't mind digging into fxsaber's somewhat cryptic source code, have a look at this implementation published in fxsaber's blog (it may require a language translation).

Optimization - самостоятельная оптимизация торгового советника (Optimization - independent optimization of a trading Expert Advisor)
  • 2024.03.26
  • www.mql5.com
After my own tick tester appeared, the logical next step was to apply it to a variety of optimization algorithms. In other words, to learn to optimize trading Expert Advisors on my own - without
 
Gamuchirai Zororo Ndawana #:
Phenomenal content, I love how you lay out the algorithm in such a compact manner that is easy to read at the same time.

I would like to ask you a question related to test objective function. How can we create an objective function that will return the historical profit or loss of our EA at its current settings, so we optimise the EA parameters for profit. I hope I have expressed the question clearly.

Thank you for your kind words, glad you like the article. I hope @Stanislav Korotky's comment was helpful to you.

TesterStatistics() may be useful for compiling custom fitness functions for use in OnTester().
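For illustration, a custom pass criterion could look like the following minimal MQL5 sketch; the profit-to-drawdown weighting and the trade-count filter are purely an example, not a recommendation from the article:

// OnTester() is called by the strategy tester at the end of each pass; the value
// it returns is the "Custom max" criterion that the optimizer maximizes.
double OnTester()
  {
   double profit   = TesterStatistics(STAT_PROFIT);            // net profit of the pass
   double trades   = TesterStatistics(STAT_TRADES);            // number of completed trades
   double drawdown = TesterStatistics(STAT_EQUITYDD_PERCENT);  // maximum equity drawdown, %

   if(trades < 10)
      return 0.0;                // discard passes with too few trades to be meaningful
   if(drawdown <= 0.0)
      drawdown = 0.1;            // avoid division by zero when there was no drawdown

   return profit / drawdown;     // reward profit, penalize drawdown
  }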

 

Is there an example of how to implement these algorithms in an EA?

Thank you

 
SergioTForex #:

Is there an example of how to implement these algorithms in an EA?

Thanks

https://www.mql5.com/ru/articles/14183
Using optimization algorithms to configure EA parameters "on the fly"
  • www.mql5.com
The article discusses practical aspects of using optimization algorithms to find the best Expert Advisor parameters "on the fly", as well as virtualization of trading operations and of the EA's logic. The article can be used as a kind of guide for embedding optimization algorithms into a trading Expert Advisor.