All (not yet) about Strategy Tester, Optimization and Cloud - page 13

 

Population optimization algorithms: Changing shape, shifting probability distributions and testing on Smart Cephalopod (SC)

In this article, we will look at various types of probability distributions, their properties and their practical implementation as functions in code. When generating random numbers with various types of distributions, one can encounter a number of problems, such as infinitely long tails or probability shifts that appear when dispersion boundaries are imposed. When designing and creating optimization algorithms, there is often a need to shift probability mass relative to the mathematical expectation. The goal of this article is to solve these problems and create working functions for dealing with probabilities for subsequent use in optimization algorithms.
Population optimization algorithms: Changing shape, shifting probability distributions and testing on Smart Cephalopod (SC)
  • www.mql5.com
The article examines the impact of changing the shape of probability distributions on the performance of optimization algorithms. We will conduct experiments using the Smart Cephalopod (SC) test algorithm to evaluate the efficiency of various probability distributions in the context of optimization problems.
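The boundary problem described above can be made concrete with a short sketch. The snippet below is an illustrative Python example (not the article's MQL5 code): simply clipping out-of-range draws piles probability mass onto the boundaries, whereas re-drawing preserves the distribution's shape inside the allowed interval. The helper name `sample_bounded_gaussian` is hypothetical.

```python
import random

def sample_bounded_gaussian(mean, sigma, lo, hi, max_tries=100):
    """Rejection-sample a Gaussian restricted to [lo, hi].

    Clipping out-of-range draws would concentrate probability on the
    boundaries; re-drawing keeps the bell shape inside the interval.
    (A hypothetical helper for illustration, not from the article.)
    """
    for _ in range(max_tries):
        x = random.gauss(mean, sigma)
        if lo <= x <= hi:
            return x
    return min(max(mean, lo), hi)  # fallback: clamp the mean into range

random.seed(1)
xs = [sample_bounded_gaussian(0.0, 1.0, -2.0, 2.0) for _ in range(1000)]
print(all(-2.0 <= x <= 2.0 for x in xs))  # True: every draw stays in bounds
```

Rejection sampling is only one of several possible fixes; the article itself explores reshaping the distribution directly.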
 

Population optimization algorithms: Evolution Strategies, (μ,λ)-ES and (μ+λ)-ES

The name "Evolution Strategies" may be misleading, as researchers might assume it is a general name for the whole class of evolutionary algorithms. However, this is not the case. In fact, it refers to a specific group of algorithms that use ideas of evolution to solve optimization problems.

Population optimization algorithms: Evolution Strategies, (μ,λ)-ES and (μ+λ)-ES
  • www.mql5.com
The article considers a group of optimization algorithms known as Evolution Strategies (ES). They are among the very first population algorithms to use evolutionary principles for finding optimal solutions. We will implement changes to the conventional ES variants and revise the test function and test stand methodology for the algorithms.
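The difference between the two ES variants in the article's title comes down to one selection rule: in (μ+λ)-ES the μ parents compete with the λ offspring for survival (elitist), while in (μ,λ)-ES the parents are discarded every generation. A minimal Python sketch, assuming a real-valued genome and a simple Gaussian mutation (illustrative names, not the article's implementation):

```python
import random

def es_step(parents, lam, mutate, fitness, plus=True):
    """One generation of a simple evolution strategy.

    plus=True  -> (mu+lambda)-ES: parents compete with offspring (elitist).
    plus=False -> (mu,lambda)-ES: parents are discarded each generation.
    """
    mu = len(parents)
    offspring = [mutate(random.choice(parents)) for _ in range(lam)]
    pool = parents + offspring if plus else offspring
    return sorted(pool, key=fitness, reverse=True)[:mu]

# Toy usage: maximize -x^2 (optimum at x = 0) with a 1-D genome.
random.seed(0)
fitness = lambda x: -x * x
mutate = lambda x: x + random.gauss(0.0, 0.3)
pop = [random.uniform(-5, 5) for _ in range(5)]
for _ in range(50):
    pop = es_step(pop, lam=20, mutate=mutate, fitness=fitness, plus=True)
print(abs(pop[0]) < 0.5)  # best individual should be near the optimum
```

With `plus=True` the best solution can never get worse between generations; with `plus=False` it can, which sometimes helps escape local optima at the cost of stability.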
 

Population optimization algorithms: Bacterial Foraging Optimization - Genetic Algorithm (BFO-GA)

The BFO-GA hybrid optimization algorithm combines two different optimization algorithms: the bacterial foraging optimization (BFO) algorithm and the genetic algorithm (GA). This hybrid was created to improve optimization efficiency and overcome some of the shortcomings of each individual algorithm.

BFO (Bacterial Foraging Optimization) is an optimization algorithm inspired by the foraging behavior of bacteria. It was proposed in 2002 by Kevin M. Passino. BFO models bacterial behavior using three main mechanisms: chemotaxis (tumble-and-swim movement), reproduction, and elimination-dispersal. Each bacterium in the algorithm represents a solution to the optimization problem, and food corresponds to the optimal solution. Bacteria move through the search space to find the best food.

The genetic algorithm (GA) is an optimization algorithm inspired by the principles of natural selection and genetics. It was developed by John Holland in the 1970s. GA works with a population of individuals representing solutions to an optimization problem. Individuals undergo crossover (recombining genetic information) and mutation (random changes in genetic information) to create new generations. Over many generations, GA converges toward an optimal solution.
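The hybrid idea described above can be sketched in a few lines: a greedy chemotaxis move supplies the BFO part, and crossover plus mutation supplies the GA part. This is an illustrative Python toy under assumed parameters, not the article's BFO-GA implementation:

```python
import random

def chemotaxis_step(x, fitness, step=0.1):
    """Tumble: try a random direction; keep the move only if it improves."""
    candidate = [xi + step * random.uniform(-1, 1) for xi in x]
    return candidate if fitness(candidate) > fitness(x) else x

def crossover(a, b):
    """Uniform crossover: each gene is taken from either parent."""
    return [ai if random.random() < 0.5 else bi for ai, bi in zip(a, b)]

def mutate(x, rate=0.1, scale=0.2):
    """Gaussian mutation applied to each gene with probability `rate`."""
    return [xi + random.gauss(0.0, scale) if random.random() < rate else xi
            for xi in x]

# Toy hybrid loop: maximize -(x0^2 + x1^2), optimum at the origin.
random.seed(2)
fitness = lambda v: -(v[0] ** 2 + v[1] ** 2)
colony = [[random.uniform(-3, 3) for _ in range(2)] for _ in range(10)]
for _ in range(100):
    colony = [chemotaxis_step(b, fitness) for b in colony]   # BFO part
    colony.sort(key=fitness, reverse=True)
    kids = [mutate(crossover(*random.sample(colony[:5], 2))) for _ in range(5)]
    colony = colony[:5] + kids                               # GA part
best = max(colony, key=fitness)
print(fitness(best) > -0.5)
```

The point of the hybrid is visible even in this toy: chemotaxis alone is a local hill-climber, while crossover among the fittest bacteria lets good "genes" spread through the colony.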

Population optimization algorithms: Bacterial Foraging Optimization - Genetic Algorithm (BFO-GA)
  • www.mql5.com
The article presents a new approach to solving optimization problems by combining ideas from bacterial foraging optimization (BFO) algorithms and techniques used in the genetic algorithm (GA) into a hybrid BFO-GA algorithm. It uses bacterial swarming to globally search for an optimal solution and genetic operators to refine local optima. Unlike the original BFO, bacteria can now mutate and inherit genes.
 

Population optimization algorithms: Micro Artificial immune system (Micro-AIS)

The immune system is an amazing mechanism that plays an important role in protecting our body from external threats. Like an invisible shield, it fights bacteria, viruses and fungi, keeping our body healthy. But what if we could use this powerful mechanism to solve complex optimization and learning problems? This is exactly the approach used in the Artificial Immune System (AIS) optimization method.

The Artificial Immune System (AIS) optimization method was proposed in the 1990s. Early research on this approach dates back to the mid-1980s, with significant contributions by Farmer, Packard and Perelson (1986) and by Bersini and Varela (1990).
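The core AIS mechanism is clonal selection: good solutions ("antibodies") are cloned, the clones are hypermutated (better antibodies mutate less), and the fittest survivors form the next population. The "micro" variant simply keeps this population very small. A minimal Python sketch under those assumptions (names and parameters are illustrative, not the article's code):

```python
import random

def clonal_step(antibodies, affinity, n_clones=3, scale=0.5):
    """One clonal-selection step of a micro-AIS-style loop.

    Each antibody is cloned; clones are hypermutated, with the mutation
    radius growing for worse-ranked antibodies, and the best survivors
    form the next (tiny) population. Illustrative sketch only.
    """
    ranked = sorted(antibodies, key=affinity, reverse=True)
    pool = list(ranked)  # keep the originals (elitism)
    for rank, ab in enumerate(ranked):
        sigma = scale * (rank + 1) / len(ranked)  # worse rank -> bigger jumps
        pool += [ab + random.gauss(0.0, sigma) for _ in range(n_clones)]
    return sorted(pool, key=affinity, reverse=True)[:len(antibodies)]

# Toy usage: maximize -(x - 3)^2 with a micro population of 4 antibodies.
random.seed(3)
affinity = lambda x: -(x - 3.0) ** 2
pop = [random.uniform(-10, 10) for _ in range(4)]
for _ in range(60):
    pop = clonal_step(pop, affinity)
print(abs(pop[0] - 3.0) < 0.5)  # best antibody should sit near x = 3
```

The affinity-dependent mutation radius is what distinguishes clonal selection from a plain evolution strategy: exploitation near good solutions, exploration from bad ones.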

Population optimization algorithms: Micro Artificial immune system (Micro-AIS)
  • www.mql5.com
The article considers an optimization method based on the principles of the body's immune system - Micro Artificial Immune System (Micro-AIS) - a modification of AIS. Micro-AIS uses a simpler model of the immune system and simple immune information processing operations. The article also discusses the advantages and disadvantages of Micro-AIS compared to conventional AIS.
 

A Generic Optimization Formulation (GOF) to Implement Custom Max with Constraints

In general terms, there are two main types of optimization algorithms. The first type is the more classical one, based on computing gradients of all functions involved in the optimization problem (this dates back to Isaac Newton's time). The second type is more recent (since roughly the 1970s) and does not use gradient information at all. In between, there are algorithms that combine the two approaches, but we don't need to address them here. The MetaTrader 5 algorithm called "Fast Genetic based Algorithm" (selectable in the MetaTrader 5 terminal Settings tab) belongs to the second type. This allows us to skip computing gradients for the objective and constraint functions. Moreover, thanks to the gradient-free nature of the MetaTrader 5 algorithm, we were able to account for constraint functions that would not have been tractable with gradient-based algorithms. More on this will be discussed below.

One important point is that the MetaTrader 5 algorithm called "Slow Complete Algorithm" is not actually an optimization algorithm but a brute-force, exhaustive evaluation of all possible combinations of values for all the input variables within the side constraints.
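One common way to implement such a formulation is to collapse all objectives and constraints into the single scalar that "Custom Max" optimizes, using penalties for constraint violations. The hard penalty jumps would break a gradient-based method but are unproblematic for the gradient-free genetic tester. The Python sketch below illustrates the idea with hypothetical names; it is not the article's actual OnTester() code:

```python
def custom_max(objectives, constraints, penalty=1e6):
    """Collapse several objectives and constraints into one scalar.

    objectives:  values to maximize (e.g. profit factor, net profit).
    constraints: (value, limit, sense) triples, sense "<=" or ">=".
    Infeasible parameter sets are pushed far below every feasible one.
    (Illustrative sketch; aggregation schemes vary.)
    """
    score = 1.0
    for obj in objectives:
        score *= max(obj, 0.0)      # multiplicative aggregation (one choice)
    for value, limit, sense in constraints:
        violated = value > limit if sense == "<=" else value < limit
        if violated:
            score -= penalty        # infeasible points rank below feasible
    return score

# Example mirroring the article's problem statement: maximize profit factor,
# net profit and recovery factor, subject to drawdown <= 10% and fewer than
# 5 consecutive losses.
ok = custom_max([1.8, 5000.0, 3.2], [(7.0, 10.0, "<="), (3, 5, "<=")])
bad = custom_max([1.8, 5000.0, 3.2], [(12.0, 10.0, "<="), (3, 5, "<=")])
print(ok > 0 and bad < 0)  # True: violating a constraint sinks the score
```

Multiplying objectives is only one aggregation choice; weighted sums or normalized products are equally valid, and the penalty magnitude just needs to dominate any achievable objective value.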

A Generic Optimization Formulation (GOF) to Implement Custom Max with Constraints
  • www.mql5.com
In this article we will present a way to implement optimization problems with multiple objectives and constraints when selecting "Custom Max" in the Settings tab of the MetaTrader 5 terminal. As an example, the optimization problem could be: maximize profit factor, net profit and recovery factor, such that the drawdown is less than 10%, the number of consecutive losses is less than 5, and the number of trades per week is more than 5.