# Evolution Strategy

## General Remarks

Evolution Strategies (ES) are a class of optimization algorithms that rely on analogies to natural processes, namely natural evolution. They solve the problem:

min $f_{obj}(\mathbf{x})$

subject to $\mathbf{x} \in \mathbb{R}^n$

ES are one of the three main streams of the so-called Evolutionary Algorithms for optimization; the other two streams are Evolutionary Programming and Genetic Algorithms. The ES algorithm employs a population of candidate solutions rather than a single solution. The population evolves through the optimization procedure by means of selection processes based on the fitness of the individuals, together with recombination and/or mutation operators. ES perform a stochastic search in the space of the design variables and, eventually, the algorithm may be able to identify the global minimum of a problem involving several local minima. ES are gradient-free optimization algorithms.
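As an illustration of the kind of problem just described, the following sketch evaluates a standard multimodal benchmark, the Rastrigin function, which has many local minima and a single global minimum at the origin. The function is a common textbook example chosen here for illustration; it is not taken from the text above.

```python
import numpy as np

def f_obj(x):
    """Rastrigin function: a multimodal benchmark with many local minima
    and a global minimum f(0) = 0, typical of problems ES are applied to."""
    x = np.asarray(x, dtype=float)
    return 10.0 * x.size + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))

print(f_obj(np.zeros(3)))       # global minimum at the origin: 0.0
print(f_obj(np.array([0.5, 0.5])))  # a point away from the optimum
```

A gradient-based method started near one of the local minima would stall there, whereas the stochastic search of ES can, in principle, escape such basins.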

ES have been used extensively in both deterministic and uncertain structural optimization problems.

ES are conceptually simple and easy to implement. One of the strong points of the technique is its versatility: the same algorithm can be applied to problems involving continuous, discrete or mixed design variables with minor changes.

ES have several variants, such as the simple mutation-selection scheme, the multimembered scheme, etc. Nonetheless, all variants follow the same principles. Therefore, only the multimembered ES is described in the following.

## Algorithm

To solve an optimization problem, ES proceed as follows.

1. Generate a set of $\mu$ individuals; each individual is composed of a random realization of the design vector plus some strategy parameters. These individuals constitute the so-called initial population (or parents).
2. Evaluate the fitness of each individual of the initial population, i.e. evaluate the objective function value associated to each member.
3. By means of recombination and mutation rules, generate $\lambda$ individuals (the so-called offspring population). Each of these individuals is composed of a realization of the design vector and some strategy parameters.
4. Evaluate the fitness of each individual of the offspring population.
5. Select the $\mu$ best individuals (i.e., those with the highest fitness) from either (a) the population formed by the parents and the offspring or (b) the offspring population alone; the $\mu$ chosen individuals are denoted as the survivors.
6. Check for convergence with the survivors. If convergence has been achieved, terminate the algorithm; otherwise, replace the parent population with the survivors and go to step 3.
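The steps above can be sketched as a minimal self-adaptive $(\mu + \lambda)$-ES. The objective (a sphere function), the population sizes, the initial step size and the convergence threshold below are illustrative choices, not prescribed by the text; the recombination is intermediate (averaging two random parents) and the strategy parameter is a single step size $\sigma$ per individual, mutated log-normally.

```python
import numpy as np

rng = np.random.default_rng(0)

def f_obj(x):                          # illustrative objective: sphere function
    return float(np.sum(x**2))

mu, lam, n = 10, 40, 5                 # parents, offspring, dimension (example values)
tau = 1.0 / np.sqrt(2.0 * n)           # learning rate for step-size self-adaptation

# Step 1: each individual = design vector x plus a strategy parameter sigma
x_par = rng.uniform(-5.0, 5.0, size=(mu, n))
s_par = np.full(mu, 0.5)

for generation in range(500):
    # Step 3: intermediate recombination (mean of two random parents),
    # then log-normal mutation of sigma and Gaussian mutation of x
    i = rng.integers(0, mu, size=(lam, 2))
    s_off = 0.5 * (s_par[i[:, 0]] + s_par[i[:, 1]]) * np.exp(tau * rng.standard_normal(lam))
    x_off = 0.5 * (x_par[i[:, 0]] + x_par[i[:, 1]]) + s_off[:, None] * rng.standard_normal((lam, n))

    # Steps 4-5: evaluate the offspring and apply (mu + lambda) selection,
    # i.e. keep the mu best individuals from parents and offspring combined
    x_all = np.vstack([x_par, x_off])
    s_all = np.concatenate([s_par, s_off])
    order = np.argsort([f_obj(x) for x in x_all])[:mu]
    x_par, s_par = x_all[order], s_all[order]

    # Step 6: convergence check on the best survivor
    if f_obj(x_par[0]) < 1e-10:
        break

print(f_obj(x_par[0]))                 # best objective value found
```

Using $(\mu, \lambda)$ selection instead, variant (b) of step 5, amounts to building the selection pool from `x_off` and `s_off` only, which discards the parents and can help the step sizes adapt in changing or noisy landscapes.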

In the aforementioned algorithm, two main phases may be recognized. On the one hand, the recombination and mutation steps constitute the stochastic exploration phase of the algorithm: recombination may be seen as a mechanism by which different individuals share their information, while mutation is a random perturbation of an individual. On the other hand, selection is a deterministic step in which the best individuals are chosen to form the new population.
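The two exploration operators admit several concrete forms. A small sketch of the two most common recombination variants (discrete and intermediate) together with a Gaussian mutation; the parent vectors and the step size are arbitrary example values:

```python
import numpy as np

rng = np.random.default_rng(1)

p1 = np.array([1.0, 2.0, 3.0, 4.0])   # two example parent design vectors
p2 = np.array([5.0, 6.0, 7.0, 8.0])

# Discrete recombination: each component is copied from one parent at random
mask = rng.integers(0, 2, size=p1.size).astype(bool)
child_discrete = np.where(mask, p1, p2)

# Intermediate recombination: component-wise average of the parents
child_intermediate = 0.5 * (p1 + p2)

# Mutation: a random Gaussian perturbation scaled by the step size sigma
sigma = 0.1
child_mutated = child_intermediate + sigma * rng.standard_normal(p1.size)

print(child_discrete)
print(child_intermediate)
print(child_mutated)
```

Note that both recombination variants are deterministic functions of a random pairing, whereas mutation injects fresh randomness into every component; it is this perturbation that lets the search escape local minima.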