Evolution Strategies (ES) are a class of optimization algorithms that rely on analogies to natural processes, namely natural evolution. They address problems of the form: minimize f(x), where f is the objective (fitness) function and x is the vector of design variables.
ES are one of the three main streams of the so-called Evolutionary Algorithms for optimization, the other two being Evolutionary Programming and Genetic Algorithms. Rather than a single candidate solution, the ES algorithm maintains a population of possible solutions. The population evolves through the optimization procedure by means of selection processes based on the fitness of the individuals, together with recombination and/or mutation operators. ES perform a stochastic, gradient-free search in the space of the design variables and, eventually, the algorithm may be able to identify the global minimum of a problem involving several local minima.
ES are conceptually simple and easy to implement. One of the strengths of the technique is its versatility: the same algorithm can be applied, with minor changes, to problems involving continuous, discrete or mixed design variables.
ES have several variants, such as the simple mutation-selection scheme, the multimembered schemes, etc. For a summary of the different existing variants, see e.g. the references listed below. Nonetheless, all variants follow the same principles; therefore, only the multimembered ES are described in the following.
To solve an optimization problem, ES proceed as follows.
- Generate a set of individuals; each individual is composed of a random realization of the design vector plus some strategy parameters. These individuals constitute the so-called initial population (or parents).
- Evaluate the fitness of each individual of the initial population, i.e. evaluate the objective function value associated with each member.
- By means of recombination and mutation rules, generate new individuals (the so-called offspring population). Each of these individuals is composed of a realization of the design vector and some strategy parameters.
- Evaluate the fitness of each individual of the offspring population.
- Select the best individuals (i.e., those with the best fitness) from either (a) the population formed by the parents and offspring together or (b) the offspring population alone; the individuals which are chosen are denoted as the survivors.
- Check for convergence with the survivors. If convergence has been achieved, terminate the algorithm; otherwise, replace the parent population with the survivors and return to the recombination and mutation step.
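The steps above can be sketched as a minimal ES in Python. The sketch below assumes the plus-selection variant (survivors drawn from parents and offspring together), a sphere objective and a fixed mutation step size; the function names, population sizes and bounds are illustrative choices, not part of any reference implementation.

```python
import random

def sphere(x):
    # Illustrative objective to minimize; any fitness function could be used.
    return sum(xi * xi for xi in x)

def evolution_strategy(mu=5, lam=20, dim=3, sigma=0.3, generations=200, seed=0):
    """Minimal (mu + lam)-ES sketch: Gaussian mutation, plus-selection."""
    rng = random.Random(seed)
    # Step 1: initial parent population of random design vectors.
    parents = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(mu)]
    for _ in range(generations):
        # Steps 3-4: generate and evaluate offspring by mutating random parents.
        offspring = []
        for _ in range(lam):
            p = rng.choice(parents)
            offspring.append([xi + rng.gauss(0.0, sigma) for xi in p])
        # Step 5, variant (a): keep the mu best (lowest objective) survivors
        # from the union of parents and offspring.
        pool = parents + offspring
        pool.sort(key=sphere)
        parents = pool[:mu]
    # In place of a convergence check, a fixed generation budget is used here.
    return parents[0]

best = evolution_strategy()
```

A real implementation would also mutate the strategy parameters and test a convergence criterion instead of running a fixed number of generations.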
In the aforementioned algorithm, two main phases may be recognized. On one hand, the recombination and mutation steps constitute the stochastic exploration phase of the algorithm; recombination may be seen as a mechanism in which different individuals share their information, while mutation is a random perturbation. On the other hand, selection is a deterministic step where the best individuals are chosen to form the new population of individuals.
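The two exploration operators can be made concrete. The sketch below assumes intermediate (averaging) recombination and log-normal self-adaptation of a single step-size strategy parameter; both are common choices among many, and the dictionary representation of an individual is purely illustrative.

```python
import math
import random

rng = random.Random(42)

def recombine(parent_a, parent_b):
    # Intermediate recombination: the offspring is the component-wise mean
    # of two parents, for both the design vector and the strategy parameter.
    x = [(a + b) / 2.0 for a, b in zip(parent_a["x"], parent_b["x"])]
    sigma = (parent_a["sigma"] + parent_b["sigma"]) / 2.0
    return {"x": x, "sigma": sigma}

def mutate(ind):
    # Log-normal self-adaptation: perturb the step size first, then use the
    # new step size as the standard deviation of the Gaussian perturbation
    # applied to the design variables.
    n = len(ind["x"])
    tau = 1.0 / math.sqrt(n)  # common learning-rate heuristic
    sigma = ind["sigma"] * math.exp(tau * rng.gauss(0.0, 1.0))
    x = [xi + sigma * rng.gauss(0.0, 1.0) for xi in ind["x"]]
    return {"x": x, "sigma": sigma}

a = {"x": [1.0, 2.0], "sigma": 0.5}
b = {"x": [3.0, 0.0], "sigma": 0.1}
child = mutate(recombine(a, b))
```

Because the step size itself is inherited and mutated, individuals carrying step sizes well suited to the current region of the search space tend to produce fitter offspring, which is how ES adapt their exploration without gradient information.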
- T. Bäck, G. Rudolph and H.P. Schwefel. Evolutionary programming and evolution strategies: Similarities and differences. In D.B. Fogel and W. Atmar, editors, Proceedings of the 2nd Annual Conference on Evolutionary Programming, La Jolla, CA, 1993. Evolutionary Programming Society.
- J. Cai and G. Thierauf. Evolution strategies for solving discrete optimization problems. Advances in Engineering Software, 25(2-3): pp. 177-183, 1996.
- O. Hasançebi. Adaptive evolution strategies in structural optimization: Enhancing their computational performance with applications to large-scale structures. Computers and Structures, 86(1-3): pp. 119-132, 2008.
- M. Papadrakakis and N.D. Lagaros. Reliability-based structural optimization using neural networks and Monte Carlo simulation. Computer Methods in Applied Mechanics and Engineering, 191: pp. 3491-3507, 2002.
- H.-G. Beyer and H.-P. Schwefel. Evolution strategies - A comprehensive introduction. Natural Computing, 1(1): pp. 3-52, 2002.
- F. Hoffmeister and T. Bäck. Genetic Algorithms and Evolution Strategies: Similarities and Differences. Department of Computer Science, University of Dortmund, 1992.