Evolution Strategy

General Remarks

Evolution Strategies (ES) are a class of optimization algorithms inspired by natural processes, namely natural evolution. They solve the problem:

\min f_{obj}(\mathbf{x})

subject to

\mathbf{x} \in \mathbb{R}^n

ES constitute one of the three main streams of the so-called Evolutionary Algorithms for optimization; the other two streams are Evolutionary Programming and Genetic Algorithms, respectively [1]. The ES algorithm employs a population of candidate solutions rather than a single solution. The population evolves during the optimization procedure through selection processes based on the fitness of the individuals and through recombination and/or mutation operators. ES perform a stochastic search in the space of the design variables and may thus be able to identify the global minimum of a problem involving several local minima. ES are gradient-free optimization algorithms.

ES have been used extensively in deterministic structural optimization problems (see, e.g. [2][3]) as well as in structural optimization problems involving uncertainty (see, e.g. [4]).

ES are conceptually simple and easy to implement. One of the strong points of the technique is its versatility: the same algorithm can be applied to problems involving continuous, discrete or mixed design variables with minor changes [5].
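
To illustrate this versatility, the sketch below shows one possible mutation operator for a mixed design vector: a Gaussian perturbation for the continuous components and random unit steps for the integer ones. The function name, step probabilities and bounds are illustrative assumptions and are not part of COSSAN.

  import numpy as np

  rng = np.random.default_rng(0)

  def mutate_mixed(x_cont, x_int, sigma=0.1, p_step=0.3, int_bounds=(0, 10)):
      """Mutate a mixed design vector: Gaussian perturbation of the continuous
      part, random +/-1 steps of the integer part (illustrative operators)."""
      # Continuous components: additive Gaussian mutation with step size sigma
      x_cont_new = x_cont + sigma * rng.standard_normal(x_cont.shape)
      # Discrete components: each entry moves by +/-1 with probability p_step
      step = rng.choice([-1, 0, 1], size=x_int.shape,
                        p=[p_step / 2, 1 - p_step, p_step / 2])
      x_int_new = np.clip(x_int + step, *int_bounds)
      return x_cont_new, x_int_new

For a purely continuous problem only the first operation is kept, and for a purely discrete one only the second; switching between variable types therefore requires only minor changes to the algorithm.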

ES have several variants, such as the simple mutation-selection scheme, the multimembered ES, etc. For a summary of the different existing variants, see e.g. [6]. Nonetheless, all variants follow the same principles; therefore, only the multimembered ES are described in the following.

Algorithm

To solve an optimization problem, ES proceed according to the following algorithm.

  1. Generate a set of \mu individuals; each individual is composed of a random realization of the design vector plus some strategy parameters. These individuals constitute the so-called initial population (or parents).
  2. Evaluate the fitness of each individual of the initial population, i.e. evaluate the objective function value associated with each member.
  3. By means of recombination and mutation rules, generate \lambda individuals (the so-called offspring population). Each of these individuals is composed of a realization of the design vector and some strategy parameters.
  4. Evaluate the fitness of each individual of the offspring population.
  5. Select the \mu best individuals (i.e., those with the best fitness) from either (a) the population formed by the parents and the offspring or (b) the offspring population alone; the \mu individuals which are chosen are denoted as the survivors.
  6. Check for convergence with the survivors. If convergence has been achieved, terminate the algorithm; otherwise, replace the parent population with the survivors and return to step 3.

In the aforementioned algorithm, two main phases may be recognized. On one hand, the recombination and mutation steps constitute the stochastic exploration phase of the algorithm; recombination may be seen as a mechanism in which different individuals share their information, while mutation is a random perturbation. On the other hand, selection is a deterministic step where the best individuals are chosen to form the new population of individuals.
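
As an illustration of the complete loop, the following Python sketch implements steps 1-6 for a continuous, unconstrained problem. It uses intermediate recombination, a single self-adaptive mutation strength per individual and a simple convergence test on the spread of the survivor fitness values; all names, default parameter values and operators are assumptions chosen for brevity, i.e. one of many possible (\mu+\lambda)/(\mu,\lambda) realizations rather than the COSSAN implementation.

  import numpy as np

  def evolution_strategy(f_obj, n, mu=10, lam=70, max_iter=200, plus=True, seed=0):
      """Minimal (mu+lambda) / (mu,lambda) ES sketch following steps 1-6 above."""
      rng = np.random.default_rng(seed)
      tau = 1.0 / np.sqrt(2.0 * n)              # learning rate for the mutation strength

      # Step 1: initial population (design vectors plus one strategy parameter each)
      pop_x = rng.uniform(-5.0, 5.0, size=(mu, n))
      pop_s = np.full(mu, 1.0)
      # Step 2: fitness of the initial population
      pop_f = np.array([f_obj(x) for x in pop_x])

      for _ in range(max_iter):
          # Step 3: recombination and mutation produce lambda offspring
          off_x, off_s = np.empty((lam, n)), np.empty(lam)
          for k in range(lam):
              i, j = rng.choice(mu, size=2, replace=False)
              # intermediate recombination of two parents, then mutation of the
              # strategy parameter and of the design vector
              s = 0.5 * (pop_s[i] + pop_s[j]) * np.exp(tau * rng.standard_normal())
              x = 0.5 * (pop_x[i] + pop_x[j]) + s * rng.standard_normal(n)
              off_x[k], off_s[k] = x, s
          # Step 4: fitness of the offspring
          off_f = np.array([f_obj(x) for x in off_x])

          # Step 5: select the mu best from parents+offspring (plus) or offspring only (comma)
          if plus:
              off_x = np.vstack([pop_x, off_x])
              off_s = np.concatenate([pop_s, off_s])
              off_f = np.concatenate([pop_f, off_f])
          best = np.argsort(off_f)[:mu]         # minimization: lowest objective value is best
          pop_x, pop_s, pop_f = off_x[best], off_s[best], off_f[best]

          # Step 6: convergence check on the survivors
          if pop_f.max() - pop_f.min() < 1e-10:
              break

      return pop_x[0], pop_f[0]

  # Usage: minimize the sphere function in five dimensions
  x_best, f_best = evolution_strategy(lambda x: np.sum(x**2), n=5)
  print(x_best, f_best)

The boolean flag plus switches between the two selection schemes of step 5: survivors taken from the parents and offspring together, or from the offspring alone.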

Notes

  1. T. Bäck, G. Rudolph and H.-P. Schwefel. Evolutionary programming and evolution strategies: Similarities and differences. In D.B. Fogel and W. Atmar, editors, Proceedings of the 2nd Annual Conference on Evolutionary Programming, La Jolla, CA, 1993. Evolutionary Programming Society.
  2. J. Cai and G. Thierauf. Evolution strategies for solving discrete optimization problems. Advances in Engineering Software, 25(2-3): pp. 177-183, 1996.
  3. O. Hasancebi. Adaptive evolution strategies in structural optimization: Enhancing their computational performance with applications to large-scale structures. Computers and Structures, 86(1-3): pp. 119-132, 2008.
  4. M. Papadrakakis and N.D. Lagaros. Reliability-based structural optimization using neural networks and Monte Carlo simulation. Computer Methods in Applied Mechanics and Engineering, 191: pp. 3491-3507, 2002.
  5. H.-G. Beyer and H.-P. Schwefel. Evolution strategies - A comprehensive introduction. Natural Computing, 1(1): pp. 3-52, 2002.
  6. F. Hoffmeister and T. Bäck. Genetic Algorithms and Evolution Strategies: Similarities and Differences. Department of Computer Science, University of Dortmund, 1992.

See Also