Genetic Algorithms

From COSSAN Wiki

General Remarks

Genetic Algorithms (GA) are a gradient-free method for optimization. GA attempt to solve the problem:

\begin{align}
&\mbox{min}~~f(\mathbf{x}),~~\mathbf{x}^T=\left\langle x_1,x_2,\ldots,x_n\right\rangle\\
&\mbox{subject to}\\
&g_j(\mathbf{x})=0,~j=1,\ldots,m_e\\
&g_j(\mathbf{x})\leq0,~j=m_e+1,\ldots,m
\end{align}

where \mathbf{x} is the design vector (which can be real-valued, discrete or mixed), f(\mathbf{x}) is the objective function and the g_j(\mathbf{x}) are the equality and inequality constraints.
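In a GA, the constraints g_j are commonly folded into a penalised objective so that a single fitness value can rank feasible and infeasible candidates alike. The following Python sketch shows one such penalty formulation; the function names and the quadratic penalty with weight r are illustrative assumptions, not part of this article or of COSSAN.

```python
# Hedged sketch: fold equality and inequality constraints into a single
# penalised objective F(x) = f(x) + r * (constraint violations).
# The quadratic penalty and the weight r are illustrative choices.

def penalised_objective(f, g_eq, g_ineq, r=1e3):
    """Build F(x) from objective f, equality constraints g_eq (g_j(x) = 0)
    and inequality constraints g_ineq (g_j(x) <= 0)."""
    def F(x):
        viol = sum(g(x) ** 2 for g in g_eq)               # g_j(x) = 0
        viol += sum(max(0.0, g(x)) ** 2 for g in g_ineq)  # g_j(x) <= 0
        return f(x) + r * viol
    return F

# Example: minimise x1^2 + x2^2 subject to x1 + x2 - 1 = 0
F = penalised_objective(lambda x: x[0] ** 2 + x[1] ** 2,
                        g_eq=[lambda x: x[0] + x[1] - 1.0],
                        g_ineq=[])
```

Feasible points are ranked purely by f, while infeasible ones are pushed down the ranking in proportion to their violation.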

GA owe their name to the fact that their functioning is inspired by the rules of natural selection; correspondingly, the adopted language contains many terms borrowed from biology, which need to be suitably redefined to fit the algorithmic context.

GA have been introduced to solve problems whose objective functions do not possess convenient properties such as continuity and differentiability [1] [2] [3] [4]. These algorithms maintain and manipulate a family, or population, of solutions and implement a survival-of-the-fittest strategy in their search for better solutions. This provides implicit as well as explicit parallelism that allows several promising areas of the solution space to be exploited at the same time.

GA differ from most optimization techniques in that they search globally from a population of solutions rather than from a single solution. Every candidate solution is represented by a vector \mathbf{x} of the independent variables, which is coded into a chromosome (a string of numbers, generally a sequence of the binary digits 0 and 1) consisting of as many genes (sub-strings of assigned length) as there are independent variables in the problem.
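The encoding of a real variable into a binary gene can be sketched as follows; the functions, the bit length and the interval bounds are illustrative assumptions, not a prescribed COSSAN encoding.

```python
# Hedged sketch: encode a real-valued design variable x in [lo, hi]
# into an n_bits binary gene, and decode a gene back to a real value.

def encode(x, lo, hi, n_bits):
    """Map x in [lo, hi] to the nearest of 2**n_bits discrete levels."""
    levels = 2 ** n_bits - 1
    k = round((x - lo) / (hi - lo) * levels)
    return format(k, f"0{n_bits}b")

def decode(gene, lo, hi):
    """Map a binary gene back to a real value in [lo, hi]."""
    levels = 2 ** len(gene) - 1
    return lo + int(gene, 2) / levels * (hi - lo)

# A chromosome for x = (x1, x2) is the concatenation of the two genes:
chromosome = encode(0.25, 0.0, 1.0, 8) + encode(0.75, 0.0, 1.0, 8)
```

With 8 bits per gene the decoded value differs from the original by at most half a quantisation step, about (hi - lo)/510.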

Algorithm

In its general form, a GA works through the following steps:

  1. Creation of a random initial population of N_p potential solutions to the problem and evaluation of these individuals in terms of their fitness, i.e. of their corresponding objective function values.
  2. Selection of a pair of individuals as parents.
  3. Crossover of the parents, with generation of two children.
  4. Replacement in the population, so as to keep the population size N_p constant.
  5. Genetic mutation.
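The five steps above can be sketched in a few lines of Python. This is a minimal illustration for a real-valued population on [0, 1]^n, not the COSSAN implementation; the selection, crossover and mutation operators chosen here are illustrative assumptions.

```python
import random

# Hedged sketch of the GA steps: initial population, parent selection,
# one-point crossover, replacement of the worst, and random mutation.

def ga_minimise(f, n, N_p=20, generations=100, p_mut=0.1, seed=0):
    rng = random.Random(seed)
    # Step 1: random initial population of N_p candidate solutions
    pop = [[rng.random() for _ in range(n)] for _ in range(N_p)]
    for _ in range(generations):
        pop.sort(key=f)                       # rank by fitness (lower is better)
        # Step 2: select a pair of parents from the fitter half
        p1, p2 = rng.sample(pop[:N_p // 2], 2)
        # Step 3: one-point crossover, generating two children
        c = rng.randrange(1, n) if n > 1 else 0
        ch1, ch2 = p1[:c] + p2[c:], p2[:c] + p1[c:]
        # Step 5: genetic mutation (random perturbation of one gene)
        for ch in (ch1, ch2):
            if rng.random() < p_mut:
                ch[rng.randrange(n)] = rng.random()
        # Step 4: replacement of the two worst, keeping N_p constant
        pop[-2:] = [ch1, ch2]
    return min(pop, key=f)

best = ga_minimise(lambda x: sum(v * v for v in x), n=2)
```

Because only the worst individuals are replaced, the best individual found so far always survives into the next population, which is the steady-state behaviour described below.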

Every time a new solution \mathbf{x} is proposed by the GA, the objective function is evaluated and a ranking of the individuals in the current population, based on their fitness values, is dynamically updated. This ranking is used in the selection procedure, which is carried out so that, in the long run, the best individuals have a greater probability of being selected as parents, in resemblance to the natural principle of the survival of the fittest. Similarly, the ranking is used in the replacement procedure to decide which of the parents and children should survive into the next population. An algorithm based on these procedures is often referred to as a steady-state GA [5].

When using GA, sufficient genetic diversity among the solutions in the population should be guaranteed. Lack of such diversity reduces the search space spanned by the GA and consequently degrades its optimization performance: mediocre individuals are selected, resulting in premature convergence to a local minimum. Conversely, an excess of genetic diversity, especially in later generations, may also degrade the optimization performance, resulting in very late convergence or no convergence at all.
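The rank-based selection described above can be sketched as follows; the linear rank weighting is an illustrative assumption, one of several possible choices, and favours fit individuals without letting a single one dominate and erode genetic diversity.

```python
import random

# Hedged sketch of rank-based parent selection: individuals are sorted by
# fitness and drawn with probability proportional to their rank, so the
# best individuals are more likely, but not guaranteed, to become parents.

def rank_select(population, fitness, rng=random):
    ranked = sorted(population, key=fitness, reverse=True)  # worst first
    weights = range(1, len(ranked) + 1)     # best individual gets largest weight
    return rng.choices(ranked, weights=weights, k=2)

# Minimising x^2: the individual [0.1] is the fittest of the three
parents = rank_select([[0.1], [0.9], [0.5]], fitness=lambda x: x[0] ** 2)
```

Selection by rank rather than by raw fitness value keeps selection pressure moderate even when one individual's objective value is far better than the rest.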

More details of the GA procedure can be found in [6].

Notes

  1. D.E. Goldberg. Genetic Algorithms in Search, Optimization and Machine Learning. Addison Wesley, 1989. ISBN 0201157675
  2. J.H. Holland. Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence. The MIT Press, 1975. ISBN 0262581116
  3. D. Beasley, D.R. Bull and R.R. Martin. An Overview of Genetic Algorithms: Part 1, Fundamentals. University Computing, 1993, 15(2), pp. 58-69.
  4. D. Beasley, D.R. Bull and R.R. Martin. An Overview of Genetic Algorithms: Part 2, Research Topics. University Computing, 1993, 15(4), pp. 170-181.
  5. S. Martorell, S. Carlos, A. Sanchez, V. Serradell. Constrained optimization of test intervals using a steady-state genetic algorithm. Reliability Engineering and System Safety, 2000, 67(3), pp. 215-232.
  6. P.G. Busacca, M. Marseguerra and E. Zio. Multiobjective optimization and genetic algorithms: application to safety systems. Reliability Engineering and System Safety, 2001, 72(1), pp. 59-74.


See Also