Overview
This book has presented various algorithms and applications where the optimizer was primarily gradient-based (i.e., the search direction is governed by gradient and/or Hessian information). This chapter introduces an entirely different class of optimization algorithms called evolutionary algorithms (EAs). Evolutionary algorithms imitate the processes of natural selection to build powerful computational methods for finding optimal solutions. Genetic algorithms (GA), simulated annealing (SA), ant colony optimization (ACO), particle swarm optimization (PSO), and tabu search (TS) are some of the popular techniques that fall under the umbrella of evolutionary algorithms.
The motivation for using biologically inspired computational approaches stems from two key observations. First, classical mathematical optimization algorithms face strong limitations when solving complex problems in engineering, computing, and other fields. Common challenges in these areas include the lack of mathematical models describing the underlying physical phenomena, discontinuous functions, and high nonlinearity. Second, many complex problems encountered in engineering already exist in nature in some relevant form. Optimization is inherent in nature, such as in the process of adaptation performed by biological organisms in order to survive. Engineers and scientists continue to explore the various efficient problem-solving techniques employed by nature to optimize natural systems.
The relative advantages and limitations of EAs vs. traditional optimization methods are as follows:
Traditional algorithms typically generate a single candidate optimum at each iteration that progresses toward the optimal solution. Evolutionary algorithms generate a population of points at each iteration. The best point in the population approaches an optimal solution.
Traditional algorithms calculate the candidate optimal point at the next iteration by a deterministic computation. EAs usually select the next population by a combination of operations that use random number generators.
Traditional algorithms require gradient and/or Hessian information to proceed, while EAs usually require only function values. As a result, EAs can solve a variety of optimization problems in which the objective function is not smooth or is discontinuous, as the sketch following this list illustrates.[…]
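To make these distinctions concrete, the following is a minimal sketch of a population-based evolutionary algorithm, written here as a simple real-coded genetic algorithm. It is an illustrative example rather than an algorithm from this chapter: the function names, parameters, and the non-smooth test objective are assumptions chosen for the sketch. Note that it maintains a population of points, updates it with stochastic selection, crossover, and mutation operations, and uses only function values (no derivatives).

```python
import random

def objective(x):
    # Illustrative non-smooth, discontinuous test function (an assumption for this sketch).
    return abs(x[0] - 1.0) + abs(x[1] + 2.0) + (5.0 if x[0] * x[1] > 0 else 0.0)

def evolve(objective, bounds, pop_size=30, generations=100,
           mutation_rate=0.2, mutation_scale=0.3, seed=0):
    """Minimize `objective` over box `bounds` with a simple genetic algorithm."""
    rng = random.Random(seed)
    # Initial population: random points sampled within the bounds.
    pop = [[rng.uniform(lo, hi) for (lo, hi) in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=objective)      # rank candidates by function value only
        parents = scored[:pop_size // 2]         # selection: keep the better half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            # Crossover: blend each coordinate of the two parents.
            child = [(ai + bi) / 2.0 for ai, bi in zip(a, b)]
            # Mutation: random perturbation, clipped back into the bounds.
            for i, (lo, hi) in enumerate(bounds):
                if rng.random() < mutation_rate:
                    child[i] += rng.gauss(0.0, mutation_scale * (hi - lo))
                    child[i] = min(max(child[i], lo), hi)
            children.append(child)
        pop = parents + children                 # stochastic update of the whole population
    return min(pop, key=objective)               # best point in the final population

best = evolve(objective, bounds=[(-5.0, 5.0), (-5.0, 5.0)])
print(best, objective(best))
```

Contrast this with a gradient-based method, which would advance a single iterate via a deterministic step computed from derivative information; here the entire population evolves, and randomness in crossover and mutation drives the exploration of the search space.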