Clever Algorithms: Nature-Inspired Programming Recipes

A book by Jason Brownlee




Differential Evolution

Differential Evolution, DE.

Taxonomy

Differential Evolution is a Stochastic Direct Search and Global Optimization algorithm, and is an instance of an Evolutionary Algorithm from the field of Evolutionary Computation. It is related to sibling Evolutionary Algorithms such as the Genetic Algorithm, Evolutionary Programming, and Evolution Strategies, and has some similarities with Particle Swarm Optimization.

Strategy

The Differential Evolution algorithm involves maintaining a population of candidate solutions subjected to iterations of recombination, evaluation, and selection. The recombination approach involves the creation of new candidate solution components based on the weighted difference between two randomly selected population members added to a third population member. This perturbs population members relative to the spread of the broader population. In conjunction with selection, the perturbation effect self-organizes the sampling of the problem space, bounding it to known areas of interest.
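The weighted-difference perturbation at the heart of this strategy can be sketched in a few lines of Ruby; the vectors and the weighting factor of 0.8 here are illustrative assumptions, not values prescribed by the algorithm:

```ruby
# Core DE perturbation: add the weighted difference of two population
# members (x1, x2) to a third (x3). All values here are illustrative.
x1 = [1.0, 2.0]
x2 = [0.5, 1.0]
x3 = [0.0, 0.0]
f  = 0.8  # weighting factor
mutant = x3.each_index.map {|i| x3[i] + f * (x1[i] - x2[i])}
```

When the population is spread widely, the difference vectors are large and the perturbations explore broadly; as the population converges, the perturbations shrink accordingly.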

Procedure

Differential Evolution has a specialized nomenclature that describes the adopted configuration. This takes the form of DE/x/y/z, where x represents the solution to be perturbed (such as random or best). The y signifies the number of difference vectors used in the perturbation of x, where a difference vector is the difference between two randomly selected but distinct members of the population. Finally, z signifies the recombination operator performed, such as bin for binomial and exp for exponential.
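The code listing later in this section uses the bin operator. For contrast, the exp operator can be sketched as follows; this is an illustrative sketch (the function name and structure are assumptions), in which donor components are copied in a contiguous run, starting at a random position, for as long as successive random draws stay below the crossover rate:

```ruby
# Sketch of the 'exp' (exponential) recombination operator: copy a
# contiguous (wrapping) run of donor components into a copy of the
# target, starting at a random position, continuing while rand() < cr.
def crossover_exp(target, donor, cr)
  n = target.size
  child = target.dup
  i = rand(n)  # random start position
  count = 0
  loop do
    child[i % n] = donor[i % n]
    i += 1
    count += 1
    break if count >= n || rand() >= cr
  end
  child  # at least one component is always copied
end
```

Binomial crossover makes an independent decision per component, whereas this operator copies one unbroken run, so large crossover rates in exp tend to inherit whole blocks of the donor vector.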

Algorithm (below) provides a pseudocode listing of the Differential Evolution algorithm for minimizing a cost function, specifically a DE/rand/1/bin configuration. Algorithm (below) provides a pseudocode listing of the NewSample function from the Differential Evolution algorithm.

Input: $Population_{size}$, $Problem_{size}$, $Weighting_{factor}$, $Crossover_{rate}$
Output: $S_{best}$
Population $\leftarrow$ InitializePopulation{$Population_{size}$, $Problem_{size}$}
EvaluatePopulation{Population}
$S_{best}$ $\leftarrow$ GetBestSolution{Population}
While ( $\neg$ StopCondition())
    NewPopulation $\leftarrow \emptyset$
    For ($P_{i}$ $\in$ Population)
        $S_{i}$ $\leftarrow$ NewSample{$P_{i}$, Population, $Problem_{size}$, $Weighting_{factor}$, $Crossover_{rate}$}
        If (Cost{$S_{i}$} $\leq$ Cost{$P_{i}$})
            NewPopulation $\leftarrow$ NewPopulation $\cup$ $S_{i}$
        Else
            NewPopulation $\leftarrow$ NewPopulation $\cup$ $P_{i}$
        End
    End
    Population $\leftarrow$ NewPopulation
    EvaluatePopulation{Population}
    $S_{best}$ $\leftarrow$ GetBestSolution{Population}
End
Return ($S_{best}$)
Pseudocode for Differential Evolution.
Input: $P_{0}$, Population, NP, F, CR
Output: $S$
Repeat
    $P_{1}$ $\leftarrow$ RandomMember{Population}
Until $P_{1}$ $\neq$ $P_{0}$
Repeat
    $P_{2}$ $\leftarrow$ RandomMember{Population}
Until $P_{2}$ $\neq$ $P_{0}$ $\wedge$ $P_{2}$ $\neq$ $P_{1}$
Repeat
    $P_{3}$ $\leftarrow$ RandomMember{Population}
Until $P_{3}$ $\neq$ $P_{0}$ $\wedge$ $P_{3}$ $\neq$ $P_{1}$ $\wedge$ $P_{3}$ $\neq$ $P_{2}$
CutPoint $\leftarrow$ RandomPosition{NP}
$S$ $\leftarrow$ $0$
For ($i=1$ To NP)
    If ($i \equiv$ CutPoint $\vee$ Rand() $<$ CR)
        $S_{i}$ $\leftarrow$ $P_{3_i}$ + F $\times$ ($P_{1_i}$ - $P_{2_i}$)
    Else
        $S_{i}$ $\leftarrow$ $P_{0_i}$
    End
End
Return ($S$)
Pseudocode for the NewSample function.
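The three Repeat-Until loops above implement rejection sampling of the donor indices. A minimal stand-alone Ruby sketch of that step (the function name is an assumption; the full listing below performs the same job in select_parents):

```ruby
# Redraw random indices until the three donors are pairwise distinct
# and also differ from the current member's index p0.
# Requires a population of at least four members to terminate.
def distinct_donors(pop_size, p0)
  p1 = p2 = p3 = p0
  p1 = rand(pop_size) while p1 == p0
  p2 = rand(pop_size) while p2 == p0 || p2 == p1
  p3 = rand(pop_size) while p3 == p0 || p3 == p1 || p3 == p2
  [p1, p2, p3]
end
```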

Heuristics

Differential Evolution was designed for nonlinear, non-differentiable, continuous function optimization.
The weighting factor $F$ controls the amplification of differential variation and is commonly set to $0.8$.
The crossover rate $CR$ controls the probability that a component is taken from the perturbed vector and is commonly set to a high value such as $0.9$.
The population size is commonly set to about ten times the dimensionality of the problem.
The configuration in the code listing below ($F=0.8$, $CR=0.9$, a population of $10\times n$) follows these defaults.

Code Listing

Listing (below) provides an example of the Differential Evolution algorithm implemented in the Ruby Programming Language. The demonstration problem is an instance of a continuous function optimization problem that seeks $\min f(x)$ where $f=\sum_{i=1}^n x_{i}^2$, $-5.0\leq x_i \leq 5.0$ and $n=3$. The optimal solution for this basin function is $(v_0,\ldots,v_{n-1})=0.0$. The algorithm is an implementation of Differential Evolution with the DE/rand/1/bin configuration proposed by Storn and Price [Storn1997].

def objective_function(vector)
  return vector.inject(0.0) {|sum, x| sum + (x ** 2.0)}
end

def random_vector(minmax)
  return Array.new(minmax.size) do |i|
    minmax[i][0] + ((minmax[i][1] - minmax[i][0]) * rand())
  end
end

def de_rand_1_bin(p0, p1, p2, p3, f, cr, search_space)
  sample = {:vector=>Array.new(p0[:vector].size)}
  cut = rand(sample[:vector].size-1) + 1
  sample[:vector].each_index do |i|
    sample[:vector][i] = p0[:vector][i]
    if (i==cut or rand() < cr)
      v = p3[:vector][i] + f * (p1[:vector][i] - p2[:vector][i])
      v = search_space[i][0] if v < search_space[i][0]
      v = search_space[i][1] if v > search_space[i][1]
      sample[:vector][i] = v
    end
  end
  return sample
end

def select_parents(pop, current)
  p1, p2, p3 = rand(pop.size), rand(pop.size), rand(pop.size)
  p1 = rand(pop.size) until p1 != current
  p2 = rand(pop.size) until p2 != current and p2 != p1
  p3 = rand(pop.size) until p3 != current and p3 != p1 and p3 != p2
  return [p1,p2,p3]
end

def create_children(pop, minmax, f, cr)
  children = []
  pop.each_with_index do |p0, i|
    p1, p2, p3 = select_parents(pop, i)
    children << de_rand_1_bin(p0, pop[p1], pop[p2], pop[p3], f, cr, minmax)
  end
  return children
end

def select_population(parents, children)
  return Array.new(parents.size) do |i|
    (children[i][:cost]<=parents[i][:cost]) ? children[i] : parents[i]
  end
end

def search(max_gens, search_space, pop_size, f, cr)
  pop = Array.new(pop_size) {|i| {:vector=>random_vector(search_space)}}
  pop.each{|c| c[:cost] = objective_function(c[:vector])}
  best = pop.sort{|x,y| x[:cost] <=> y[:cost]}.first
  max_gens.times do |gen|
    children = create_children(pop, search_space, f, cr)
    children.each{|c| c[:cost] = objective_function(c[:vector])}
    pop = select_population(pop, children)
    pop.sort!{|x,y| x[:cost] <=> y[:cost]}
    best = pop.first if pop.first[:cost] < best[:cost]
    puts " > gen #{gen+1}, fitness=#{best[:cost]}"
  end
  return best
end

if __FILE__ == $0
  # problem configuration
  problem_size = 3
  search_space = Array.new(problem_size) {|i| [-5.0, +5.0]}
  # algorithm configuration
  max_gens = 200
  pop_size = 10*problem_size
  weightf = 0.8
  crossf = 0.9
  # execute the algorithm
  best = search(max_gens, search_space, pop_size, weightf, crossf)
  puts "done! Solution: f=#{best[:cost]}, s=#{best[:vector].inspect}"
end
Differential Evolution in Ruby
Download: differential_evolution.rb.
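The nomenclature in the Procedure section admits other configurations, such as those based on the best member of the population. A compressed, self-contained sketch of a DE/best/1/bin run on the same sphere function could look like the following; all names and settings here are assumptions for illustration, and the donor selection is simplified (it does not enforce the distinctness used in select_parents above):

```ruby
# Illustrative DE/best/1/bin: the base vector is the current best
# member rather than a randomly selected one.
def sphere(v)
  v.inject(0.0) {|s, x| s + x * x}
end

def best_1_bin(best, p0, a, b, f, cr, bounds)
  n = p0.size
  cut = rand(n)  # guarantee at least one perturbed component
  Array.new(n) do |i|
    if i == cut || rand() < cr
      v = best[i] + f * (a[i] - b[i])
      v = bounds[0] if v < bounds[0]  # clamp to the search space
      v = bounds[1] if v > bounds[1]
      v
    else
      p0[i]
    end
  end
end

bounds = [-5.0, 5.0]
pop = Array.new(30) { Array.new(3) { bounds[0] + rand() * (bounds[1] - bounds[0]) } }
200.times do
  best = pop.min_by {|v| sphere(v)}
  pop = pop.map do |p0|
    a, b = pop.sample(2)                      # simplified donor choice
    child = best_1_bin(best, p0, a, b, 0.8, 0.9, bounds)
    sphere(child) <= sphere(p0) ? child : p0  # greedy one-to-one selection
  end
end
puts "best cost=#{sphere(pop.min_by {|v| sphere(v)})}"
```

Basing the perturbation on the best member typically speeds convergence on unimodal problems such as this one, at the cost of a greater risk of premature convergence on multimodal problems.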

References

Primary Sources

The Differential Evolution algorithm was presented by Storn and Price in a technical report that considered DE1 and DE2 variants of the approach applied to a suite of continuous function optimization problems [Storn1995]. An early paper by Storn applied the approach to the optimization of an IIR-filter (Infinite Impulse Response) [Storn1996a]. A second early paper applied the approach to a second suite of benchmark problem instances, adopting the contemporary nomenclature for describing the approach, including the DE/rand/1/* and DE/best/2/* variations [Storn1996b]. The early work including technical reports and conference papers by Storn and Price culminated in a seminal journal article [Storn1997].

Learn More

A classical overview of Differential Evolution was presented by Price and Storn [Price1997], and a terse introduction to the approach for function optimization is presented by Storn [Storn1996]. A seminal extended description of the algorithm with sample applications was presented by Price as a book chapter [Price1999]. Price, Storn, and Lampinen released a contemporary book dedicated to Differential Evolution including theory, benchmarks, sample code, and numerous application demonstrations [Price2005]. Chakraborty also released a book considering extensions to address complexities such as rotation invariance and stopping criteria [Chakraborty2008].

Bibliography

[Chakraborty2008] U. K. Chakraborty, "Advances in Differential Evolution", Springer, 2008.
[Price1997] K. Price and R. Storn, "Differential Evolution: Numerical Optimization Made Easy", Dr. Dobb's Journal, 1997.
[Price1999] K. V. Price, "An introduction to differential evolution", in New Ideas in Optimization, pages 79–108, McGraw-Hill Ltd., UK, 1999.
[Price2005] K. V. Price and R. M. Storn and J. A. Lampinen, "Differential evolution: A practical approach to global optimization", Springer, 2005.
[Storn1995] R. Storn and K. Price, "Differential Evolution: A Simple and Efficient Adaptive Scheme for Global Optimization over Continuous Spaces", Technical Report TR-95-012, International Computer Science Institute, Berkeley, CA, 1995.
[Storn1996] R. Storn, "On the Usage of Differential Evolution for Function Optimization", in Proceedings of the 1996 Biennial Conference of the North American Fuzzy Information Processing Society, 1996.
[Storn1996a] R. Storn, "Differential evolution design of an IIR-filter", in Proceedings IEEE Conference Evolutionary Computation, 1996.
[Storn1996b] R. Storn and K. Price, "Minimizing the real functions of the ICEC'96 contest by differential evolution", in Proceedings of IEEE International Conference on Evolutionary Computation, 1996.
[Storn1997] R. Storn and K. Price, "Differential evolution: A simple and efficient heuristic for global optimization over continuous spaces", Journal of Global Optimization, 1997.


