Clever Algorithms: Nature-Inspired Programming Recipes

A book by Jason Brownlee




Compact Genetic Algorithm

Compact Genetic Algorithm, CGA, cGA.

Taxonomy

The Compact Genetic Algorithm is an Estimation of Distribution Algorithm (EDA), also referred to as Probabilistic Model-Building Genetic Algorithms (PMBGA), an extension to the field of Evolutionary Computation. The Compact Genetic Algorithm is the basis for extensions such as the Extended Compact Genetic Algorithm (ECGA). It is related to other EDAs such as the Univariate Marginal Distribution Algorithm, the Population-Based Incremental Learning algorithm, and the Bayesian Optimization Algorithm.

Inspiration

The Compact Genetic Algorithm is a probabilistic technique without an inspiration. It is related to the Genetic Algorithm and other Evolutionary Algorithms that are inspired by the biological theory of evolution by means of natural selection.

Strategy

The information processing objective of the algorithm is to simulate the behavior of a Genetic Algorithm with a much smaller memory footprint (without requiring a population to be maintained). This is achieved by maintaining a vector that specifies the probability that each component will be included in newly generated candidate solutions. Candidate solutions are probabilistically generated from the vector, and the components of the better of two sampled solutions are used to make small adjustments to the probabilities in the vector.
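
As a rough sketch of this strategy in Ruby (the 8-bit problem size, the OneMax-style score, and the 0.05 update step are illustrative assumptions, not part of the formal procedure below), two candidates are sampled from the vector and the vector is nudged toward the better one:

# Rough sketch: sample two candidates from a probability vector and shift
# the vector toward the better one. Sizes and step are illustrative choices.
vector = Array.new(8){0.5}
sample = lambda { vector.map {|p| (rand()<p) ? 1 : 0} }
c1, c2 = sample.call, sample.call
score = lambda {|bits| bits.inject(0){|sum, v| sum + v} }   # OneMax-style score
winner, loser = (score.call(c1) >= score.call(c2)) ? [c1,c2] : [c2,c1]
vector.each_index do |i|
  next if winner[i] == loser[i]                # only conflicting bits are updated
  vector[i] += (winner[i]==1) ? 0.05 : -0.05   # move toward the winning bit value
end

The complete procedure and a full implementation follow in the Procedure and Code Listing sections.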

Procedure

The Compact Genetic Algorithm maintains a real-valued prototype vector that represents the probability of each component being expressed in a candidate solution. Algorithm (below) provides a pseudocode listing of the Compact Genetic Algorithm for maximizing a cost function. The parameter $n$ acts as a simulated population size: wherever the two sampled solutions disagree on a bit, the corresponding probability in the vector is adjusted by $\frac{1}{n}$ in each algorithm iteration (for example, $n=20$ gives an update step of $0.05$).

Input: $Bits_{num}$, $n$
Output: $S_{best}$
$V$ $\leftarrow$ InitializeVector{$Bits_{num}$, 0.5}
$S_{best}$ $\leftarrow$ $\emptyset$
While ($\neg$StopCondition())
    $S_{1}$ $\leftarrow$ GenerateSamples{$V$}
    $S_{2}$ $\leftarrow$ GenerateSamples{$V$}
    $S_{winner}$, $S_{loser}$ $\leftarrow$ SelectWinnerAndLoser{$S_{1}$, $S_{2}$}
    If (Cost{$S_{winner}$} $\geq$ Cost{$S_{best}$})
        $S_{best}$ $\leftarrow$ $S_{winner}$
    End
    For ($i=1$ To $Bits_{num}$)
        If ($S_{winner}^i$ $\neq$ $S_{loser}^i$)
            If ($S_{winner}^i$ $\equiv$ 1)
                $V^i$ $\leftarrow$ $V^i$ $+$ $\frac{1}{n}$
            Else
                $V^i$ $\leftarrow$ $V^i$ $-$ $\frac{1}{n}$
            End
        End
    End
End
Return ($S_{best}$)
Pseudocode for the cGA.

Heuristics

Code Listing

Listing (below) provides an example of the Compact Genetic Algorithm implemented in the Ruby Programming Language. The demonstration problem is a maximizing binary optimization problem called OneMax that seeks a binary string of all '1' bits. The objective function only reports the number of correct bits in a candidate string, not the positions of the correct bits. The implementation uses integer values to represent the 1 and 0 bits of a binary string representation.

# Objective function: count the number of 1 bits in the candidate bitstring.
def onemax(vector)
  return vector.inject(0){|sum, value| sum + value}
end

# Sample a new candidate bitstring from the probability vector.
def generate_candidate(vector)
  candidate = {}
  candidate[:bitstring] = Array.new(vector.size)
  vector.each_with_index do |p, i|
    candidate[:bitstring][i] = (rand()<p) ? 1 : 0
  end
  candidate[:cost] = onemax(candidate[:bitstring])
  return candidate
end

# Shift probabilities by 1/pop_size toward the winner wherever the two samples disagree.
def update_vector(vector, winner, loser, pop_size)
  vector.size.times do |i|
    if winner[:bitstring][i] != loser[:bitstring][i]
      if winner[:bitstring][i] == 1
        vector[i] += 1.0/pop_size.to_f
      else
        vector[i] -= 1.0/pop_size.to_f
      end
    end
  end
end

# Run the cGA: maintain a probability vector instead of a population of solutions.
def search(num_bits, max_iterations, pop_size)
  vector = Array.new(num_bits){0.5}
  best = nil
  max_iterations.times do |iter|
    c1 = generate_candidate(vector)
    c2 = generate_candidate(vector)
    winner, loser = (c1[:cost] > c2[:cost] ? [c1,c2] : [c2,c1])
    best = winner if best.nil? or winner[:cost]>best[:cost]
    update_vector(vector, winner, loser, pop_size)
    puts " >iteration=#{iter}, f=#{best[:cost]}, s=#{best[:bitstring]}"
    break if best[:cost] == num_bits
  end
  return best
end

if __FILE__ == $0
  # problem configuration
  num_bits = 32
  # algorithm configuration
  max_iterations = 200
  pop_size = 20
  # execute the algorithm
  best = search(num_bits, max_iterations, pop_size)
  puts "done! Solution: f=#{best[:cost]}/#{num_bits}, s=#{best[:bitstring]}"
end
Compact Genetic Algorithm in Ruby
Download: compact_genetic_algorithm.rb.
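
The listing stops after a fixed number of iterations or as soon as an all-ones string is sampled. Descriptions of the cGA also commonly terminate once the probability vector itself has converged and then decode the vector directly into a solution. The helpers below are a hedged sketch of that variant; the names converged? and decode and the 0.01 tolerance are illustrative assumptions, not part of the listing above.

# Sketch of a vector-convergence stopping test (illustrative; not from the listing above).
# Returns true when every probability has been driven close to 0 or 1.
def converged?(vector, tolerance=0.01)
  vector.all? {|p| p<=tolerance or p>=(1.0-tolerance)}
end

# Decode a converged vector into a bitstring by rounding each probability.
def decode(vector)
  vector.map {|p| (p>=0.5) ? 1 : 0}
end

In the search function, a break if converged?(vector) test could replace the break if best[:cost]==num_bits line, with decode(vector) providing the final solution once the loop exits.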

References

Primary Sources

The Compact Genetic Algorithm was proposed by Harik, Lobo, and Goldberg in 1999 [Harik1999], based on a random walk model previously introduced by Harik et al. [Harik1997]. In the introductory paper, the cGA is demonstrated to be comparable to the Genetic Algorithm on standard binary string optimization problems.

Learn More

Harik et al. extended the Compact Genetic Algorithm (called the Extended Compact Genetic Algorithm) to generate populations of candidate solutions and perform selection (much like the Univariate Marginal Distribution Algorithm), although it uses Marginal Product Models [Harik1999a] [Harik2006]. Sastry and Goldberg performed further analysis of the Extended Compact Genetic Algorithm, applying the method to a complex optimization problem [Sastry2000].

Bibliography

[Harik1997] G. R. Harik and E. Cantú-Paz and D. E. Goldberg and B. L. Miller, "The gambler's ruin problem, genetic algorithms, and the sizing of populations", in IEEE International Conference on Evolutionary Computation, 1997.
[Harik1999] G. R. Harik and F. G. Lobo and D. E. Goldberg, "The compact genetic algorithm", IEEE Transactions on Evolutionary Computation, 1999.
[Harik1999a] G. R. Harik, "Linkage Learning via Probabilistic Modeling in the Extended Compact Genetic Algorithm (ECGA)", Technical Report 99010, Illinois Genetic Algorithms Laboratory, Department of General Engineering, University of Illinois, 1999.
[Harik2006] G. R. Harik and F. G. Lobo and K. Sastry, "Linkage Learning via Probabilistic Modeling in the Extended Compact Genetic Algorithm (ECGA)", in Scalable Optimization via Probabilistic Modeling, pages 39–61, Springer, 2006.
[Sastry2000] K. Sastry and D. E. Goldberg, "On Extended Compact Genetic Algorithm", in Late Breaking Paper in Genetic and Evolutionary Computation Conference, 2000.


