Clever Algorithms: Nature-Inspired Programming Recipes

A book by Jason Brownlee




Population-Based Incremental Learning

Population-Based Incremental Learning, PBIL.

Taxonomy

Population-Based Incremental Learning is an Estimation of Distribution Algorithm (EDA), also referred to as a Population Model-Building Genetic Algorithm (PMBGA), an extension to the field of Evolutionary Computation. PBIL is related to other EDAs such as the Compact Genetic Algorithm, the Probabilistic Incremental Program Evolution Algorithm, and the Bayesian Optimization Algorithm. The fact that the algorithm maintains a single prototype vector that is updated competitively shows some relationship to the Learning Vector Quantization algorithm.

Inspiration

Population-Based Incremental Learning is a population-based technique without an inspiration. It is related to the Genetic Algorithm and other Evolutionary Algorithms that are inspired by the biological theory of evolution by means of natural selection.

Strategy

The information processing objective of the PBIL algorithm is to reduce the memory required by the genetic algorithm. This is done by reducing the population of candidate solutions to a single prototype vector of attributes from which candidate solutions can be generated and assessed. Updates and mutation operators are applied to the prototype vector rather than to the generated candidate solutions.
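
As a rough sketch of this idea (the variable names below are illustrative and not part of the algorithm's specification), the population is replaced by a vector of per-bit probabilities from which any number of candidate bitstrings can be sampled on demand:

prototype = Array.new(8) { 0.5 }                       # one probability per bit, initially unbiased
candidate = prototype.map { |p| (rand() < p) ? 1 : 0 } # sample a candidate bitstring from the vector
puts candidate.inspect                                 # e.g. [1, 0, 1, 1, 0, 0, 1, 0]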

Procedure

The Population-Based Incremental Learning algorithm maintains a real-valued prototype vector that represents the probability of each component being expressed in a candidate solution. Algorithm (below) provides a pseudocode listing of the Population-Based Incremental Learning algorithm for maximizing a cost function.

Input: $Bits_{num}$, $Samples_{num}$, $Learn_{rate}$, $P_{mutation}$, $Mutation_{factor}$
Output: $S_{best}$
$V$ $\leftarrow$ InitializeVector{$Bits_{num}$}
$S_{best}$ $\leftarrow$ $\emptyset$
While ($\neg$StopCondition())
    $S_{current}$ $\leftarrow$ $\emptyset$
    For ($i$ To $Samples_{num}$)
        $S_i$ $\leftarrow$ GenerateSamples{$V$}
        If (Cost{$S_i$} $\geq$ Cost{$S_{current}$})
            $S_{current}$ $\leftarrow$ $S_i$
            If (Cost{$S_i$} $\geq$ Cost{$S_{best}$})
                $S_{best}$ $\leftarrow$ $S_i$
            End
        End
    End
    For ($S_{bit}^i$ $\in$ $S_{current}$)
        $V_{bit}^i$ $\leftarrow$ $V_{bit}^i$ $\times$ (1.0 $-$ $Learn_{rate}$) $+$ $S_{bit}^i$ $\times$ $Learn_{rate}$
        If (Rand() $<$ $P_{mutation}$)
            $V_{bit}^i$ $\leftarrow$ $V_{bit}^i$ $\times$ (1.0 $-$ $Mutation_{factor}$) $+$ Rand() $\times$ $Mutation_{factor}$
        End
    End
End
Return ($S_{best}$)
Pseudocode for PBIL.
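
As a concrete illustration of the update rule, with $Learn_{rate} = 0.1$, a component $V_{bit}^i = 0.5$ and a winning bit $S_{bit}^i = 1$, the updated value is $0.5 \times (1.0 - 0.1) + 1 \times 0.1 = 0.55$; repeated selection of a '1' in that position pulls the probability toward 1.0. The occasional mutation step blends the component with a uniform random value, which discourages premature convergence of the vector.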

Heuristics

The learning rate, the mutation probability, and the mutation factor are the parameters to configure; the demonstration below uses a low learning rate (0.1), a per-bit mutation probability of $\frac{1}{Bits_{num}}$, and a mutation factor of 0.05, and initializes every component of the prototype vector to 0.5 so that the first samples are unbiased.

Code Listing

Listing (below) provides an example of the Population-Based Incremental Learning algorithm implemented in the Ruby Programming Language. The demonstration problem is a maximizing binary optimization problem called OneMax that seeks a binary string of unity (all '1' bits). The objective function reports only the number of correct bits in a candidate string, not their positions. The implementation is of the simple PBIL algorithm, which updates the prototype vector based on the best candidate solution generated each iteration.

def onemax(vector)
  return vector.inject(0){|sum, value| sum + value}
end

def generate_candidate(vector)
  candidate = {}
  candidate[:bitstring] = Array.new(vector.size)
  vector.each_with_index do |p, i|
    candidate[:bitstring][i] = (rand()<p) ? 1 : 0
  end
  return candidate
end

def update_vector(vector, current, lrate)
  vector.each_with_index do |p, i|
    vector[i] = p*(1.0-lrate) + current[:bitstring][i]*lrate
  end
end

def mutate_vector(vector, current, coefficient, rate)
  vector.each_with_index do |p, i|
    if rand() < rate
      vector[i] = p*(1.0-coefficient) + rand()*coefficient
    end
  end
end

def search(num_bits, max_iter, num_samples, p_mutate, mut_factor, l_rate)
  vector = Array.new(num_bits){0.5}
  best = nil
  max_iter.times do |iter|
    current = nil
    num_samples.times do
      candidate = generate_candidate(vector)
      candidate[:cost] = onemax(candidate[:bitstring])
      current = candidate if current.nil? or candidate[:cost]>current[:cost]
      best = candidate if best.nil? or candidate[:cost]>best[:cost]
    end
    update_vector(vector, current, l_rate)
    mutate_vector(vector, current, mut_factor, p_mutate)
    puts " >iteration=#{iter}, f=#{best[:cost]}, s=#{best[:bitstring]}"
    break if best[:cost] == num_bits
  end
  return best
end

if __FILE__ == $0
  # problem configuration
  num_bits = 64
  # algorithm configuration
  max_iter = 100
  num_samples = 100
  p_mutate = 1.0/num_bits
  mut_factor = 0.05
  l_rate = 0.1
  # execute the algorithm
  best=search(num_bits, max_iter, num_samples, p_mutate, mut_factor, l_rate)
  puts "done! Solution: f=#{best[:cost]}/#{num_bits}, s=#{best[:bitstring]}"
end
Population-Based Incremental Learning in Ruby
Download: pbil.rb.
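
Assuming the listing is saved as pbil.rb, it can be run directly with ruby pbil.rb; each iteration prints the best cost found so far, and the run stops early once all 64 bits of the best solution are set (best[:cost] == num_bits).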

References

Primary Sources

The Population-Based Incremental Learning algorithm was proposed by Baluja in a technical report that describes the base algorithm as well as a number of variants inspired by the Learning Vector Quantization algorithm [Baluja1994].

Learn More

Baluja and Caruana provide an excellent overview of PBIL and compare it to the standard Genetic Algorithm, first released as a technical report [Baluja1995] and later published [Baluja1995a]. Baluja provides a detailed comparison between the Genetic Algorithm and PBIL on a range of problems and scales in another technical report [Baluja1995b]. Greene provided an excellent account of the applicability of PBIL as a practical optimization algorithm [Greene1996]. Höhfeld and Rudolph offer the first theoretical analysis of the technique, including a convergence proof [Hohfeld1997].

Bibliography

[Baluja1994] S. Baluja, "Population-Based Incremental Learning: A Method for Integrating Genetic Search Based Function Optimization and Competitive Learning", Technical Report CMU-CS-94-163, School of Computer Science, Carnegie Mellon University, 1994.
[Baluja1995] S. Baluja and R. Caruana, "Removing the Genetics from the Standard Genetic Algorithm", Technical Report CMU-CS-95-141, School of Computer Science, Carnegie Mellon University, 1995.
[Baluja1995a] S. Baluja and R. Caruana, "Removing the Genetics from the Standard Genetic Algorithm", in Proceedings of the International Conference on Machine Learning, 1995.
[Baluja1995b] S. Baluja, "An Empirical Comparison of Seven Iterative and Evolutionary Function Optimization Heuristics", Technical Report CMU-CS-95-193, School of Computer Science, Carnegie Mellon University, 1995.
[Greene1996] J. R. Greene, "Population-based incremental learning as a simple versatile tool for engineering optimization", in Proceedings of the First International Conference on Evolutionary Computation and Its Applications, 1996.
[Hohfeld1997] M. Höhfeld and G. Rudolph, "Towards a theory of population based incremental learning", in Proceedings of the IEEE Conference on Evolutionary Computation, 1997.


