Clever Algorithms: Nature-Inspired Programming Recipes

A book by Jason Brownlee




Stochastic Hill Climbing

Stochastic Hill Climbing, SHC, Random Hill Climbing, RHC, Random Mutation Hill Climbing, RMHC.

Taxonomy

The Stochastic Hill Climbing algorithm is a Stochastic Optimization algorithm and a Local Optimization algorithm (contrasted with Global Optimization). It is a direct search technique, as it does not require derivatives of the search space. Stochastic Hill Climbing is an extension of deterministic hill climbing algorithms such as Simple Hill Climbing (first-best neighbor) and Steepest-Ascent Hill Climbing (best neighbor), and a parent of approaches such as Parallel Hill Climbing and Random-Restart Hill Climbing.

Strategy

The strategy of the Stochastic Hill Climbing algorithm is to iterate the process of randomly selecting a neighbor of a candidate solution and accepting it only if it results in an improvement. The strategy was proposed to address the limitations of deterministic hill climbing techniques, which are likely to get stuck in local optima due to their greedy acceptance of neighboring moves.

Procedure

Algorithm (below) provides a pseudocode listing of the Stochastic Hill Climbing algorithm for maximizing a cost function, specifically the Random Mutation Hill Climbing algorithm described by Forrest and Mitchell [Forrest1993].

Input: $Iter_{max}$, ProblemSize
Output: Current
Current $\leftarrow$ RandomSolution{ProblemSize}
For ($iter_i \in$ $Iter_{max}$)
    Candidate $\leftarrow$ RandomNeighbor{Current}
    If (Cost{Candidate} $\geq$ Cost{Current})
        Current $\leftarrow$ Candidate
    End
End
Return (Current)
Pseudocode for Stochastic Hill Climbing.

Heuristics

* Stochastic Hill Climbing is a local search technique and, despite its stochastic neighbor selection, may still become trapped in local optima.
* Neighbors with equal or better cost are accepted (note the $\geq$ in the pseudocode), allowing the search to move across plateaus in the response surface.
* The procedure can be restarted from a new random solution once it converges (Random-Restart Hill Climbing), or applied to several candidate solutions at once (Parallel Hill Climbing).
* The technique suits discrete domains with explicit neighborhoods, such as bit strings; the example below uses a single bit-flip neighborhood.

Code Listing

Listing (below) provides an example of the Stochastic Hill Climbing algorithm implemented in the Ruby Programming Language, specifically the Random Mutation Hill Climbing algorithm described by Forrest and Mitchell [Forrest1993]. The algorithm is executed for a fixed number of iterations and is applied to a binary string optimization problem called 'One Max'. The objective of this maximization problem is to prepare a string of all '1' bits, where the cost function reports the number of '1' bits in a given string.

def onemax(vector)
  return vector.inject(0.0){|sum, v| sum + ((v=="1") ? 1 : 0)}
end

def random_bitstring(num_bits)
  return Array.new(num_bits){|i| (rand<0.5) ? "1" : "0"}
end

def random_neighbor(bitstring)
  mutant = Array.new(bitstring)
  pos = rand(bitstring.size)
  mutant[pos] = (mutant[pos]=='1') ? '0' : '1'
  return mutant
end

def search(max_iterations, num_bits)
  candidate = {}
  candidate[:vector] = random_bitstring(num_bits)
  candidate[:cost] = onemax(candidate[:vector])
  max_iterations.times do |iter|
    neighbor = {}
    neighbor[:vector] = random_neighbor(candidate[:vector])
    neighbor[:cost] = onemax(neighbor[:vector])
    candidate = neighbor if neighbor[:cost] >= candidate[:cost]
    puts " > iteration #{(iter+1)}, best=#{candidate[:cost]}"
    break if candidate[:cost] == num_bits
  end
  return candidate
end

if __FILE__ == $0
  # problem configuration
  num_bits = 64
  # algorithm configuration
  max_iterations = 1000
  # execute the algorithm
  best = search(max_iterations, num_bits)
  puts "Done. Best Solution: c=#{best[:cost]}, v=#{best[:vector].join}"
end
Stochastic Hill Climbing in Ruby
Download: stochastic_hill_climbing.rb.
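
The Taxonomy above lists Random-Restart Hill Climbing among the approaches descended from Stochastic Hill Climbing. A minimal sketch of such a restart wrapper around the search function from the listing is shown below; the restart_search name and num_restarts parameter are illustrative and are not part of the book's listing.

def restart_search(num_restarts, max_iterations, num_bits)
  # Repeat the stochastic hill climb from new random starting points and keep
  # the best solution found across all restarts (Random-Restart Hill Climbing).
  best = nil
  num_restarts.times do |i|
    candidate = search(max_iterations, num_bits)
    best = candidate if best.nil? || candidate[:cost] > best[:cost]
    puts "restart #{i+1}, best so far=#{best[:cost]}"
  end
  return best
end

Each restart draws a fresh random bitstring via the search function, so independent runs can escape a poor starting point at the cost of additional objective function evaluations.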

References

Primary Sources

Perhaps the most popular implementation of the Stochastic Hill Climbing algorithm is by Forrest and Mitchell, who proposed the Random Mutation Hill Climbing (RMHC) algorithm (with communication from Richard Palmer) in a study that investigated the behavior of the genetic algorithm on a deceptive class of (discrete) bit-string optimization problems called 'royal road' functions [Forrest1993]. The RMHC was compared to two other hill climbing algorithms in addition to the genetic algorithm, specifically: the Steepest-Ascent Hill Climber, and the Next-Ascent Hill Climber. This study was then followed up by Mitchell and Holland [Mitchell1993].

Juels and Wattenberg were also early to consider stochastic hill climbing as an approach to compare to the genetic algorithm [Juels1994]. Skalak applied the RMHC algorithm to a single long bit-string that represented a number of prototype vectors for use in classification [Skalak1994].

Learn More

The Stochastic Hill Climbing algorithm is related to the genetic algorithm without crossover. Simplified versions of the approach have been investigated for bit-string based optimization problems with the population size of the genetic algorithm reduced to one. The general technique has been investigated under the names Iterated Hillclimbing [Muhlenbein1991], ES(1+1,m,hc) [Muhlenbein1992], Random Bit Climber [Davis1991], and (1+1)-Genetic Algorithm [Back1993]. The main difference between RMHC and ES(1+1) is that the latter uses a fixed probability of mutation for each discrete element of a solution (meaning the neighborhood size is probabilistic), whereas RMHC stochastically modifies only one element.
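
As a rough illustration of this difference, an ES(1+1)-style mutation operator can be written alongside the random_neighbor function from the listing above; the per_bit_mutation name and the 1/L per-bit mutation rate are assumptions made for the sketch, not taken from the cited papers.

def per_bit_mutation(bitstring, rate=1.0/bitstring.size)
  # Flip each bit independently with probability 'rate', so the number of
  # changed bits varies per call; random_neighbor above always flips exactly one bit.
  return bitstring.map {|bit| (rand < rate) ? ((bit=='1') ? '0' : '1') : bit}
end

Substituting such an operator for random_neighbor in the search function would give an ES(1+1)-style climber rather than RMHC.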

Bibliography

[Back1993] T. Bäck, "Optimal Mutation Rates in Genetic Search", in Proceedings of the Fifth International Conference on Genetic Algorithms, 1993.
[Davis1991] L. Davis, "Bit-Climbing, Representational Bias, and Test Suite Design", in Proceedings of the Fourth International Conference on Genetic Algorithms, 1991.
[Forrest1993] S. Forrest and M. Mitchell, "Relative Building-Block Fitness and the Building-Block Hypothesis", in Foundations of Genetic Algorithms 2, 1993.
[Juels1994] A. Juels and M. Wattenberg, "Stochastic Hill Climbing as a Baseline Method for Evaluating Genetic Algorithms", University of California, Berkeley, 1994.
[Mitchell1993] M. Mitchell and J. H. Holland, "When Will a Genetic Algorithm Outperform Hill Climbing?", in Proceedings of the Fifth International Conference on Genetic Algorithms, 1993.
[Muhlenbein1991] H. Mühlenbein, "Evolution in Time and Space - The Parallel Genetic Algorithm", in Foundations of Genetic Algorithms, 1991.
[Muhlenbein1992] H. Mühlenbein, "How Genetic Algorithms Really Work: I. Mutation and Hillclimbing", in Parallel Problem Solving from Nature 2, 1992.
[Skalak1994] D. B. Skalak, "Prototype and Feature Selection by Sampling and Random Mutation Hill Climbing Algorithms", in Proceedings of the Eleventh International Conference on Machine Learning, 1994.


