Scatter Search, SS.
Scatter search is a Metaheuristic and a Global Optimization algorithm. It is also sometimes associated with the field of Evolutionary Computation given the use of a population and recombination in the structure of the technique. Scatter Search is a sibling of Tabu Search, developed by the same author and based on similar origins.
The objective of Scatter Search is to maintain a set of diverse and high-quality candidate solutions. The principle of the approach is that useful information about the global optima is stored in a diverse and elite set of solutions (the reference set), and that recombining samples from the set can exploit this information. The strategy involves an iterative process in which a population of diverse and high-quality candidate solutions is partitioned into subsets and linearly recombined to create weighted centroids of sample-based neighborhoods. The results of recombination are refined using an embedded heuristic and assessed in the context of the reference set as to whether or not they are retained.
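To make the recombination idea concrete, the short sketch below (an illustrative assumption, not code from the original listing; the helper name linear_recombine and the fixed weights are invented for this example) linearly recombines two real-valued reference solutions by sampling points before, between, and beyond them along the connecting line, which is one way to realize the weighted centroids described above.

```ruby
# Illustrative sketch (assumed helper, not part of the book's listing):
# linearly recombine two real-valued solutions a and b by sampling
# points along the line through them, before, between, and beyond
# the parents, yielding a weighted-centroid style neighborhood.
def linear_recombine(a, b, weights = [-0.5, 0.5, 1.5])
  weights.map do |w|
    a.each_index.map { |i| a[i] + w * (b[i] - a[i]) }
  end
end

a, b = [0.0, 0.0], [1.0, 2.0]
p linear_recombine(a, b)
# => [[-0.5, -1.0], [0.5, 1.0], [1.5, 3.0]]
```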
Algorithm (below) provides a pseudocode listing of the Scatter Search algorithm for minimizing a cost function. The procedure is based on the abstract form presented by Glover as a template for the general class of technique [Glover1998a], with influences from an application of the technique to function optimization by Glover [Glover2003b].
Input: $DiverseSet_{size}$, $ReferenceSet_{size}$
Output: ReferenceSet
InitialSet $\leftarrow$ ConstructInitialSolution($DiverseSet_{size}$)
RefinedSet $\leftarrow \emptyset$
For ($S_{i}$ $\in$ InitialSet)
    RefinedSet $\leftarrow$ LocalSearch($S_{i}$)
End
ReferenceSet $\leftarrow$ SelectInitialReferenceSet($ReferenceSet_{size}$)
While ($\neg$ StopCondition())
    Subsets $\leftarrow$ SelectSubset(ReferenceSet)
    CandidateSet $\leftarrow \emptyset$
    For ($Subset_{i}$ $\in$ Subsets)
        RecombinedCandidates $\leftarrow$ RecombineMembers($Subset_{i}$)
        For ($S_{i}$ $\in$ RecombinedCandidates)
            CandidateSet $\leftarrow$ LocalSearch($S_{i}$)
        End
    End
    ReferenceSet $\leftarrow$ Select(ReferenceSet, CandidateSet, $ReferenceSet_{size}$)
End
Return (ReferenceSet)
The following heuristics apply to the use of the technique:
- Small set sizes are preferred for the ReferenceSet, such as 10 or 20 members.
- The selection of members for the ReferenceSet at the end of each iteration favors solutions with higher quality and may also promote diversity (see the sketch after this list).
- The ReferenceSet may be updated at the end of an iteration, or dynamically as candidates are created (a so-called steady-state population in some evolutionary computation literature).
- A lack of changes to the ReferenceSet may be used as a signal to stop the current search, and potentially restart the search with a newly initialized ReferenceSet.
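As one way to realize the quality-and-diversity selection heuristic above, the sketch below rebuilds a reference set from a candidate pool by first taking the best-cost members and then filling the remaining slots with the candidates farthest from those already selected. The helper names (update_reference_set, distance_to_set) and the elite/diversity split are illustrative assumptions; the full listing later in this section uses a simpler quality-only replacement rule inside its main loop.

```ruby
# Hypothetical sketch of an end-of-iteration ReferenceSet update that
# favors quality but also promotes diversity (not part of the book's listing).
def distance_to_set(vector, set)
  set.map do |m|
    Math.sqrt(vector.each_index.inject(0.0) { |s, i| s + (vector[i] - m[:vector][i])**2 })
  end.min
end

def update_reference_set(ref_set, candidates, ref_set_size, num_elite)
  pool = (ref_set + candidates).uniq { |c| c[:vector] }
  pool.sort_by! { |c| c[:cost] }
  new_set = pool.first(num_elite)   # keep the highest-quality members
  remainder = pool - new_set
  until new_set.size == ref_set_size || remainder.empty?
    # add the remaining candidate farthest from the current set
    best = remainder.max_by { |c| distance_to_set(c[:vector], new_set) }
    new_set << best
    remainder.delete(best)
  end
  new_set
end
```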
Listing (below) provides an example of the Scatter Search algorithm implemented in the Ruby Programming Language. The example problem is an instance of a continuous function optimization that seeks $\min f(x)$ where $f=\sum_{i=1}^n x_{i}^2$, $-5.0\leq x_i \leq 5.0$ and $n=3$. The optimal solution for this basin function is $(v_1,\ldots,v_{n})=0.0$.
The algorithm is an implementation of Scatter Search as described in an application of the technique to unconstrained non-linear optimization by Glover [Glover2003b]. The seeds for initial solutions are generated as random vectors, as opposed to stratified samples. The example was further simplified by not including a restart strategy and by excluding diversity maintenance in the ReferenceSet. A stochastic local search algorithm is used as the embedded heuristic, with a stochastic step size in the range of half a percent of the search space.
```ruby
def objective_function(vector)
  return vector.inject(0) {|sum, x| sum + (x ** 2.0)}
end

def rand_in_bounds(min, max)
  return min + ((max-min) * rand())
end

def random_vector(minmax)
  return Array.new(minmax.size) do |i|
    rand_in_bounds(minmax[i][0], minmax[i][1])
  end
end

def take_step(minmax, current, step_size)
  position = Array.new(current.size)
  position.size.times do |i|
    min = [minmax[i][0], current[i]-step_size].max
    max = [minmax[i][1], current[i]+step_size].min
    position[i] = rand_in_bounds(min, max)
  end
  return position
end

def local_search(best, bounds, max_no_improv, step_size)
  count = 0
  begin
    candidate = {:vector=>take_step(bounds, best[:vector], step_size)}
    candidate[:cost] = objective_function(candidate[:vector])
    count = (candidate[:cost] < best[:cost]) ? 0 : count+1
    best = candidate if candidate[:cost] < best[:cost]
  end until count >= max_no_improv
  return best
end

def construct_initial_set(bounds, set_size, max_no_improv, step_size)
  diverse_set = []
  begin
    cand = {:vector=>random_vector(bounds)}
    cand[:cost] = objective_function(cand[:vector])
    cand = local_search(cand, bounds, max_no_improv, step_size)
    diverse_set << cand if !diverse_set.any? {|x| x[:vector]==cand[:vector]}
  end until diverse_set.size == set_size
  return diverse_set
end

def euclidean_distance(c1, c2)
  sum = 0.0
  c1.each_index {|i| sum += (c1[i]-c2[i])**2.0}
  return Math.sqrt(sum)
end

def distance(v, set)
  return set.inject(0){|s,x| s + euclidean_distance(v, x[:vector])}
end

def diversify(diverse_set, num_elite, ref_set_size)
  diverse_set.sort!{|x,y| x[:cost] <=> y[:cost]}
  ref_set = Array.new(num_elite){|i| diverse_set[i]}
  remainder = diverse_set - ref_set
  remainder.each{|c| c[:dist] = distance(c[:vector], ref_set)}
  remainder.sort!{|x,y| y[:dist]<=>x[:dist]}
  ref_set = ref_set + remainder.first(ref_set_size-ref_set.size)
  return [ref_set, ref_set[0]]
end

def select_subsets(ref_set)
  additions = ref_set.select{|c| c[:new]}
  remainder = ref_set - additions
  remainder = additions if remainder.nil? or remainder.empty?
  subsets = []
  additions.each do |a|
    remainder.each{|r| subsets << [a,r] if a!=r && !subsets.include?([r,a])}
  end
  return subsets
end

def recombine(subset, minmax)
  a, b = subset
  d = Array.new(a[:vector].size) {|i|(b[:vector][i]-a[:vector][i])/2.0}
  children = []
  subset.each do |p|
    direction, r = ((rand<0.5) ? +1.0 : -1.0), rand
    child = {:vector=>Array.new(minmax.size)}
    child[:vector].each_index do |i|
      child[:vector][i] = p[:vector][i] + (direction * r * d[i])
      child[:vector][i]=minmax[i][0] if child[:vector][i]<minmax[i][0]
      child[:vector][i]=minmax[i][1] if child[:vector][i]>minmax[i][1]
    end
    child[:cost] = objective_function(child[:vector])
    children << child
  end
  return children
end

def explore_subsets(bounds, ref_set, max_no_improv, step_size)
  was_change = false
  subsets = select_subsets(ref_set)
  ref_set.each{|c| c[:new] = false}
  subsets.each do |subset|
    candidates = recombine(subset, bounds)
    improved = Array.new(candidates.size) do |i|
      local_search(candidates[i], bounds, max_no_improv, step_size)
    end
    improved.each do |c|
      if !ref_set.any? {|x| x[:vector]==c[:vector]}
        c[:new] = true
        ref_set.sort!{|x,y| x[:cost] <=> y[:cost]}
        if c[:cost] < ref_set.last[:cost]
          ref_set.delete(ref_set.last)
          ref_set << c
          puts " >> added, cost=#{c[:cost]}"
          was_change = true
        end
      end
    end
  end
  return was_change
end

def search(bounds, max_iter, ref_set_size, div_set_size, max_no_improv, step_size, max_elite)
  diverse_set = construct_initial_set(bounds, div_set_size, max_no_improv, step_size)
  ref_set, best = diversify(diverse_set, max_elite, ref_set_size)
  ref_set.each{|c| c[:new] = true}
  max_iter.times do |iter|
    was_change = explore_subsets(bounds, ref_set, max_no_improv, step_size)
    ref_set.sort!{|x,y| x[:cost] <=> y[:cost]}
    best = ref_set.first if ref_set.first[:cost] < best[:cost]
    puts " > iter=#{(iter+1)}, best=#{best[:cost]}"
    break if !was_change
  end
  return best
end

if __FILE__ == $0
  # problem configuration
  problem_size = 3
  bounds = Array.new(problem_size) {|i| [-5, +5]}
  # algorithm configuration
  max_iter = 100
  step_size = (bounds[0][1]-bounds[0][0])*0.005
  max_no_improv = 30
  ref_set_size = 10
  diverse_set_size = 20
  no_elite = 5
  # execute the algorithm
  best = search(bounds, max_iter, ref_set_size, diverse_set_size, max_no_improv, step_size, no_elite)
  puts "Done. Best Solution: c=#{best[:cost]}, v=#{best[:vector].inspect}"
end
```
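Since the listing above omits the restart strategy mentioned earlier, the following sketch shows one way it might be layered on top: when a run of search stalls (its main loop exits because the ReferenceSet stopped changing), the search is simply restarted from a freshly constructed diverse set, keeping the best solution seen across runs. The wrapper name search_with_restarts and the max_restarts parameter are assumptions for illustration, not part of the original example.

```ruby
# Hypothetical restart wrapper around the search function above (an assumption,
# not part of the book's listing): rerun the search from a new diverse set a
# fixed number of times and keep the best solution found across all runs.
def search_with_restarts(bounds, max_iter, ref_set_size, div_set_size,
                         max_no_improv, step_size, max_elite, max_restarts)
  overall_best = nil
  max_restarts.times do |run|
    best = search(bounds, max_iter, ref_set_size, div_set_size,
                  max_no_improv, step_size, max_elite)
    overall_best = best if overall_best.nil? || best[:cost] < overall_best[:cost]
    puts "restart #{run+1}, overall best=#{overall_best[:cost]}"
  end
  return overall_best
end
```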
A form of the Scatter Search algorithm was proposed by Glover for integer programming [Glover1977], based on Glover's earlier work on surrogate constraints. The approach remained idle until it was revisited by Glover and combined with Tabu Search [Glover1994a]. The modern canonical reference is a paper by Glover that provides an abstract template of the procedure which may be specialized for a given application domain [Glover1998a].
The primary reference for the approach is the book by Laguna and Martí that reviews the principles of the approach in detail and presents tutorials on applications of the approach on standard problems using the C programming language [Laguna2003]. There are many review articles and chapters on Scatter Search that may be used to supplement an understanding of the approach, such as a detailed review chapter by Glover [Glover1999], a review of the fundamentals of the approach and its relationship to an abstraction called 'Path Relinking' by Glover, Laguna, and Martí [Glover2000], and a modern overview of the technique by Martí, Laguna, and Glover [Martia2006].
[Glover1977] F. Glover, "Heuristics for integer programming using surrogate constraints", Decision Sciences, 1977.
[Glover1994a] F. Glover, "Tabu Search for Nonlinear and Parametric Optimization (with Links to Genetic Algorithms)", Discrete Applied Mathematics, 1994.
[Glover1998a] F. Glover, "A Template For Scatter Search And Path Relinking", in Artificial Evolution, Springer, 1998.
[Glover1999] F. Glover, "Scatter search and path relinking", in New Ideas in Optimization, pages 297–316, McGraw-Hill Ltd., 1999.
[Glover2000] F. Glover, M. Laguna, and R. Martí, "Fundamentals of Scatter Search and Path Relinking", Control and Cybernetics, 2000.
[Glover2003b] F. Glover, M. Laguna, and R. Martí, "Scatter Search", in Advances in Evolutionary Computation: Theory and Applications, pages 519–537, Springer-Verlag, 2003.
[Laguna2003] M. Laguna and R. Martí, "Scatter search: methodology and implementations in C", Kluwer Academic Publishers, 2003.
[Martia2006] R. Martí, M. Laguna, and F. Glover, "Principles of Scatter Search", European Journal of Operational Research, 2006.