Strength Pareto Evolutionary Algorithm, SPEA, SPEA2.
Strength Pareto Evolutionary Algorithm is a Multiple Objective Optimization (MOO) algorithm and an Evolutionary Algorithm from the field of Evolutionary Computation. It belongs to the field of Evolutionary Multiple Objective (EMO) algorithms. Refer to the book's introductory material on Multiple Objective Optimization for more information and references. Strength Pareto Evolutionary Algorithm is an extension of the Genetic Algorithm for multiple objective optimization problems. It is related to sibling Evolutionary Algorithms such as the Non-dominated Sorting Genetic Algorithm (NSGA), the Vector-Evaluated Genetic Algorithm (VEGA), and the Pareto Archived Evolution Strategy (PAES). There are two versions of SPEA: the original SPEA algorithm and the extension SPEA2. Additional extensions include SPEA+ and iSPEA.
The objective of the algorithm is to locate and maintain a front of non-dominated solutions, ideally a set of Pareto optimal solutions. This is achieved by using an evolutionary process (with surrogate procedures for genetic recombination and mutation) to explore the search space, and a selection process that uses a combination of the degree to which a candidate solution is dominated (strength) and an estimation of the density of the Pareto front as an assigned fitness. An archive of the non-dominated set is maintained separate from the population of candidate solutions used in the evolutionary process, providing a form of elitism.
Algorithm (below) provides a pseudocode listing of the Strength Pareto Evolutionary Algorithm 2 (SPEA2) for minimizing a cost function. The CalculateRawFitness function calculates the raw fitness as the sum of the strength values of the solutions that dominate a given candidate, where strength is the number of solutions that a given solution dominates.
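To make the strength and raw fitness calculations concrete, the following minimal, self-contained Ruby example scores four two-objective minimization points; the data, variable names, and dominance test are invented for the example and are not part of the listing later in this section.
# four candidate solutions in objective space (both objectives minimized)
points = [[1.0, 4.0], [2.0, 2.0], [3.0, 3.0], [4.0, 1.0]]
# dominance test for the example: no worse in every objective and not identical
dominates = lambda {|a, b| a.zip(b).all? {|ai, bi| ai <= bi} && a != b}
# strength: how many points each point dominates
strength = points.map {|a| points.count {|b| dominates.call(a, b)}}
# raw fitness: sum of the strengths of the points that dominate a given point
raw = points.map do |a|
  points.each_with_index.inject(0) do |sum, (b, j)|
    dominates.call(b, a) ? sum + strength[j] : sum
  end
end
puts strength.inspect # => [0, 1, 0, 0] (only [2.0, 2.0] dominates [3.0, 3.0])
puts raw.inspect      # => [0, 0, 1, 0] (non-dominated points keep a raw fitness of 0)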
The CandidateDensity function estimates the density of an area of the Pareto front as $\frac{1}{\sigma^k + 2}$, where $\sigma^k$ is the Euclidean distance of the objective values between a given solution and the $k$th nearest neighbor of the solution, and $k$ is the square root of the size of the population and archive combined.
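The density estimate can be demonstrated in a few lines as well; the points and names below are again invented for the example, but the calculation mirrors the $\frac{1}{\sigma^k + 2}$ estimate used by the listing later in this section (the sorted distance list includes the point itself at distance zero, as in the implementation).
# estimate the density around the point [2.0, 2.0]
points = [[1.0, 4.0], [2.0, 2.0], [3.0, 3.0], [4.0, 1.0]]
target = points[1]
distance = lambda do |a, b|
  Math.sqrt(a.zip(b).inject(0.0) {|s, (ai, bi)| s + (ai - bi)**2})
end
k = Math.sqrt(points.size).to_i                 # k = sqrt(population + archive size) = 2
sorted = points.map {|p| distance.call(target, p)}.sort
sigma_k = sorted[k]                             # distance to the kth nearest neighbor (~2.236)
density = 1.0 / (sigma_k + 2.0)                 # adding 2 keeps the density below 1 (~0.236)
puts density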
The PopulateWithRemainingBest function iteratively fills the archive with the remaining candidate solutions in order of fitness. The RemoveMostSimilar function truncates the archive population, removing those members with the smallest $\sigma^k$ values as calculated against the archive.
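The effect of RemoveMostSimilar can be sketched on a small over-full archive; the data below are invented for illustration, and the same truncation-by-crowding idea is implemented inside environmental_selection in the listing later in this section.
# an archive of five points in objective space that must be truncated to four
archive = [[0.1, 3.9], [0.5, 3.0], [0.6, 2.9], [2.0, 1.0], [3.8, 0.2]]
archive_size = 4
distance = lambda do |a, b|
  Math.sqrt(a.zip(b).inject(0.0) {|s, (ai, bi)| s + (ai - bi)**2})
end
while archive.size > archive_size
  k = Math.sqrt(archive.size).to_i
  # sigma_k for each member: distance to its kth nearest neighbor in the archive
  sigma = archive.map {|a| archive.map {|b| distance.call(a, b)}.sort[k]}
  # drop the most crowded member (the smallest sigma_k)
  archive.delete_at(sigma.index(sigma.min))
end
puts archive.inspect  # => [[0.1, 3.9], [0.6, 2.9], [2.0, 1.0], [3.8, 0.2]]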
The SelectParents function selects parents from a population using a Genetic Algorithm selection method such as binary tournament selection. The CrossoverAndMutation function performs the crossover and mutation genetic operators from the Genetic Algorithm.
Input: Population_size, Archive_size, ProblemSize, P_crossover, P_mutation
Output: Archive
Population ← InitializePopulation(Population_size, ProblemSize)
Archive ← ∅
While (¬StopCondition())
    For (S_i ∈ Population)
        S_i.objectives ← CalculateObjectives(S_i)
    End
    Union ← Population + Archive
    For (S_i ∈ Union)
        S_i.raw_fitness ← CalculateRawFitness(S_i, Union)
        S_i.density ← CalculateSolutionDensity(S_i, Union)
    End
    Archive ← GetNonDominated(Union)
    If (Size(Archive) < Archive_size)
        PopulateWithRemainingBest(Union, Archive, Archive_size)
    ElseIf (Size(Archive) > Archive_size)
        RemoveMostSimilar(Archive, Archive_size)
    End
    Selected ← SelectParents(Archive, Population_size)
    Population ← CrossoverAndMutation(Selected, P_crossover, P_mutation)
End
Return (GetNonDominated(Archive))
Listing (below) provides an example of the Strength Pareto Evolutionary Algorithm 2 (SPEA2) implemented in the Ruby Programming Language. The demonstration problem is an instance of continuous multiple objective function optimization called SCH (problem one in [Deb2002]). The problem seeks the minimum of two functions: $f1 = \sum_{i=1}^{n} x_i^2$ and $f2 = \sum_{i=1}^{n} (x_i-2)^2$, with $-10 \le x_i \le 10$ and $n=1$. The optimal solutions for this function are $x \in [0, 2]$. The algorithm is an implementation of SPEA2 based on the presentation by Zitzler, Laumanns, and Thiele [Zitzler2002]. The algorithm uses a binary string representation (16 bits per objective function parameter) that is decoded and rescaled to the function domain. The implementation uses a uniform crossover operator and point mutations with a fixed mutation rate of $\frac{1}{L}$, where $L$ is the number of bits in a solution's binary string.
def objective1(vector)
return vector.inject(0.0) {|sum, x| sum + (x**2.0)}
end
def objective2(vector)
return vector.inject(0.0) {|sum, x| sum + ((x-2.0)**2.0)}
end
# decode a candidate's bitstring into a real-valued vector, rescaled to each parameter's bounds
def decode(bitstring, search_space, bits_per_param)
vector = []
search_space.each_with_index do |bounds, i|
off, sum = i*bits_per_param, 0.0
param = bitstring[off...(off+bits_per_param)].reverse
param.size.times do |j|
sum += ((param[j].chr=='1') ? 1.0 : 0.0) * (2.0 ** j.to_f)
end
min, max = bounds
vector << min + ((max-min)/((2.0**bits_per_param.to_f)-1.0)) * sum
end
return vector
end
def point_mutation(bitstring, rate=1.0/bitstring.size)
child = ""
bitstring.size.times do |i|
bit = bitstring[i].chr
child << ((rand()<rate) ? ((bit=='1') ? "0" : "1") : bit)
end
return child
end
def binary_tournament(pop)
i, j = rand(pop.size), rand(pop.size)
j = rand(pop.size) while j==i
return (pop[i][:fitness] < pop[j][:fitness]) ? pop[i] : pop[j]
end
def crossover(parent1, parent2, rate)
return ""+parent1 if rand()>=rate
child = ""
parent1.size.times do |i|
child << ((rand()<0.5) ? parent1[i].chr : parent2[i].chr)
end
return child
end
def reproduce(selected, pop_size, p_cross)
children = []
selected.each_with_index do |p1, i|
p2 = (i.modulo(2)==0) ? selected[i+1] : selected[i-1]
p2 = selected[0] if i == selected.size-1
child = {}
child[:bitstring] = crossover(p1[:bitstring], p2[:bitstring], p_cross)
child[:bitstring] = point_mutation(child[:bitstring])
children << child
break if children.size >= pop_size
end
return children
end
def random_bitstring(num_bits)
return (0...num_bits).inject(""){|s,i| s<<((rand<0.5) ? "1" : "0")}
end
def calculate_objectives(pop, search_space, bits_per_param)
pop.each do |p|
p[:vector] = decode(p[:bitstring], search_space, bits_per_param)
p[:objectives] = []
p[:objectives] << objective1(p[:vector])
p[:objectives] << objective2(p[:vector])
end
end
# weak dominance test: p1 dominates p2 if it is no worse in every objective
def dominates?(p1, p2)
p1[:objectives].each_index do |i|
return false if p1[:objectives][i] > p2[:objectives][i]
end
return true
end
def weighted_sum(x)
return x[:objectives].inject(0.0) {|sum, x| sum+x}
end
def euclidean_distance(c1, c2)
sum = 0.0
c1.each_index {|i| sum += (c1[i]-c2[i])**2.0}
return Math.sqrt(sum)
end
# record for each member the set of solutions it dominates (its strength is the size of this set)
def calculate_dominated(pop)
pop.each do |p1|
p1[:dom_set] = pop.select {|p2| p1!=p2 and dominates?(p1, p2) }
end
end
# raw fitness: the sum of the strengths of all solutions that dominate p1
def calculate_raw_fitness(p1, pop)
return pop.inject(0.0) do |sum, p2|
(dominates?(p2, p1)) ? sum + p2[:dom_set].size.to_f : sum
end
end
# density: 1/(sigma_k + 2), where sigma_k is the distance to the kth nearest neighbor in objective space
def calculate_density(p1, pop)
pop.each do |p2|
p2[:dist] = euclidean_distance(p1[:objectives], p2[:objectives])
end
list = pop.sort{|x,y| x[:dist]<=>y[:dist]}
k = Math.sqrt(pop.size).to_i
return 1.0 / (list[k][:dist] + 2.0)
end
def calculate_fitness(pop, archive, search_space, bits_per_param)
calculate_objectives(pop, search_space, bits_per_param)
union = archive + pop
calculate_dominated(union)
union.each do |p|
p[:raw_fitness] = calculate_raw_fitness(p, union)
p[:density] = calculate_density(p, union)
p[:fitness] = p[:raw_fitness] + p[:density]
end
end
# build the next archive: keep non-dominated members, fill with the next best if too small,
# or repeatedly remove the most crowded member if too large
def environmental_selection(pop, archive, archive_size)
union = archive + pop
environment = union.select {|p| p[:fitness]<1.0}
if environment.size < archive_size
union.sort!{|x,y| x[:fitness]<=>y[:fitness]}
union.each do |p|
environment << p if p[:fitness] >= 1.0
break if environment.size >= archive_size
end
elsif environment.size > archive_size
begin
k = Math.sqrt(environment.size).to_i
environment.each do |p1|
environment.each do |p2|
p2[:dist] = euclidean_distance(p1[:objectives], p2[:objectives])
end
list = environment.sort{|x,y| x[:dist]<=>y[:dist]}
p1[:density] = list[k][:dist]
end
environment.sort!{|x,y| x[:density]<=>y[:density]}
environment.shift
end until environment.size <= archive_size
end
return environment
end
def search(search_space, max_gens, pop_size, archive_size, p_cross, bits_per_param=16)
pop = Array.new(pop_size) do |i|
{:bitstring=>random_bitstring(search_space.size*bits_per_param)}
end
gen, archive = 0, []
begin
calculate_fitness(pop, archive, search_space, bits_per_param)
archive = environmental_selection(pop, archive, archive_size)
best = archive.sort{|x,y| weighted_sum(x)<=>weighted_sum(y)}.first
puts ">gen=#{gen}, objs=#{best[:objectives].join(', ')}"
break if gen >= max_gens
selected = Array.new(pop_size){binary_tournament(archive)}
pop = reproduce(selected, pop_size, p_cross)
gen += 1
end while true
return archive
end
if __FILE__ == $0
# problem configuration
problem_size = 1
search_space = Array.new(problem_size) {|i| [-10, 10]}
# algorithm configuration
max_gens = 50
pop_size = 80
archive_size = 40
p_cross = 0.90
# execute the algorithm
pop = search(search_space, max_gens, pop_size, archive_size, p_cross)
puts "done!"
end
Zitzler and Thiele introduced the Strength Pareto Evolutionary Algorithm in a technical report on a multiple objective optimization algorithm with elitism and clustering along the Pareto front [Zitzler1998]. The technical report was later published [Zitzler1999]. The Strength Pareto Evolutionary Algorithm was developed as a part of Zitzler's PhD thesis [Zitzler1999a]. Zitzler, Laumanns, and Thiele later extended SPEA to address some inefficiencies of the approach; the resulting algorithm, called SPEA2, was released as a technical report [Zitzler2001] and later published [Zitzler2002]. SPEA2 provides fine-grained fitness assignment, density estimation of the Pareto front, and an archive truncation operator.
Zitzler, Laumanns, and Bleuler provide a tutorial on SPEA2 as a book chapter that considers the basics of multiple objective optimization, and the differences from SPEA and the other related Multiple Objective Evolutionary Algorithms [Zitzler2004].
[Deb2002] | K. Deb and A. Pratap and S. Agarwal and T. Meyarivan, "A Fast and Elitist Multiobjective Genetic Algorithm: NSGA–II", IEEE Transactions on Evolutionary Computation, 2002. |
[Zitzler1998] | E. Zitzler and L. Thiele, "An evolutionary algorithm for multiobjective optimization: The strength Pareto approach", Technical Report 43, Computer Engineering and Networks Laboratory (TIK), Swiss Federal Institute of Technology (ETH) Zurich, 1998. |
[Zitzler1999] | E. Zitzler and L. Thiele, "Multiobjective evolutionary algorithms: A comparative case study and the strength Pareto approach", IEEE Transactions on Evolutionary Computation, 1999. |
[Zitzler1999a] | E. Zitzler, "Evolutionary Algorithms for Multiobjective Optimization: Methods and Applications", [PhD Thesis] Shaker Verlag, Aachen, Germany, 1999. |
[Zitzler2001] | E. Zitzler and M. Laumanns and L. Thiele, "SPEA2: Improving the strength Pareto evolutionary algorithm", Technical Report 103, Computer Engineering and Networks Laboratory (TIK), Swiss Federal Institute of Technology (ETH) Zurich, 2001. |
[Zitzler2002] | E. Zitzler and M. Laumanns and L. Thiele, "SPEA2: Improving the strength Pareto evolutionary algorithm for multiobjective optimization", in Evolutionary Methods for Design, Optimisation and Control with Application to Industrial Problems (EUROGEN 2001), 2002. |
[Zitzler2004] | E. Zitzler and M. Laumanns and S. Bleuler, "A Tutorial on Evolutionary Multiobjective Optimization", in Metaheuristics for Multiobjective Optimisation, pages 3–37, Springer, 2004. |