Nondominated Sorting Genetic Algorithm

Nondominated Sorting Genetic Algorithm, Fast Elitist Nondominated Sorting Genetic Algorithm, NSGA, NSGA-II.

Taxonomy

The Nondominated Sorting Genetic Algorithm is a Multiple Objective Optimization (MOO) algorithm and is an instance of an Evolutionary Algorithm from the field of Evolutionary Computation. NSGA is an extension of the Genetic Algorithm for multiple objective function optimization. It is related to other Evolutionary Multiple Objective Optimization (EMOO) algorithms (also called Multiple Objective Evolutionary Algorithms, MOEA) such as the Vector-Evaluated Genetic Algorithm (VEGA), the Strength Pareto Evolutionary Algorithm (SPEA), and the Pareto Archived Evolution Strategy (PAES). There are two versions of the algorithm: the classical NSGA and the updated, currently canonical form, NSGA-II.

Strategy

The objective of the NSGA algorithm is to improve the adaptive fit of a population of candidate solutions to a Pareto front constrained by a set of objective functions. The algorithm uses an evolutionary process with surrogates for evolutionary operators including selection, genetic crossover, and genetic mutation. The population is sorted into a hierarchy of subpopulations based on the ordering of Pareto dominance. Similarity between members of each subgroup is evaluated on the Pareto front, and the resulting groups and similarity measures are used to promote a diverse front of nondominated solutions.

Procedure
Algorithm (below) provides a pseudocode listing of the Nondominated Sorting Genetic Algorithm II (NSGA-II) for minimizing a cost function.

Input: $Population_{size}$, ProblemSize, $P_{crossover}$, $P_{mutation}$
Output: Children

Population $\leftarrow$ InitializePopulation($Population_{size}$, ProblemSize)
EvaluateAgainstObjectiveFunctions(Population)
FastNondominatedSort(Population)
Selected $\leftarrow$ SelectParentsByRank(Population, $Population_{size}$)
Children $\leftarrow$ CrossoverAndMutation(Selected, $P_{crossover}$, $P_{mutation}$)
While ($\neg$StopCondition())
    EvaluateAgainstObjectiveFunctions(Children)
    Union $\leftarrow$ Merge(Population, Children)
    Fronts $\leftarrow$ FastNondominatedSort(Union)
    Parents $\leftarrow \emptyset$
    $Front_L$ $\leftarrow \emptyset$
    For ($Front_i$ $\in$ Fronts)
        CrowdingDistanceAssignment($Front_i$)
        If (Size(Parents) + Size($Front_i$) > $Population_{size}$)
            $Front_L$ $\leftarrow$ $i$
            Break()
        Else
            Parents $\leftarrow$ Merge(Parents, $Front_i$)
        End
    End
    If (Size(Parents) < $Population_{size}$)
        $Front_L$ $\leftarrow$ SortByRankAndDistance($Front_L$)
        For ($P_1$ To $P_{Population_{size} - Size(Front_L)}$)
            Parents $\leftarrow$ Merge(Parents, $P_i$)
        End
    End
    Selected $\leftarrow$ SelectParentsByRankAndDistance(Parents, $Population_{size}$)
    Population $\leftarrow$ Children
    Children $\leftarrow$ CrossoverAndMutation(Selected, $P_{crossover}$, $P_{mutation}$)
End
Return (Children)

Heuristics
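The nondominated sorting in the procedure above is built on Pareto dominance. As a standalone illustration (not part of the chapter's listing), strict Pareto dominance for minimization can be sketched as follows; note that the chapter's own `dominates` function uses a weaker variant that only checks that no objective is worse:

```ruby
# Strict Pareto dominance for minimization: 'a' dominates 'b' when it is
# no worse in every objective and strictly better in at least one.
# The inputs here are plain arrays of objective values (an assumption for
# illustration; the listing below stores them under p[:objectives]).
def pareto_dominates?(a, b)
  no_worse = a.each_index.all? {|i| a[i] <= b[i]}
  strictly_better = a.each_index.any? {|i| a[i] < b[i]}
  no_worse && strictly_better
end

puts pareto_dominates?([1.0, 2.0], [2.0, 3.0]) # => true
puts pareto_dominates?([1.0, 3.0], [2.0, 2.0]) # => false (a trade-off: neither dominates)
```

Solutions that are mutually nondominated in this sense form a front, and the sorting procedure peels off successive fronts by rank.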
Code Listing

Listing (below) provides an example of the Nondominated Sorting Genetic Algorithm II (NSGA-II) implemented in the Ruby Programming Language. The demonstration problem is an instance of continuous multiple objective function optimization called SCH (problem one in [Deb2002]). The problem seeks the minimum of two functions: $f1=\sum_{i=1}^n x_{i}^2$ and $f2=\sum_{i=1}^n (x_{i}-2)^2$, with $-10\leq x_i \leq 10$ and $n=1$. The optimal solutions for this function are $x \in [0,2]$. The algorithm is an implementation of NSGA-II based on the presentation by Deb et al. [Deb2002]. The algorithm uses a binary string representation (16 bits per objective function parameter) that is decoded and rescaled to the function domain. The implementation uses a uniform crossover operator and point mutations with a fixed mutation rate of $\frac{1}{L}$, where $L$ is the number of bits in a solution's binary string.

def objective1(vector)
  return vector.inject(0.0) {|sum, x| sum + (x**2.0)}
end

def objective2(vector)
  return vector.inject(0.0) {|sum, x| sum + ((x-2.0)**2.0)}
end

def decode(bitstring, search_space, bits_per_param)
  vector = []
  search_space.each_with_index do |bounds, i|
    off, sum = i*bits_per_param, 0.0
    param = bitstring[off...(off+bits_per_param)].reverse
    param.size.times do |j|
      sum += ((param[j].chr=='1') ? 1.0 : 0.0) * (2.0 ** j.to_f)
    end
    min, max = bounds
    vector << min + ((max-min)/((2.0**bits_per_param.to_f)-1.0)) * sum
  end
  return vector
end

def random_bitstring(num_bits)
  return (0...num_bits).inject(""){|s,i| s<<((rand<0.5) ? "1" : "0")}
end

def point_mutation(bitstring, rate=1.0/bitstring.size)
  child = ""
  bitstring.size.times do |i|
    bit = bitstring[i].chr
    child << ((rand()<rate) ? ((bit=='1') ? "0" : "1") : bit)
  end
  return child
end

def crossover(parent1, parent2, rate)
  return ""+parent1 if rand()>=rate
  child = ""
  parent1.size.times do |i|
    child << ((rand()<0.5) ? parent1[i].chr : parent2[i].chr)
  end
  return child
end

def reproduce(selected, pop_size, p_cross)
  children = []
  selected.each_with_index do |p1, i|
    p2 = (i.modulo(2)==0) ? selected[i+1] : selected[i-1]
    p2 = selected[0] if i == selected.size-1
    child = {}
    child[:bitstring] = crossover(p1[:bitstring], p2[:bitstring], p_cross)
    child[:bitstring] = point_mutation(child[:bitstring])
    children << child
    break if children.size >= pop_size
  end
  return children
end

def calculate_objectives(pop, search_space, bits_per_param)
  pop.each do |p|
    p[:vector] = decode(p[:bitstring], search_space, bits_per_param)
    p[:objectives] = [objective1(p[:vector]), objective2(p[:vector])]
  end
end

def dominates(p1, p2)
  p1[:objectives].each_index do |i|
    return false if p1[:objectives][i] > p2[:objectives][i]
  end
  return true
end

def fast_nondominated_sort(pop)
  fronts = Array.new(1){[]}
  pop.each do |p1|
    p1[:dom_count], p1[:dom_set] = 0, []
    pop.each do |p2|
      if dominates(p1, p2)
        p1[:dom_set] << p2
      elsif dominates(p2, p1)
        p1[:dom_count] += 1
      end
    end
    if p1[:dom_count] == 0
      p1[:rank] = 0
      fronts.first << p1
    end
  end
  curr = 0
  begin
    next_front = []
    fronts[curr].each do |p1|
      p1[:dom_set].each do |p2|
        p2[:dom_count] -= 1
        if p2[:dom_count] == 0
          p2[:rank] = (curr+1)
          next_front << p2
        end
      end
    end
    curr += 1
    fronts << next_front if !next_front.empty?
  end while curr < fronts.size
  return fronts
end

def calculate_crowding_distance(pop)
  pop.each {|p| p[:dist] = 0.0}
  num_obs = pop.first[:objectives].size
  num_obs.times do |i|
    # sort by the current objective so that neighboring solutions are adjacent
    pop.sort! {|x,y| x[:objectives][i]<=>y[:objectives][i]}
    rge = pop.last[:objectives][i] - pop.first[:objectives][i]
    pop.first[:dist], pop.last[:dist] = 1.0/0.0, 1.0/0.0
    next if rge == 0.0
    (1...(pop.size-1)).each do |j|
      pop[j][:dist] += (pop[j+1][:objectives][i]-pop[j-1][:objectives][i])/rge
    end
  end
end

def crowded_comparison_operator(x,y)
  return y[:dist]<=>x[:dist] if x[:rank] == y[:rank]
  return x[:rank]<=>y[:rank]
end

def better(x,y)
  if !x[:dist].nil? and x[:rank] == y[:rank]
    return (x[:dist]>y[:dist]) ? x : y
  end
  return (x[:rank]<y[:rank]) ? x : y
end

def select_parents(fronts, pop_size)
  fronts.each {|f| calculate_crowding_distance(f)}
  offspring, last_front = [], 0
  fronts.each do |front|
    break if (offspring.size+front.size) > pop_size
    front.each {|p| offspring << p}
    last_front += 1
  end
  if (remaining = pop_size-offspring.size) > 0
    fronts[last_front].sort! {|x,y| crowded_comparison_operator(x,y)}
    offspring += fronts[last_front][0...remaining]
  end
  return offspring
end

def weighted_sum(x)
  return x[:objectives].inject(0.0) {|sum, v| sum+v}
end

def search(search_space, max_gens, pop_size, p_cross, bits_per_param=16)
  pop = Array.new(pop_size) do |i|
    {:bitstring=>random_bitstring(search_space.size*bits_per_param)}
  end
  calculate_objectives(pop, search_space, bits_per_param)
  fast_nondominated_sort(pop)
  selected = Array.new(pop_size) do
    better(pop[rand(pop_size)], pop[rand(pop_size)])
  end
  children = reproduce(selected, pop_size, p_cross)
  calculate_objectives(children, search_space, bits_per_param)
  max_gens.times do |gen|
    union = pop + children
    fronts = fast_nondominated_sort(union)
    parents = select_parents(fronts, pop_size)
    selected = Array.new(pop_size) do
      better(parents[rand(pop_size)], parents[rand(pop_size)])
    end
    pop = children
    children = reproduce(selected, pop_size, p_cross)
    calculate_objectives(children, search_space, bits_per_param)
    best = parents.sort!{|x,y| weighted_sum(x)<=>weighted_sum(y)}.first
    best_s = "[x=#{best[:vector]}, objs=#{best[:objectives].join(', ')}]"
    puts " > gen=#{gen+1}, fronts=#{fronts.size}, best=#{best_s}"
  end
  union = pop + children
  fronts = fast_nondominated_sort(union)
  parents = select_parents(fronts, pop_size)
  return parents
end

if __FILE__ == $0
  # problem configuration
  problem_size = 1
  search_space = Array.new(problem_size) {|i| [-10, 10]}
  # algorithm configuration
  max_gens = 50
  pop_size = 100
  p_cross = 0.98
  # execute the algorithm
  pop = search(search_space, max_gens, pop_size, p_cross)
  puts "done!"
end

Download: nsgaii.rb.
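To see the crowding mechanism in isolation, the following hypothetical standalone sketch (separate from the listing above, and operating on bare objective vectors rather than the listing's hashes) computes the per-solution crowding distance for a small front:

```ruby
# Crowding distance for a front of objective vectors (minimization or
# maximization does not matter; only spacing along each objective does).
# Boundary solutions on each objective receive infinite distance; interior
# solutions accumulate the normalized gap between their two neighbors.
def crowding_distances(front)
  num_objectives = front.first.size
  dist = Array.new(front.size, 0.0)
  num_objectives.times do |m|
    # indices of the front sorted by the current objective
    order = (0...front.size).sort_by {|i| front[i][m]}
    dist[order.first] = dist[order.last] = Float::INFINITY
    rge = front[order.last][m] - front[order.first][m]
    next if rge == 0.0
    (1...(order.size-1)).each do |j|
      i = order[j]
      dist[i] += (front[order[j+1]][m] - front[order[j-1]][m]) / rge
    end
  end
  dist
end

p crowding_distances([[1.0, 5.0], [2.0, 3.0], [4.0, 1.0]])
# boundary points receive Infinity; the interior point's distance is 2.0
```

Larger distances indicate less crowded regions of the front, so preferring them (as the crowded comparison operator does) spreads the population along the Pareto front.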
References

Primary Sources

Srinivas and Deb proposed the NSGA, inspired by Goldberg's notion of a nondominated sorting procedure [Srinivas1994]. Goldberg proposed the nondominated sorting procedure in his book while considering the biases in the Pareto optimal solutions provided by VEGA [Goldberg1989]. Srinivas and Deb's NSGA used the sorting procedure as a ranking selection method, and a fitness sharing niching method to maintain stable subpopulations across the Pareto front. Deb et al. later extended NSGA to address three criticisms of the approach: the $O(mN^3)$ time complexity, the lack of elitism, and the need for a sharing parameter in the fitness sharing niching method [Deb2000] [Deb2002].

Learn More

Deb provides in-depth coverage of Evolutionary Multiple Objective Optimization algorithms in his book, including a detailed description of the NSGA in Chapter 5 [Deb2001].

Bibliography

[Deb2000] K. Deb, S. Agrawal, A. Pratap, and T. Meyarivan, "A Fast Elitist Non-dominated Sorting Genetic Algorithm for Multi-objective Optimization: NSGA-II", in Parallel Problem Solving from Nature (PPSN VI), 2000.
[Deb2001] K. Deb, "Multi-Objective Optimization Using Evolutionary Algorithms", John Wiley & Sons, 2001.
[Deb2002] K. Deb, A. Pratap, S. Agarwal, and T. Meyarivan, "A Fast and Elitist Multiobjective Genetic Algorithm: NSGA-II", IEEE Transactions on Evolutionary Computation, 6(2), 2002.
[Goldberg1989] D. E. Goldberg, "Genetic Algorithms in Search, Optimization, and Machine Learning", Addison-Wesley, 1989.
[Srinivas1994] N. Srinivas and K. Deb, "Multiobjective Optimization Using Nondominated Sorting in Genetic Algorithms", Evolutionary Computation, 2(3), 1994.

