Genetic Algorithm, GA, Simple Genetic Algorithm, SGA, Canonical Genetic Algorithm, CGA.
The Genetic Algorithm is an Adaptive Strategy and a Global Optimization technique. It is an Evolutionary Algorithm and belongs to the broader study of Evolutionary Computation. The Genetic Algorithm is a sibling of other Evolutionary Algorithms such as Genetic Programming, Evolution Strategies, Evolutionary Programming, and Learning Classifier Systems. The Genetic Algorithm is a parent of a large number of variant techniques and sub-fields too numerous to list.
The Genetic Algorithm is inspired by population genetics (including heredity and gene frequencies) and evolution at the population level, as well as the Mendelian understanding of genetic structure (such as chromosomes, genes, and alleles) and mechanisms (such as recombination and mutation). This is the so-called new or modern synthesis of evolutionary biology.
Individuals of a population contribute their genetic material (called the genotype) to the next generation in the form of offspring, in proportion to the suitability of their expressed genome (called their phenotype) to the environment. The next generation is created through a process of mating that involves recombination of the genomes of two individuals in the population, with the introduction of random copying errors (called mutation). This iterative process may result in an improved adaptive fit between the phenotypes of individuals in a population and the environment.
The objective of the Genetic Algorithm is to maximize the payoff of candidate solutions in the population against a cost function from the problem domain. The strategy is to repeatedly apply surrogates for the genetic mechanisms of recombination and mutation to the population of candidate solutions. The cost function (also known as the objective or fitness function), applied to a decoded representation of a candidate, governs the probabilistic contribution that candidate can make to the subsequent generation of candidate solutions.
Algorithm (below) provides a pseudocode listing of the Genetic Algorithm for minimizing a cost function.
Input: $Population_{size}$, $Problem_{size}$, $P_{crossover}$, $P_{mutation}$
Output: $S_{best}$
Population $\leftarrow$ InitializePopulation($Population_{size}$, $Problem_{size}$)
EvaluatePopulation(Population)
$S_{best}$ $\leftarrow$ GetBestSolution(Population)
While ($\neg$StopCondition())
    Parents $\leftarrow$ SelectParents(Population, $Population_{size}$)
    Children $\leftarrow \emptyset$
    For ($Parent_{1}$, $Parent_{2}$ $\in$ Parents)
        $Child_{1}$, $Child_{2}$ $\leftarrow$ Crossover($Parent_{1}$, $Parent_{2}$, $P_{crossover}$)
        Children $\leftarrow$ Mutate($Child_{1}$, $P_{mutation}$)
        Children $\leftarrow$ Mutate($Child_{2}$, $P_{mutation}$)
    End
    EvaluatePopulation(Children)
    $S_{best}$ $\leftarrow$ GetBestSolution(Children)
    Population $\leftarrow$ Replace(Population, Children)
End
Return ($S_{best}$)
Listing (below) provides an example of the Genetic Algorithm implemented in the Ruby Programming Language. The demonstration problem is a maximizing binary optimization problem called OneMax that seeks a binary string of unity (all '1' bits). The objective function provides only an indication of the number of correct bits in a candidate string, not the positions of the correct bits.
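To make the objective concrete, the following is a minimal illustrative sketch (not part of the listing below) of what the OneMax score reports: the count of '1' bits, with no information about where the correct bits sit.

# Minimal sketch of the OneMax objective: the score is the count of '1' bits,
# carrying no information about which positions are correct.
puts "110100".count('1')   # => 3
puts "111111".count('1')   # => 6, the optimum for a 6-bit problem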
The Genetic Algorithm is implemented with a conservative configuration including binary tournament selection for the selection operator, one-point crossover for the recombination operator, and point mutations for the mutation operator.
# Objective function: count the number of '1' bits in the candidate bitstring.
def onemax(bitstring)
  sum = 0
  bitstring.size.times {|i| sum += 1 if bitstring[i].chr=='1'}
  return sum
end

# Create a random candidate solution as a string of '0' and '1' characters.
def random_bitstring(num_bits)
  return (0...num_bits).inject(""){|s,i| s<<((rand<0.5) ? "1" : "0")}
end

# Binary tournament selection: pick two distinct members at random, return the fitter.
def binary_tournament(pop)
  i, j = rand(pop.size), rand(pop.size)
  j = rand(pop.size) while j==i
  return (pop[i][:fitness] > pop[j][:fitness]) ? pop[i] : pop[j]
end

# Point mutation: flip each bit independently with the given probability.
def point_mutation(bitstring, rate=1.0/bitstring.size)
  child = ""
  bitstring.size.times do |i|
    bit = bitstring[i].chr
    child << ((rand()<rate) ? ((bit=='1') ? "0" : "1") : bit)
  end
  return child
end

# One-point crossover: with probability 'rate', splice the two parents at a random point.
def crossover(parent1, parent2, rate)
  return ""+parent1 if rand()>=rate
  point = 1 + rand(parent1.size-2)
  return parent1[0...point]+parent2[point...(parent1.size)]
end

# Pair up the selected parents and produce the next generation of children.
def reproduce(selected, pop_size, p_cross, p_mutation)
  children = []
  selected.each_with_index do |p1, i|
    p2 = (i.modulo(2)==0) ? selected[i+1] : selected[i-1]
    p2 = selected[0] if i == selected.size-1
    child = {}
    child[:bitstring] = crossover(p1[:bitstring], p2[:bitstring], p_cross)
    child[:bitstring] = point_mutation(child[:bitstring], p_mutation)
    children << child
    break if children.size >= pop_size
  end
  return children
end

# Main loop: evaluate, select, reproduce, and replace until the optimum or max_gens is reached.
def search(max_gens, num_bits, pop_size, p_crossover, p_mutation)
  population = Array.new(pop_size) do |i|
    {:bitstring=>random_bitstring(num_bits)}
  end
  population.each{|c| c[:fitness] = onemax(c[:bitstring])}
  best = population.sort{|x,y| y[:fitness] <=> x[:fitness]}.first
  max_gens.times do |gen|
    selected = Array.new(pop_size){|i| binary_tournament(population)}
    children = reproduce(selected, pop_size, p_crossover, p_mutation)
    children.each{|c| c[:fitness] = onemax(c[:bitstring])}
    children.sort!{|x,y| y[:fitness] <=> x[:fitness]}
    best = children.first if children.first[:fitness] >= best[:fitness]
    population = children
    puts " > gen #{gen}, best: #{best[:fitness]}, #{best[:bitstring]}"
    break if best[:fitness] == num_bits
  end
  return best
end

if __FILE__ == $0
  # problem configuration
  num_bits = 64
  # algorithm configuration
  max_gens = 100
  pop_size = 100
  p_crossover = 0.98
  p_mutation = 1.0/num_bits
  # execute the algorithm
  best = search(max_gens, num_bits, pop_size, p_crossover, p_mutation)
  puts "done! Solution: f=#{best[:fitness]}, s=#{best[:bitstring]}"
end
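The operators plug into the search loop independently of one another, so a different selection scheme can be dropped in without touching the rest of the listing. The following is a minimal sketch, not from the book, of fitness-proportionate (roulette wheel) selection that could replace binary_tournament when building the selected array; it assumes the same population structure used above (hashes with a non-negative :fitness value).

# Illustrative alternative selection operator (not the one used in the listing above):
# fitness-proportionate (roulette wheel) selection over the same population hashes.
def roulette_wheel_selection(pop)
  total = pop.inject(0.0) {|sum, c| sum + c[:fitness]}
  return pop[rand(pop.size)] if total == 0.0   # degenerate case: all fitness values are zero
  spin = rand() * total
  running = 0.0
  pop.each do |candidate|
    running += candidate[:fitness]
    return candidate if running >= spin
  end
  return pop.last   # guard against floating-point round-off
end

# Usage inside search: selected = Array.new(pop_size){ roulette_wheel_selection(population) }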
Holland is the grandfather of the field that became Genetic Algorithms. Holland investigated adaptive systems in the late 1960s, proposing an adaptive system formalism and adaptive strategies referred to as 'adaptive plans' [Holland1962] [Holland1962a] [Holland1969]. Holland's theoretical framework was investigated and elaborated by his Ph.D. students at the University of Michigan. Rosenberg investigated a chemical and molecular model of a biologically inspired adaptive plan [Rosenberg1967]. Bagley investigated meta-environments and a genetic adaptive plan referred to as a genetic algorithm applied to a simple game called hexapawn [Bagley1967]. Cavicchio further elaborated the genetic adaptive plan by proposing numerous variations, referring to some as 'reproductive plans' [Cavicchio1970].
Other important contributions were made by Frantz, who investigated what were referred to as genetic algorithms for search [Frantz1972], and Hollstien, who investigated genetic plans for adaptive control and function optimization [Hollstien1971]. De Jong performed a seminal investigation of the genetic adaptive model (genetic plans) applied to continuous function optimization, and the suite of test problems he adopted is still in common use [Jong1975]. Holland wrote the seminal book on his research, focusing on the proposed adaptive systems formalism and the reproductive and genetic adaptive plans, and provided a theoretical framework for the mechanisms used and an explanation for the capabilities of what would become genetic algorithms [Holland1975].
The field of genetic algorithms is very large, resulting in large numbers of variations on the canonical technique. Goldberg provides a classical overview of the field in a review article [Goldberg1994], as does Mitchell [Mitchell1995]. Whitley describes a classical tutorial for the Genetic Algorithm covering both practical and theoretical concerns [Whitley1994].
The algorithm is highly modular and a sub-field exists to study each sub-process, specifically: selection, recombination, mutation, and representation. The Genetic Algorithm is most commonly used as an optimization technique, although it should also be considered a general adaptive strategy [Jong1992]. The schema theorem is a classical explanation for the power of the Genetic Algorithm proposed by Holland [Holland1975], and investigated by Goldberg under the name of the building block hypothesis [Goldberg1989]; a commonly cited statement of the theorem is sketched below.
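As a sketch of the result (using the notation of standard treatments rather than any one source), the schema theorem bounds the expected number of instances $m(H, t+1)$ of a schema $H$ in the next generation:

$E[m(H, t+1)] \geq m(H, t) \cdot \frac{f(H)}{\bar{f}} \left[ 1 - p_{c} \frac{\delta(H)}{l - 1} - o(H)\, p_{m} \right]$

where $f(H)$ is the mean fitness of the schema's instances, $\bar{f}$ the mean population fitness, $\delta(H)$ the schema's defining length, $o(H)$ its order, $l$ the string length, and $p_{c}$, $p_{m}$ the crossover and mutation probabilities. Short, low-order schemata of above-average fitness (building blocks) therefore receive exponentially increasing trials over generations.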
The classical book on genetic algorithms as an optimization and machine learning technique was written by Goldberg and provides an in-depth review and practical study of the approach [Goldberg1989]. Mitchell provides a contemporary reference text introducing the technique and the field [Mitchell1998]. Finally, Goldberg provides a modern study of the field, the lessons learned, and reviews the broader toolset of optimization algorithms that the field has produced [Goldberg2002].