Clonal Selection Algorithm, CSA, CLONALG.
The Clonal Selection Algorithm (CLONALG) belongs to the field of Artificial Immune Systems. It is related to other Clonal Selection Algorithms such as the Artificial Immune Recognition System, the B-Cell Algorithm (BCA), and the Multi-objective Immune System Algorithm (MISA). There are numerous extensions to CLONALG including tweaks such as the CLONALG1 and CLONALG2 approaches, a version for classification called CLONCLAS, and an adaptive version called Adaptive Clonal Selection (ACS).
The Clonal Selection algorithm is inspired by the Clonal Selection theory of acquired immunity. The clonal selection theory, credited to Burnet, was proposed to account for the behavior and capabilities of antibodies in the acquired immune system [Burnet1957] [Burnet1959]. Itself inspired by the principles of the Darwinian theory of evolution by natural selection, the theory proposes that antigens select for lymphocytes (both B- and T-cells). When a lymphocyte is selected and binds to an antigenic determinant, the cell proliferates, making many thousands of copies of itself, and differentiates into different cell types (plasma and memory cells). Plasma cells have a short lifespan and produce vast quantities of antibody molecules, whereas memory cells live for an extended period in the host, anticipating future recognition of the same determinant. The important feature of the theory is that when a cell is selected and proliferates, it is subjected to small copying errors (changes to the genome called somatic hypermutation) that change the shape of the expressed receptors, and consequently the determinant-recognition capabilities of both the antibodies bound to the lymphocyte's cell surface and the antibodies that plasma cells produce.
The theory suggests that, starting with an initial repertoire of general immune cells, the system is able to change itself (the composition and density of cells and their receptors) in response to experience with the environment. Through a blind process of selection and accumulated variation on the large scale of many billions of cells, the acquired immune system is capable of acquiring the information necessary to protect the host organism from the specific pathogenic dangers of the environment. It also suggests that the system must anticipate (guess at) the pathogens to which it will be exposed, and that it requires exposure to pathogens that may harm the host before it can acquire the information necessary to provide a defense.
The information processing principles of the clonal selection theory describe a general learning strategy. This strategy involves a population of adaptive information units (each representing a problem-solution or component) subjected to a competitive process of selection, which, together with the resultant duplication and variation, ultimately improves the adaptive fit of the information units to their environment.
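This select-clone-vary-replace cycle can be sketched in a few lines of Ruby over a population of real-valued cells. The sketch is illustrative only: the fitness function, parameter names, and update rule here are assumptions for demonstration, not part of any specific immune algorithm.

```ruby
# Hypothetical illustration of the clonal selection learning strategy:
# select high-affinity cells, duplicate them, apply small variations,
# and let the variants compete for membership in the next generation.
def fitness(cell)
  -(cell ** 2) # higher is better; the peak is at 0.0
end

def clonal_step(population, select_n, clones_per_cell, mutation_scale)
  # selection: keep the highest-affinity cells
  selected = population.sort_by { |c| -fitness(c) }.first(select_n)
  # duplication and variation: clone each selected cell with small changes
  clones = selected.flat_map do |cell|
    Array.new(clones_per_cell) { cell + (rand - 0.5) * mutation_scale }
  end
  # competition: clones and parents compete for a place in the population
  (population + clones).sort_by { |c| -fitness(c) }.first(population.size)
end

population = Array.new(20) { (rand * 10.0) - 5.0 }
50.times { population = clonal_step(population, 5, 4, 0.5) }
```

After a few dozen iterations the population concentrates around the fittest region of the space, purely through blind selection, duplication, and variation.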
Algorithm (below) provides a pseudocode listing of the Clonal Selection Algorithm (CLONALG) for minimizing a cost function. The general CLONALG model involves the selection of antibodies (candidate solutions) based on affinity, either by matching against an antigen pattern or via evaluation of a pattern by a cost function. Selected antibodies are subjected to cloning proportional to their affinity, and the clones to hypermutation inversely proportional to their affinity. The resultant clonal set competes with the existing antibody population for membership in the next generation. In addition, low-affinity population members are replaced by randomly generated antibodies. The pattern recognition variation of the algorithm includes the maintenance of a memory solution set which in its entirety represents a solution to the problem. A binary-encoding scheme is employed for the binary-pattern recognition and continuous function optimization examples, and an integer permutation scheme is employed for the Traveling Salesman Problem (TSP).
Input: $Population_{size}$, $Selection_{size}$, $Problem_{size}$, $RandomCells_{num}$, $Clone_{rate}$, $Mutation_{rate}$
Output: Population
Population $\leftarrow$ CreateRandomCells($Population_{size}$, $Problem_{size}$)
While ($\neg$StopCondition())
    For ($p_i \in$ Population)
        Affinity($p_i$)
    End
    $Population_{select}$ $\leftarrow$ Select(Population, $Selection_{size}$)
    For ($p_i \in Population_{select}$)
        $Population_{clones}$ $\leftarrow$ Clone($p_i$, $Clone_{rate}$)
    End
    For ($p_i \in Population_{clones}$)
        Hypermutate($p_i$, $Mutation_{rate}$)
        Affinity($p_i$)
    End
    Population $\leftarrow$ Select(Population, $Population_{clones}$, $Population_{size}$)
    $Population_{rand}$ $\leftarrow$ CreateRandomCells($RandomCells_{num}$)
    Replace(Population, $Population_{rand}$)
End
Return (Population)

Listing (below) provides an example of the Clonal Selection Algorithm (CLONALG) implemented in the Ruby Programming Language. The demonstration problem is an instance of a continuous function optimization that seeks $\min f(x)$ where $f=\sum_{i=1}^n x_{i}^2$, $-5.0\leq x_i \leq 5.0$ and $n=2$. The optimal solution for this basin function is $(v_0,\ldots,v_{n-1})=0.0$. The algorithm is implemented as described by de Castro and Von Zuben for function optimization [Castro2002a].
def objective_function(vector)
  return vector.inject(0.0) {|sum, x| sum + (x**2.0)}
end

# decode a bitstring into a real-valued vector within the search space bounds
def decode(bitstring, search_space, bits_per_param)
  vector = []
  search_space.each_with_index do |bounds, i|
    off, sum = i*bits_per_param, 0.0
    param = bitstring[off...(off+bits_per_param)].reverse
    param.size.times do |j|
      sum += ((param[j].chr=='1') ? 1.0 : 0.0) * (2.0 ** j.to_f)
    end
    min, max = bounds
    vector << min + ((max-min)/((2.0**bits_per_param.to_f)-1.0)) * sum
  end
  return vector
end

def evaluate(pop, search_space, bits_per_param)
  pop.each do |p|
    p[:vector] = decode(p[:bitstring], search_space, bits_per_param)
    p[:cost] = objective_function(p[:vector])
  end
end

def random_bitstring(num_bits)
  return (0...num_bits).inject(""){|s,i| s<<((rand<0.5) ? "1" : "0")}
end

def point_mutation(bitstring, rate)
  child = ""
  bitstring.size.times do |i|
    bit = bitstring[i].chr
    child << ((rand()<rate) ? ((bit=='1') ? "0" : "1") : bit)
  end
  return child
end

# mutation rate decreases exponentially with affinity
def calculate_mutation_rate(antibody, mutate_factor=-2.5)
  return Math.exp(mutate_factor * antibody[:affinity])
end

def num_clones(pop_size, clone_factor)
  return (pop_size * clone_factor).floor
end

# normalize costs to affinities in [0,1], where lower cost means higher affinity
def calculate_affinity(pop)
  pop.sort!{|x,y| x[:cost]<=>y[:cost]}
  range = pop.last[:cost] - pop.first[:cost]
  if range == 0.0
    pop.each {|p| p[:affinity] = 1.0}
  else
    pop.each {|p| p[:affinity] = 1.0-(p[:cost]/range)}
  end
end

def clone_and_hypermutate(pop, clone_factor)
  clones = []
  n_clones = num_clones(pop.size, clone_factor)
  calculate_affinity(pop)
  pop.each do |antibody|
    m_rate = calculate_mutation_rate(antibody)
    n_clones.times do
      clone = {}
      clone[:bitstring] = point_mutation(antibody[:bitstring], m_rate)
      clones << clone
    end
  end
  return clones
end

# replace the worst members of the population with random antibodies
def random_insertion(search_space, pop, num_rand, bits_per_param)
  return pop if num_rand == 0
  rands = Array.new(num_rand) do
    {:bitstring=>random_bitstring(search_space.size*bits_per_param)}
  end
  evaluate(rands, search_space, bits_per_param)
  return (pop+rands).sort{|x,y| x[:cost]<=>y[:cost]}.first(pop.size)
end

def search(search_space, max_gens, pop_size, clone_factor, num_rand, bits_per_param=16)
  pop = Array.new(pop_size) do
    {:bitstring=>random_bitstring(search_space.size*bits_per_param)}
  end
  evaluate(pop, search_space, bits_per_param)
  best = pop.min{|x,y| x[:cost]<=>y[:cost]}
  max_gens.times do |gen|
    clones = clone_and_hypermutate(pop, clone_factor)
    evaluate(clones, search_space, bits_per_param)
    pop = (pop+clones).sort{|x,y| x[:cost]<=>y[:cost]}.first(pop_size)
    pop = random_insertion(search_space, pop, num_rand, bits_per_param)
    best = (pop + [best]).min{|x,y| x[:cost]<=>y[:cost]}
    puts " > gen #{gen+1}, f=#{best[:cost]}, s=#{best[:vector].inspect}"
  end
  return best
end

if __FILE__ == $0
  # problem configuration
  problem_size = 2
  search_space = Array.new(problem_size) { [-5, +5] }
  # algorithm configuration
  max_gens = 100
  pop_size = 100
  clone_factor = 0.1
  num_rand = 2
  # execute the algorithm
  best = search(search_space, max_gens, pop_size, clone_factor, num_rand)
  puts "done! Solution: f=#{best[:cost]}, s=#{best[:vector].inspect}"
end
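The inverse relationship between affinity and per-bit mutation rate used by calculate_mutation_rate can be examined in isolation. This standalone snippet re-creates the same schedule, $e^{-2.5 \cdot affinity}$; the function name mutation_rate is chosen here for illustration:

```ruby
# Standalone look at the affinity -> mutation-rate schedule used above:
# rate = e^(mutate_factor * affinity), with mutate_factor = -2.5.
def mutation_rate(affinity, mutate_factor=-2.5)
  Math.exp(mutate_factor * affinity)
end

[0.0, 0.5, 1.0].each do |a|
  puts "affinity=#{a} -> per-bit mutation rate=#{mutation_rate(a).round(3)}"
end
```

The best antibodies (affinity 1.0) flip each bit with probability of about 0.082, preserving good solutions under mild variation, while the worst (affinity 0.0) have every bit flipped with probability 1.0, effectively restarting them.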
Hidden at the back of a technical report on the applications of Artificial Immune Systems, de Castro and Von Zuben [Castro1999] proposed the Clonal Selection Algorithm (CSA) as a computational realization of the clonal selection principle for pattern matching and optimization. The algorithm was later published [Castro2000], and further investigated, at which point it was renamed CLONALG (CLONal selection ALGorithm) [Castro2002a].
Watkins et al. exploited the inherently distributed nature of CLONALG, proposing a parallel version of the pattern recognition form of the algorithm [Watkins2003]. White and Garrett also investigated the pattern recognition version of CLONALG and generalized the approach for the task of binary pattern classification, renaming it Clonal Classification (CLONCLAS); their approach was compared to a number of simple Hamming distance based heuristics [White2003]. In an attempt to address concerns of algorithm efficiency, parameterization, and representation selection for continuous function optimization, Garrett proposed an updated version of CLONALG called Adaptive Clonal Selection (ACS) [Garrett2004]. In their book, de Castro and Timmis provide a detailed treatment of CLONALG, including a description of the approach (starting page 79) and a step through of the algorithm (starting page 99) [Castro2002b]. Cutello and Nicosia provide a study of the clonal selection principle and algorithms inspired by the theory [Cutello2005]. Brownlee provides a review of Clonal Selection algorithms, including a taxonomy, algorithm reviews, and a broader bibliography [Brownlee2007b].
[Brownlee2007b] | J. Brownlee, "Clonal Selection Algorithms", Technical Report 070209A, Complex Intelligent Systems Laboratory (CIS), Centre for Information Technology Research (CITR), Faculty of Information and Communication Technologies (ICT), Swinburne University of Technology, 2007. |
[Burnet1957] | F. M. Burnet, "A modification of Jerne's theory of antibody production using the concept of clonal selection", Australian Journal of Science, 1957. |
[Burnet1959] | F. M. Burnet, "The clonal selection theory of acquired immunity", Vanderbilt University Press, 1959. |
[Castro1999] | L. N. de Castro and F. J. Von Zuben, "Artificial Immune Systems - Part I: Basic Theory and Applications", Technical Report TR DCA 01/99, Department of Computer Engineering and Industrial Automation, School of Electrical and Computer Engineering, State University of Campinas, Brazil, 1999. |
[Castro2000] | L. N. de Castro and F. J. Von Zuben, "The Clonal Selection Algorithm with Engineering Applications", in Proceedings of the Genetic and Evolutionary Computation Conference (GECCO '00), Workshop on Artificial Immune Systems and Their Applications, 2000. |
[Castro2002a] | L. N. de Castro and F. J. Von Zuben, "Learning and optimization using the clonal selection principle", IEEE Transactions on Evolutionary Computation, 2002. |
[Castro2002b] | L. N. de Castro and J. Timmis, "Artificial immune systems: a new computational intelligence approach", Springer, 2002. |
[Cutello2005] | V. Cutello and G. Nicosia, "Chapter VI. The Clonal Selection Principle for In Silico and In Vivo Computing", in Recent Developments in Biologically Inspired Computing, pages 104–146, Idea Group Publishing, 2005. |
[Garrett2004] | S. M. Garrett, "Parameter-free, adaptive clonal selection", in Congress on Evolutionary Computing (CEC 2004), 2004. |
[Watkins2003] | A. Watkins and X. Bi and A. Phadke, "Parallelizing an Immune-Inspired Algorithm for Efficient Pattern Recognition", in Intelligent Engineering Systems through Artificial Neural Networks: Smart Engineering System Design: Neural Networks, 2003. |
[White2003] | J. White and S. M. Garrett, "Improved Pattern Recognition with Artificial Clonal Selection?", in Proceedings Artificial Immune Systems: Second International Conference, ICARIS 2003, 2003. |