Clever Algorithms: Nature-Inspired Programming Recipes

A book by Jason Brownlee




Immune Network Algorithm

Artificial Immune Network, aiNet, Optimization Artificial Immune Network, opt-aiNet.

Taxonomy

The Artificial Immune Network algorithm (aiNet) is an Immune Network Algorithm from the field of Artificial Immune Systems. It is related to other Artificial Immune System algorithms such as the Clonal Selection Algorithm, the Negative Selection Algorithm, and the Dendritic Cell Algorithm. The Artificial Immune Network algorithm includes the base version and an extension for optimization problems called the Optimization Artificial Immune Network algorithm (opt-aiNet).

Inspiration

The Artificial Immune Network algorithm is inspired by the Immune Network theory of the acquired immune system. The clonal selection theory of acquired immunity accounts for the adaptive behavior of the immune system, including the ongoing selection and proliferation of cells that select for potentially harmful (and typically foreign) material in the body. A concern with the clonal selection theory is that it presumes the repertoire of reactive cells remains idle when there are no pathogens to which to respond. Jerne proposed an Immune Network Theory (Idiotypic Networks) in which immune cells are not at rest in the absence of pathogens; instead, antibodies and immune cells recognize and respond to each other [Jerne1974] [Jerne1974a] [Jerne1984].

The Immune Network theory proposes that antibodies (both free-floating and surface-bound) possess idiotopes (surface features) to which the receptors of other antibodies can bind. As a result of receptor interactions, the repertoire becomes dynamic: receptors continually both inhibit and excite each other in complex regulatory networks (chains of receptors). The theory suggests that the clonal selection process may be triggered by the idiotopes of other immune cells and molecules in addition to the surface characteristics of pathogens, and that the maturation process applies both to the receptors themselves and to the idiotopes they expose.

Metaphor

The immune network theory has interesting resource maintenance and signaling information processing properties. The classical clonal selection and negative selection paradigms integrate the accumulative and filtered learning of the acquired immune system, whereas the immune network theory proposes an additional order of complexity between the cells and molecules under selection. In addition to cells that interact directly with pathogens, there are cells that interact with those reactive cells and with pathogens indirectly, in successive layers, such that networks of activity emerge that support higher-order structures, such as internal images of pathogens (promotion) and regulatory networks (so-called anti-idiotopes and anti-anti-idiotopes).

Strategy

The objective of the immune network process is to prepare a repertoire of discrete pattern detectors for a given problem domain, where better-performing cells suppress low-affinity (i.e. similar) cells in the network. This principle is achieved through an interactive process of exposing the population to external information, to which it responds with both a clonal selection response and internal meta-dynamics of intra-population responses that stabilize the population's responses to the external stimuli, as sketched below.
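
As a minimal Ruby sketch of the suppression meta-dynamic (the helper names here are illustrative; the complete implementation in the Code Listing section defines its own affinity_suppress), a cell survives only if it is the fittest within its own affinity neighborhood:

# Euclidean distance between two real-valued vectors.
def euclidean_distance(a, b)
  Math.sqrt(a.zip(b).inject(0.0) {|s, (x, y)| s + (x - y)**2})
end

# Network suppression: a cell survives only if no similar cell
# (one within the affinity threshold) has a lower cost.
def suppress(cells, threshold)
  cells.select do |cell|
    rivals = cells.select {|o| euclidean_distance(cell[:vector], o[:vector]) < threshold}
    rivals.min_by {|c| c[:cost]}.equal?(cell)
  end
end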

Procedure

Algorithm (below) provides a pseudocode listing of the Optimization Artificial Immune Network algorithm (opt-aiNet) for minimizing a cost function.

Input: $Population_{size}$, ProblemSize, $N_{clones}$, $N_{random}$, AffinityThreshold
Output: $S_{best}$
Population $\leftarrow$ InitializePopulation{$Population_{size}$, ProblemSize}
While ($\neg$StopCondition())
    EvaluatePopulation{Population}
    $S_{best}$ $\leftarrow$ GetBestSolution{Population}
    $Cost_{avg}$ $\leftarrow$ CalculateAveragePopulationCost{Population}
    Repeat
        Progeny $\leftarrow \emptyset$
        For ($Cell_{i}$ $\in$ Population)
            Clones $\leftarrow$ CreateClones{$Cell_{i}$, $N_{clones}$}
            For ($Clone_{i}$ $\in$ Clones)
                $Clone_{i}$ $\leftarrow$ MutateRelativeToFitnessOfParent{$Clone_{i}$, $Cell_{i}$}
            End
            EvaluatePopulation{Clones}
            Progeny $\leftarrow$ Progeny $\cup$ GetBestSolution{Clones}
        End
    Until (CalculateAveragePopulationCost{Progeny} $<$ $Cost_{avg}$)
    Progeny $\leftarrow$ SuppressLowAffinityCells{Progeny, AffinityThreshold}
    Progeny $\leftarrow$ Progeny $\cup$ CreateRandomCells{$N_{random}$}
    Population $\leftarrow$ Progeny
End
Return ($S_{best}$)
Pseudocode for opt-aiNet.

Heuristics
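
The following usage heuristics are drawn from the algorithm description and the configuration used in the code listing below.

- aiNet was designed for unsupervised clustering, whereas the opt-aiNet extension is intended for pattern recognition and optimization, specifically multimodal function optimization.
- The amount of mutation applied to a clone is inversely proportional to the normalized fitness of its parent cell: better-performing parents produce less-mutated clones.
- The $\beta$ parameter controls the decay of the function that scales mutation; the code listing below uses $\beta=100$.
- The affinity threshold is problem- and representation-specific; the code listing below uses a small fraction (5%) of the width of the search domain.
- A small number of randomly generated cells ($N_{random}$) is inserted each iteration to maintain diversity in the repertoire.
- The number of clones created per cell ($N_{clones}$) may be small, such as 10.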

Code Listing

Listing (below) provides an example of the Optimization Artificial Immune Network (opt-aiNet) implemented in the Ruby Programming Language. The demonstration problem is an instance of a continuous function optimization problem that seeks $\min f(x)$ where $f=\sum_{i=1}^n x_{i}^2$, $-5.0\leq x_i \leq 5.0$ and $n=2$. The optimal solution for this basin function is $(x_1,\ldots,x_n)=(0,\ldots,0)$. The algorithm is an implementation based on the specification by de Castro and Timmis [Castro2002c].
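
In the opt-aiNet scheme, each clone $c$ of a parent cell is perturbed as $c' = c + \alpha N(0,1)$, where $\alpha = \frac{1}{\beta} \exp(-f^{*})$ and $f^{*}$ is the parent's cost normalized into $[0,1]$ (with the best cell mapped to 1), so that fitter parents receive smaller perturbations; the mutation_rate and mutate functions below implement this calculation.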

def objective_function(vector)
  return vector.inject(0.0) {|sum, x| sum + (x**2.0)}
end

def random_vector(minmax)
  return Array.new(minmax.size) do |i|
    minmax[i][0] + ((minmax[i][1] - minmax[i][0]) * rand())
  end
end

# sample from a Gaussian distribution via the polar Box-Muller method
def random_gaussian(mean=0.0, stdev=1.0)
  u1 = u2 = w = 0
  begin
    u1 = 2 * rand() - 1
    u2 = 2 * rand() - 1
    w = u1 * u1 + u2 * u2
  end while w >= 1
  w = Math.sqrt((-2.0 * Math.log(w)) / w)
  return mean + (u2 * w) * stdev
end

# create a new cell as a copy of the parent's vector
def clone(parent)
  v = Array.new(parent[:vector].size) {|i| parent[:vector][i]}
  return {:vector=>v}
end

# mutation scale alpha = (1/beta) * exp(-normalized_cost)
def mutation_rate(beta, normalized_cost)
  return (1.0/beta) * Math.exp(-normalized_cost)
end

def mutate(beta, child, normalized_cost)
  child[:vector].each_with_index do |v, i|
    alpha = mutation_rate(beta, normalized_cost)
    child[:vector][i] = v + alpha * random_gaussian()
  end
end

# create, mutate, and evaluate clones of a parent cell; return the best clone
def clone_cell(beta, num_clones, parent)
  clones = Array.new(num_clones) {clone(parent)}
  clones.each {|clone| mutate(beta, clone, parent[:norm_cost])}
  clones.each{|c| c[:cost] = objective_function(c[:vector])}
  clones.sort!{|x,y| x[:cost] <=> y[:cost]}
  return clones.first
end

def calculate_normalized_cost(pop)
  pop.sort!{|x,y| x[:cost]<=>y[:cost]}
  range = pop.last[:cost] - pop.first[:cost]
  if range == 0.0
    pop.each {|p| p[:norm_cost] = 1.0}
  else
    # scale into [0,1] so the best (lowest cost) cell gets 1.0 and the worst 0.0
    pop.each {|p| p[:norm_cost] = 1.0-((p[:cost]-pop.first[:cost])/range)}
  end
end

def average_cost(pop)
  sum = pop.inject(0.0){|sum,x| sum + x[:cost]}
  return sum / pop.size.to_f
end

def distance(c1, c2)
  sum = 0.0
  c1.each_index {|i| sum += (c1[i]-c2[i])**2.0}
  return Math.sqrt(sum)
end

def get_neighborhood(cell, pop, aff_thresh)
  neighbors = []
  pop.each do |p|
    neighbors << p if distance(p[:vector], cell[:vector]) < aff_thresh
  end
  return neighbors
end

# network suppression: keep a cell only if it is the best within its neighborhood
def affinity_suppress(population, aff_thresh)
  pop = []
  population.each do |cell|
    neighbors = get_neighborhood(cell, population, aff_thresh)
    neighbors.sort!{|x,y| x[:cost] <=> y[:cost]}
    pop << cell if neighbors.empty? or cell.equal?(neighbors.first)
  end
  return pop
end

def search(search_space, max_gens, pop_size, num_clones, beta, num_rand, aff_thresh)
  pop = Array.new(pop_size) {|i| {:vector=>random_vector(search_space)} }
  pop.each{|c| c[:cost] = objective_function(c[:vector])}
  best = nil
  max_gens.times do |gen|
    pop.each{|c| c[:cost] = objective_function(c[:vector])}
    calculate_normalized_cost(pop)
    pop.sort!{|x,y| x[:cost] <=> y[:cost]}
    best = pop.first if best.nil? or pop.first[:cost] < best[:cost]
    avg_cost, progeny = average_cost(pop), nil
    begin
      progeny = Array.new(pop.size) {|i| clone_cell(beta, num_clones, pop[i])}
    end until average_cost(progeny) < avg_cost
    pop = affinity_suppress(progeny, aff_thresh)
    num_rand.times {pop << {:vector=>random_vector(search_space)}}
    puts " > gen #{gen+1}, popSize=#{pop.size}, fitness=#{best[:cost]}"
  end
  return best
end

if __FILE__ == $0
  # problem configuration
  problem_size = 2
  search_space = Array.new(problem_size) {|i| [-5, +5]}
  # algorithm configuration
  max_gens = 150
  pop_size = 20
  num_clones = 10
  beta = 100
  num_rand = 2
  aff_thresh = (search_space[0][1]-search_space[0][0])*0.05
  # execute the algorithm
  best = search(search_space, max_gens, pop_size, num_clones, beta, num_rand, aff_thresh)
  puts "done! Solution: f=#{best[:cost]}, s=#{best[:vector].inspect}"
end
Optimization Artificial Immune Network in Ruby
Download: optainet.rb.
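
The listing has no external dependencies and can be run directly with a standard Ruby interpreter; assuming it is saved under the downloaded file name:

ruby optainet.rb

Each generation prints the generation number, the current population size (which fluctuates as suppression removes similar cells and random cells are inserted), and the best cost found so far.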

References

Primary Sources

Early works, such as Farmer et al. [Farmer1986], suggested the exploitation of the information processing properties of network theory for machine learning. A seminal network theory based algorithm was proposed by Timmis et al. for clustering problems, called the Artificial Immune Network (AIN) [Timmis2000], which was later extended and renamed the Resource Limited Artificial Immune System [Timmis2001] and the Artificial Immune Network (AINE) [Knight2001]. The Artificial Immune Network (aiNet) algorithm was proposed by de Castro and Von Zuben; it extended the principles of the Artificial Immune Network (AIN) and the Clonal Selection Algorithm (CLONALG) and was applied to clustering [Castro2000a]. The aiNet algorithm was further extended to optimization domains and renamed opt-aiNet [Castro2002c].

Learn More

The authors de Castro and Von Zuben provide a detailed presentation of the aiNet algorithm as a book chapter that includes immunological theory, a description of the algorithm, and a demonstration application to clustering problem instances [Castro2001]. Timmis and Edmonds provide a careful examination of the opt-aiNet algorithm and propose some modifications and augmentations to improve its applicability and performance for multimodal function optimization problem domains [Timmis2004]. The authors de Franca, Von Zuben, and de Castro proposed an extension to opt-aiNet, called dopt-aiNet, that provided a number of enhancements and adapted its capability for dynamic function optimization problems [Franca2005].

Bibliography

[Castro2000a] L. N. de Castro and F. J. Von Zuben, "An evolutionary immune network for data clustering", in Proceedings Sixth Brazilian Symposium on Neural Networks, 2000.
[Castro2001] L. N. de Castro and F. J. Von Zuben, "Chapter XII: aiNet: An Artificial Immune Network for Data Analysis", in Data Mining: A Heuristic Approach, pages 231–259, Idea Group Publishing, 2001.
[Castro2002c] L. N. de Castro and J. Timmis, "An artificial immune network for multimodal function optimization", in Proceedings of the 2002 Congress on Evolutionary Computation (CEC'02), 2002.
[Farmer1986] J. D. Farmer and N. H. Packard and Alan S. Perelson, "The immune system, adaptation, and machine learning", Physica D, 1986.
[Franca2005] F. O. de França and L. N. de Castro and F. J. Von Zuben, "An artificial immune network for multimodal function optimization on dynamic environments", in Genetic And Evolutionary Computation Conference, 2005.
[Jerne1974] N. K. Jerne, "Clonal selection in a lymphocyte network", in Cellular Selection and Regulation in the Immune Response, 1974.
[Jerne1974a] N. K. Jerne, "Towards a network theory of the immune system", Annales d'immunologie (Annals of Immunology), Institut Pasteur (Paris, France), Société Française d'Immunologie, 1974.
[Jerne1984] N. K. Jerne, "Idiotypic networks and other preconceived ideas", Immunological Reviews, 1984.
[Knight2001] T. Knight and J. Timmis, "AINE: An Immunological Approach to Data Mining", in First IEEE International Conference on Data Mining (ICDM'01), 2001.
[Timmis2000] J. Timmis and M. Neal and J. Hunt, "An Artificial Immune System for Data Analysis", Biosystems, 2000.
[Timmis2001] J. Timmis and M. J. Neal, "A resource limited artificial immune system for data analysis", Knowledge Based Systems Journal: Special Issue, 2001.
[Timmis2004] J. Timmis and C. Edmonds, "A Comment on opt-AiNet: An Immune Network Algorithm for Optimisation", in Lecture Notes in Computer Science, 2004.
