Bees Algorithm, BA.
The Bees Algorithm belongs to the Bee Inspired Algorithms and the field of Swarm Intelligence, and more broadly to the fields of Computational Intelligence and Metaheuristics. The Bees Algorithm is related to other Bee Inspired Algorithms, such as Bee Colony Optimization, and to other Swarm Intelligence algorithms such as Ant Colony Optimization and Particle Swarm Optimization.
The Bees Algorithm is inspired by the foraging behavior of honey bees. Honey bees collect nectar from vast areas around their hive, ranging more than 10 kilometers. Bee colonies have been observed to send bees to collect nectar from flower patches in proportion to the amount of food available at each patch. Bees communicate with each other at the hive via a waggle dance that informs other bees as to the direction, distance, and quality rating of food sources.
Honey bees collect nectar from flower patches as a food source for the hive. The hive sends out scouts that locate patches of flowers, and the scouts then return to the hive and inform other bees about the fitness and location of a food source via a waggle dance. The scout returns to the flower patch with follower bees. A small number of scouts continue to search for new patches, while bees returning from flower patches continue to communicate the quality of the patch.
The information processing objective of the algorithm is to locate and explore good sites within a problem search space. Scouts are sent out to randomly sample the problem space and locate good sites. The good sites are exploited via the application of a local search, where a small number of the best sites are explored more than the others. Good sites are continually exploited, while many scouts are sent out each iteration in search of additional good sites.
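The local-search step at the heart of this strategy can be sketched as follows: a recruited bee is placed uniformly at random within plus or minus the patch size of a good site in each dimension, clamped to the search bounds. This is a minimal illustrative sketch; the method name `sample_patch` is assumed here for exposition (the full listing later in this section performs the same operation in `create_neigh_bee`).

```ruby
# Sketch of the patch-based local search: perturb each coordinate of a
# site by up to +/- patch_size, clamping the result to the search bounds.
# The name sample_patch is illustrative, not from the original listing.
def sample_patch(site, patch_size, bounds)
  site.each_with_index.map do |v, i|
    v = (rand < 0.5) ? v + rand * patch_size : v - rand * patch_size
    v = bounds[i][0] if v < bounds[i][0]  # clamp to lower bound
    v = bounds[i][1] if v > bounds[i][1]  # clamp to upper bound
    v
  end
end

bounds = [[-5.0, 5.0], [-5.0, 5.0], [-5.0, 5.0]]
bee = sample_patch([1.0, -2.0, 0.5], 3.0, bounds)
```

Repeating this sampling for several recruited bees and keeping the best result gives the exploitation behavior described above, with elite sites receiving more recruited bees than the other selected sites.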
Algorithm (below) provides a pseudocode listing of the Bees Algorithm for minimizing a cost function.
Input: $Problem_{size}$, $Bees_{num}$, $Sites_{num}$, $EliteSites_{num}$, $PatchSize_{init}$, $EliteBees_{num}$, $OtherBees_{num}$
Output: $Bee_{best}$
Population $\leftarrow$ InitializePopulation($Bees_{num}$, $Problem_{size}$)
While ($\neg$StopCondition())
    EvaluatePopulation(Population)
    $Bee_{best}$ $\leftarrow$ GetBestSolution(Population)
    NextGeneration $\leftarrow$ $\emptyset$
    $Sites_{best}$ $\leftarrow$ SelectBestSites(Population, $Sites_{num}$)
    For ($Site_{i}$ $\in$ $Sites_{best}$)
        If ($i < EliteSites_{num}$)
            $RecruitedBees_{num}$ $\leftarrow$ $EliteBees_{num}$
        Else
            $RecruitedBees_{num}$ $\leftarrow$ $OtherBees_{num}$
        End
        Neighborhood $\leftarrow$ $\emptyset$
        For ($j$ To $RecruitedBees_{num}$)
            Neighborhood $\leftarrow$ CreateNeighborhoodBee($Site_{i}$, $Patch_{size}$)
        End
        NextGeneration $\leftarrow$ GetBestSolution(Neighborhood)
    End
    For ($j$ To $RemainingBees_{num}$)
        NextGeneration $\leftarrow$ CreateRandomBee()
    End
    Population $\leftarrow$ NextGeneration
End
Return ($Bee_{best}$)

Listing (below) provides an example of the Bees Algorithm implemented in the Ruby Programming Language. The demonstration problem is an instance of a continuous function optimization that seeks $\min f(x)$ where $f=\sum_{i=1}^n x_{i}^2$, $-5.0\leq x_i \leq 5.0$ and $n=3$. The optimal solution for this basin function is $(v_0,\ldots,v_{n-1})=0.0$. The algorithm is an implementation of the Bees Algorithm as described in the seminal paper [Pham2006]. A fixed patch size decrease factor of 0.95 was applied each iteration.
def objective_function(vector)
  return vector.inject(0.0) {|sum, x| sum + (x ** 2.0)}
end

def random_vector(minmax)
  return Array.new(minmax.size) do |i|
    minmax[i][0] + ((minmax[i][1] - minmax[i][0]) * rand())
  end
end

def create_random_bee(search_space)
  return {:vector=>random_vector(search_space)}
end

def create_neigh_bee(site, patch_size, search_space)
  vector = []
  site.each_with_index do |v,i|
    v = (rand()<0.5) ? v+rand()*patch_size : v-rand()*patch_size
    v = search_space[i][0] if v < search_space[i][0]
    v = search_space[i][1] if v > search_space[i][1]
    vector << v
  end
  bee = {}
  bee[:vector] = vector
  return bee
end

def search_neigh(parent, neigh_size, patch_size, search_space)
  neigh = []
  neigh_size.times do
    neigh << create_neigh_bee(parent[:vector], patch_size, search_space)
  end
  neigh.each{|bee| bee[:fitness] = objective_function(bee[:vector])}
  return neigh.sort{|x,y| x[:fitness]<=>y[:fitness]}.first
end

def create_scout_bees(search_space, num_scouts)
  return Array.new(num_scouts) do
    create_random_bee(search_space)
  end
end

def search(max_gens, search_space, num_bees, num_sites, elite_sites, patch_size, e_bees, o_bees)
  best = nil
  pop = Array.new(num_bees){ create_random_bee(search_space) }
  max_gens.times do |gen|
    pop.each{|bee| bee[:fitness] = objective_function(bee[:vector])}
    pop.sort!{|x,y| x[:fitness]<=>y[:fitness]}
    best = pop.first if best.nil? or pop.first[:fitness] < best[:fitness]
    next_gen = []
    pop[0...num_sites].each_with_index do |parent, i|
      neigh_size = (i<elite_sites) ? e_bees : o_bees
      next_gen << search_neigh(parent, neigh_size, patch_size, search_space)
    end
    scouts = create_scout_bees(search_space, (num_bees-num_sites))
    pop = next_gen + scouts
    patch_size = patch_size * 0.95
    puts " > it=#{gen+1}, patch_size=#{patch_size}, f=#{best[:fitness]}"
  end
  return best
end

if __FILE__ == $0
  # problem configuration
  problem_size = 3
  search_space = Array.new(problem_size) {|i| [-5, 5]}
  # algorithm configuration
  max_gens = 500
  num_bees = 45
  num_sites = 3
  elite_sites = 1
  patch_size = 3.0
  e_bees = 7
  o_bees = 2
  # execute the algorithm
  best = search(max_gens, search_space, num_bees, num_sites, elite_sites, patch_size, e_bees, o_bees)
  puts "done! Solution: f=#{best[:fitness]}, s=#{best[:vector].inspect}"
end
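The fixed 0.95 decrease factor in the listing above gives a geometric patch-size schedule: the neighborhood sampled around each good site narrows every generation, shifting the search from exploration to refinement. A small sketch of that schedule (the helper name `patch_schedule` is assumed for illustration):

```ruby
# Sketch of the geometric patch-size schedule from the listing above:
# the patch starts at init and is multiplied by factor each generation.
# patch_schedule is an illustrative helper, not part of the listing.
def patch_schedule(init, factor, gens)
  Array.new(gens) { |g| init * factor**(g + 1) }
end

sizes = patch_schedule(3.0, 0.95, 500)
```

With the listing's configuration ($3.0 \times 0.95^{500} \approx 2.2 \times 10^{-11}$), the patch has effectively collapsed well before the final generation, so late iterations fine-tune the best sites rather than explore around them.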
The Bees Algorithm was proposed by Pham et al. in a technical report in 2005 [Pham2005], and later published [Pham2006]. In this work, the algorithm was applied to standard instances of continuous function optimization problems.
The majority of the work on the algorithm has concerned its application to various problem domains. The following is a selection of popular application papers: the optimization of linear antenna arrays by Guney and Onay [Guney2007], the optimization of codebook vectors in the Learning Vector Quantization algorithm for classification by Pham et al. [Pham2006a], optimization of neural networks for classification by Pham et al. [Pham2006b], and the optimization of clustering methods by Pham et al. [Pham2007].
[Guney2007] K. Guney and M. Onay, "Amplitude-only pattern nulling of linear antenna arrays with the use of bees algorithm", Progress In Electromagnetics Research, 2007.
[Pham2005] D. T. Pham, A. Ghanbarzadeh, E. Koc, S. Otri, S. Rahim, and M. Zaidi, "The Bees Algorithm", Technical Report, Manufacturing Engineering Centre, Cardiff University, 2005.
[Pham2006] D. T. Pham, A. Ghanbarzadeh, E. Koc, S. Otri, S. Rahim, and M. Zaidi, "The Bees Algorithm - A Novel Tool for Complex Optimisation Problems", in Proceedings of IPROMS 2006 Conference, 2006.
[Pham2006a] D. T. Pham, S. Otri, A. Ghanbarzadeh, and E. Koc, "Application of the bees algorithm to the training of learning vector quantisation networks for control chart pattern recognition", in Proceedings of Information and Communication Technologies (ICTTA'06), 2006.
[Pham2006b] D. T. Pham, A. J. Soroka, A. Ghanbarzadeh, E. Koc, S. Otri, and M. Packianather, "Optimising neural networks for identification of wood defects using the bees algorithm", in Proceedings of the 2006 IEEE International Conference on Industrial Informatics, 2006.
[Pham2007] D. T. Pham, S. Otri, A. A. Afify, M. Mahmuddin, and H. Al-Jabbouli, "Data Clustering Using the Bees Algorithm", in Proc 40th CIRP International Manufacturing Systems Seminar, 2007.