[IEEE 2012 3rd International Conference on Innovations in Bio-Inspired Computing and Applications (IBICA), Kaohsiung City, Taiwan, 26–28 September 2012]
An Expandable Genetic Cell Array for Global Optimization of Continuous Multimodal Functions

Ting-Hua Chang
Department of Information Management
Ling-Tung University, Taichung, Taiwan
Abstract—This study presents a simple, fast, accurate, and expandable algorithm with very few parameters for solving the global optimization problem of continuous multimodal functions: a calculation unit called a cell, based on Genetic Algorithm and Particle Swarm, is designed. The cell consists of only three chromosomes, among which two chromosomes apply the crossover operation, while the other chromosome performs a Particle Swarm search as the mutation operation. Characteristics of this new method are compared with other hybrid methods. The experimental results on seventeen benchmark functions show that the proposed calculation cell can find the optimal solution in fewer function calls than the published GA-PSO hybrid method. Results of multi-cell experiments are presented, and the possibility of incorporating many cells in a large search space is discussed.

Keywords—global optimization; genetic algorithm; particle swarm optimization; G3A; cellular automata
I. INTRODUCTION

Genetic Algorithms (GAs) are general-purpose search algorithms which use principles inspired by natural genetic populations to evolve solutions to problems [1, 2]. Globally optimizing a continuous-variable function f in a given search domain consists in finding its global minima without being trapped in one of its local minima. Although fixed-length, binary-coded strings for representing solutions have dominated GA research, the global optimization problem of continuous multimodal functions is better handled by real-coded GAs [3]. The use of real parameters makes it possible to use large domains for the variables, which is difficult to achieve in binary implementations, where increasing the domain would mean sacrificing precision, assuming a fixed chromosome length. Adaptations of GAs to continuous optimization problems have been proposed in [4–9]. However, the slow convergence of GAs before finding an accurate solution is a well-known problem, closely related to their inability to exploit local information. To overcome this drawback, hybrid methods have been proposed in the literature [12–15], combining a GA with some classical local search algorithm: the purpose of the GA is to localize the most promising regions of the whole solution space, within which the classical local optimization algorithm can reach the best solution quickly. One of the most common local search algorithms in hybrid
methods is the Nelder–Mead (NM) simplex method [10, 11]. One reason for its popularity is that it does not need the derivatives of the function under exploration, an important feature in many applications where gradient information is not available. However, one has to be careful when using this method, since it is sensitive to the choice of initial points and is not guaranteed to attain the global optimum. Another popular algorithm often used in diversified
searching is Particle Swarm Optimization (PSO) [16, 17]. The original form of PSO has the drawback of flying over the optimal solution or being easily trapped in local extremes. Many delicate, state-of-the-art variations have been proposed to improve PSO [20]. Although modified PSO does eventually locate the desired solution, the convergence rate of PSO [18] is typically slower than those of local direct search techniques (e.g., the Hooke and Jeeves method [19] and the Nelder–Mead simplex search method), as PSO does not digest all the information obtained locally to determine the most promising search direction. In general, hybrid methods may perform better
than GA or PSO alone, but they have some weaknesses. (1) The architectures of hybrid methods are often complicated, and many internal parameters are needed; it is challenging to tune those parameters to achieve optimal performance, and in fact the optimal setting of the parameters is problem-dependent. (2) The criteria for when to start the local search are arbitrary, and the definition of a "promising" area is not rigorous. When applied to larger solution spaces, performance can be affected by the outcome of the search for the promising area; for example, it is well known that simplex-based methods are easily trapped in local optima. A novel approach, denoted G3A, combining GA
and PSO is presented in this study. Researchers typically do not hybridize PSO with GA, because both are better suited to long-range search and both converge slowly near the solution. However, with affordable extra computing effort, local convergence can still be achieved by carefully choosing the velocity terms in PSO. The advantage of hybridizing PSO and GA is that there is no need to draw a hard line between the global search and the local search stage. The proposed G3A algorithm operates with at least three chromosomes present, of which the two chromosomes with better fitness values are applied for
2012 Third International Conference on Innovations in Bio-Inspired Computing and Applications
978-0-7695-4837-1/12 $26.00 © 2012 IEEE
DOI 10.1109/IBICA.2012.19
crossover of the genetic algorithm, while the remaining chromosome undergoes swarm search. The author refers to the three-chromosome unit as a basic "cell". A cell is an irreducible, fundamental unit in the G3A calculation. A system in operation is expandable by running many cells independently or interactively. Besides the high effectiveness of chromosome usage, one of the attractive features of G3A is that very few parameters are involved. In this paper the author tested single-cell and multi-cell runs, and the results are presented along with results from another method. The paper is organized as follows. Section 2 is devoted to
a review and an overall description of this optimization method. The experimental results on various functions are described in Section 3. Section 4 consists of a discussion of the experimental results and the impacts of parameter settings. Some words of conclusion and future prospects are given in Section 5.
II. REVIEW OF GA-PSO HYBRID METHODS

Other authors have attempted such efforts to weld GA and PSO. Kao and Zahara [21] denoted their method GA-PSO; this hybrid technique incorporates concepts from GA and PSO and creates individuals in a new generation not only by the crossover and mutation operations found in GA but also by mechanisms of PSO. The results of various experimental studies using a suite of 17 multimodal test functions taken from the literature demonstrated the superiority of the hybrid GA-PSO approach over some other search techniques in terms of solution quality and convergence rates. However, the results in [21] show a bias toward functions whose optimal solution is the origin; this issue is discussed in more detail in Section 4. W.F. Abd-El-Wahed et al. [22] proposed an approach which integrates the merits of both GA and PSO and has two characteristic features. First, the algorithm is initialized by a set of random particles which travel through the search space; during this travel, an evolution of these particles is performed by integrating PSO and GA. Second, to restrict and control the velocity of the particles, the authors introduced a modified constriction factor. However, in their method hundreds of particles are needed, and the crucial step of "fixing the particles" seems to bring in additional artifacts and hand-made manipulations. A detailed comparison with [22] is omitted in this paper.
A. The presented algorithm

In the proposed G3A algorithm, among the three chromosomes, the two with better fitness values go for crossover using the BLX-α crossover operator [25]: the i-th gene of the offspring chromosome X is a random value chosen from the interval [Gmin − I·α, Gmax + I·α], where Gmax = max(G_i^1, G_i^2), Gmin = min(G_i^1, G_i^2), I = Gmax − Gmin, i is the gene identifier, and α = 0.5. Three children are generated by the parent pair via crossover, and the better two are kept for the next generation. The other chromosome, which does not participate in crossover, undergoes the swarm algorithm to update its "position vector" X, i.e., the values of its genes, using the following formulas:
V(t+1) = (0.5 + r0/2) V(t) + C1 r1 (Pcell − X(t)) + C2 r2 (Pglobal − X(t))    (1)

X(t+1) = X(t) + V(t+1)    (2)
where r0, r1, and r2 are random numbers drawn from Uniform[0, 1]. For each function, the value of X is confined to the search domain of that function, while V is unconstrained. Note that Pcell in the second term represents the best position found by this cell; its value is the same as Pglobal if only one cell is used. C1 and C2 are the two constants known as the Personal Factor and the Social Factor, respectively. The optimal values of C1 and C2 are problem-dependent; see Section 4 for more discussion of this issue. From the GA's point of view, the swarm operation in G3A is an effective substitute for the mutation operation: updating the position vector toward the global best position is in fact a guided mutation toward the optimal solution. After the crossover and swarm operations, the three chromosomes are compared by fitness value, and the operation cycle repeats until the termination criteria are satisfied; see Figure 1.
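To make the cycle concrete, the crossover and swarm steps can be sketched in Python. This is a minimal illustration by the present editor, not the author's code; the helper names and the uniform random draws are assumptions.

```python
import random

ALPHA = 0.5          # BLX-alpha parameter (the paper uses alpha = 0.5)
C1, C2 = 2.0, 0.5    # Personal and Social Factors (one setting tested in the paper)

def blx_crossover(p1, p2, alpha=ALPHA):
    """BLX-alpha: each child gene is drawn uniformly from [Gmin - I*alpha, Gmax + I*alpha]."""
    child = []
    for g1, g2 in zip(p1, p2):
        gmin, gmax = min(g1, g2), max(g1, g2)
        i_len = gmax - gmin
        child.append(random.uniform(gmin - i_len * alpha, gmax + i_len * alpha))
    return child

def swarm_update(x, v, p_cell, p_global, lo, hi):
    """Eqs. (1)-(2): inertia weight drawn from (0.5 + r0/2); X clipped to the search domain."""
    new_x, new_v = [], []
    for xi, vi, pc, pg in zip(x, v, p_cell, p_global):
        vi = (0.5 + random.random() / 2) * vi \
             + C1 * random.random() * (pc - xi) \
             + C2 * random.random() * (pg - xi)
        new_v.append(vi)                         # V is unconstrained
        new_x.append(min(max(xi + vi, lo), hi))  # X confined to the domain
    return new_x, new_v
```

One iteration would call `blx_crossover` on the two fitter chromosomes (three times, keeping the best two children) and `swarm_update` on the third.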
Figure 1. The G3A algorithm for one cell.
B. Comparison with other hybrid methods

The comparison of G3A with some other hybrid methods is summarized in Table I. The purpose of this comparison is to show the complexity of a typical hybrid method. Note that the letter "N" in the second row of Table I represents the dimensionality of the test function.
III. EXPERIMENTAL RESULTS

Seventeen benchmark functions were used to test the performance of the G3A cells. The definitions of these functions can be found in the Appendix and in [15, 21]. To compare directly with the results from GA-PSO [21], the termination criterion was set to |fobj − fanal| < error, with the error values taken from [21]. A method or setting that reaches the same error tolerance in fewer function calls is more efficient at finding the optimal solution than the others. The experiment was conducted with 1-, 3-, and 10-cell settings for each function, using two sets of (C1, C2) values: (2.0, 0) and (2.0, 0.5). In the former case each cell runs independently (C2 = 0), while in the latter (interactive) case (C2 ≠ 0)
Initialization: create 3 chromosomes.
REPEAT
  Evaluation: evaluate the fitness of each chromosome.
  GA: apply BLX-0.5 crossover to the best two chromosomes; select 2 from 3 children.
  PSO: update the last chromosome by the PSO algorithm.
UNTIL stopping criteria are reached.
TABLE I. COMPARISON OF OTHER HYBRID METHODS AND G3A

| Method | G3A | GA-PSO [21] | NM-GA [23] | NM-PSO [24] |
|---|---|---|---|---|
| Number of particles | 3 × number of cells | 4N | 2N+2 | 3N+1 |
| Number of parameters | 4 | 5 | 7 | 11 |
| Parameter list | C1, C2, and w from PSO; α from BLX crossover | C1, C2, w, and Vmax from PSO; mutation rate in GA | 5 parameters from NM; mutation rate and crossover constant | C1, C2, w from PSO; mutation success threshold and number of particles in heuristic mutation; 5 parameters plus initial step size in NM |
| Modification from the standard methods | Replace personal best position by cell best position in PSO | Replace personal best position by best neighbor position in PSO | Modified simplex search | Modified simplex search; heuristic mutation of PSO |
TABLE II. EXPERIMENTAL RESULTS FROM 17 TEST FUNCTIONS

Columns: function; error (×10⁻⁵); success rate (%) for the 1-, 3-, and 10-cell settings; average number of function evaluations in the C1 = 2.0, C2 = 0 (0.5) settings for 1, 3, and 10 cells, and in the published GA-PSO [21] method.

[The numeric entries of Table II are not legible in this transcript.]
the cells share the information of the global best position. Each function was tested 50 times, and a run was considered successful if it fulfilled the termination criterion within 500,000 loops. The average¹ (over 50 runs) number of calls to the fitness function and the success rate, as well as the number of function evaluations from GA-PSO, are compared in Table II. The results from the interactive case are quoted in parentheses. Note that the average numbers of function evaluations shown are obtained from the successful runs only, and that the success rates in GA-PSO [21] are all 100%.

¹ The numbers presented in Table II are the medians of 50 runs, because the distributions are asymmetric.
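The bookkeeping just described can be expressed compactly. The following sketch is the present editor's illustration of the reporting convention, not the author's implementation; the per-run records are hypothetical.

```python
from statistics import median

def summarize_runs(runs):
    """runs: list of (succeeded, n_evals) pairs, one per trial.
    A trial succeeded if it met |f_obj - f_anal| < error within 500,000 loops.
    Returns (success rate in %, median evaluations over successful runs only),
    using the median because the distributions are asymmetric."""
    ok = [n for succeeded, n in runs if succeeded]
    rate = 100.0 * len(ok) / len(runs)
    return rate, (median(ok) if ok else None)
```

For example, three successes out of four trials yields a 75% success rate, with the median taken over the three successful evaluation counts only.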
IV. DISCUSSION

A few remarks regarding these results are given below:

(1) In the (2.0, 0) setting, for most functions it is found that the rate of success approximately obeys the probability
law: (rate of success for the n-cell case) = 1 − (rate of failure for the 1-cell case)^n, where n is the number of cells, which indicates that each cell indeed finds the solution independently. This implies that the optimal solution can eventually be found by increasing the number of cells, although in the current study some functions did not reach a 100% success rate with 10 cells.

(2) For functions R2, R5, Z5, R10, and Z10, the 3-cell
interactive (C2 = 0.5) setting found the optimal solution in fewer evaluations than the 1-cell setting. On the other hand, for S4,n and H6,4, although the number of evaluations in the 3-cell setting is larger than in the 1-cell setting, the ratio of the 3-cell to the 1-cell evaluation counts is smaller than in the independent (C2 = 0) case. These observations suggest that the cells work cooperatively by sharing the global best position with each other. The enhancement of performance is quite dramatic, which indicates that the G3A algorithm may have good potential for solving higher-dimensional problems; such a non-linear enhancement effect was not obvious in the 2-D and 3-D functions. The increased number of evaluations in the 10-cell runs indicates that some search paths are repeated and redundant. A better algorithm to coordinate the search among cells would be an interesting topic for future research.

(3) The fluctuation of the average number of function
evaluations in the G3A algorithm is not as volatile as in the GA-PSO [21] case. Some results from the published GA-PSO are rather impressive. However, the variation in the average number of function evaluations needed to achieve roughly the same level of accuracy for problems of the same dimensionality spreads over 3 orders of magnitude, which is hard to understand in the usual sense. The author therefore reconstructed the GA-PSO model in C++ code, and found that the performance depends strongly on the maximum velocity setting of the particles. Unfortunately these values were not given in [21]. The author found that a small Vmax value greatly accelerates the search for those functions that have no local minima and whose optimal solution is located at the origin: during the PSO process the search can be artificially confined near the optimal solution point, and the results can be significantly biased.

(4) The values of the personal factor C1 and the social factor C2 in
PSO have a significant impact on the rate of successful runs and the number of function calls. To study the effect, the author tested 4 settings of (C1, C2): (1, 1), (2, 0), (2, 0.5), and (2, 2). Regarding the number of function evaluations, the (C1 = 1, C2 = 1) setting has the best performance; the rank is (1, 1) > (2, 0.5) > (2, 0) > (2, 2). However, the (C1 = 1, C2 = 1) setting performed poorly in the rate of success on the S4,n and H6,4 functions: the success rate is only about 50–60% in the 10-cell setting, and did not improve to follow the prediction of the probability scaling law as the number of cells increased. Apparently for those functions the cell needs a larger velocity boost (C1 = 2) to escape the local minima traps.
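The probability scaling law quoted in point (1) is easy to check numerically. The sketch below is the present editor's illustration, with an assumed single-cell success rate.

```python
def n_cell_success_rate(p1, n):
    """Point (1): success probability of n independent cells,
    each succeeding with probability p1 -> 1 - (1 - p1)^n."""
    return 1.0 - (1.0 - p1) ** n

# With an assumed 60% single-cell rate, 3 independent cells should
# succeed about 93.6% of the time, and 10 cells about 99.99%.
assert abs(n_cell_success_rate(0.60, 3) - 0.936) < 1e-9
assert n_cell_success_rate(0.60, 10) > 0.999
```

A multi-cell success rate falling short of this prediction (as observed for S4,n and H6,4) is therefore a sign that the cells are not failing independently.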
V. CONCLUSION AND FUTURE PROSPECTS

A different approach, denoted G3A, to combining GA and PSO is presented in this study. For most of the test functions under investigation, the performance of the proposed algorithm is better than the published GA-PSO methods in three respects: (1) fewer function evaluations, (2) a smaller population size, and (3) fewer internal parameters. A G3A cell can be treated as a cellular automaton for finding the global optimal solution of continuous multimodal functions, and the system can run with an arbitrary number of cells. In future development, a coordinated many-cell system seems promising for large-scale search. So far the cells exhibit some repeated search behavior in the absence of a central control unit; a controller node and algorithm are needed to coordinate the calculations effectively. Next, an integer version of G3A would have wider application potential in real-world applications. The swarm part could also be replaced by some other mutation method, e.g., a controlled mutation from the best chromosome. Finally, designing multi-objective cells following the G3A methodology is of great interest.
APPENDIX: LIST OF TEST FUNCTIONS
Branin RCOC (RC) (two variables):

RC(x1, x2) = (x2 − (5.1/(4π²)) x1² + (5/π) x1 − 6)² + 10 (1 − 1/(8π)) cos(x1) + 10;

search domain: −5 < x1 < 10, 0 < x2 < 15; three global minima: (x1, x2)* = (−π, 12.275), (π, 2.275), (9.42478, 2.475); RC((x1, x2)*) = 0.397887.

Easom (ES) (two variables):

ES(x1, x2) = −cos(x1) cos(x2) exp(−((x1 − π)² + (x2 − π)²));

search domain: −100 < xj < 100, j = 1, 2; one global minimum: (x1, x2)* = (π, π); ES((x1, x2)*) = −1.

Goldstein and Price (GP) (two variables):

GP(x1, x2) = [1 + (x1 + x2 + 1)² (19 − 14x1 + 3x1² − 14x2 + 6x1x2 + 3x2²)] × [30 + (2x1 − 3x2)² (18 − 32x1 + 12x1² + 48x2 − 36x1x2 + 27x2²)];

search domain: −2 < xj < 2, j = 1, 2; one global minimum: (x1, x2)* = (0, −1); GP((x1, x2)*) = 3.

B2 (B2) (two variables):

B2(x1, x2) = x1² + 2x2² − 0.3 cos(3πx1) − 0.4 cos(4πx2) + 0.7;

search domain: −100 < xj < 100, j = 1, 2; one global minimum: (x1, x2)* = (0, 0); B2((x1, x2)*) = 0.

Shubert (SH) (two variables):

SH(x1, x2) = (Σ_{j=1..5} j cos((j + 1)x1 + j)) (Σ_{j=1..5} j cos((j + 1)x2 + j));

search domain: −10 < xj < 10, j = 1, 2; 18 global minima: SH((x1, x2)*) = −186.7309.

Rosenbrock (Rn) (n variables):

Rn(x) = Σ_{j=1..n−1} [100 (xj² − xj+1)² + (xj − 1)²];

three functions were studied: R2, R5, and R10; search domain: −5 < xj < 10, j = 1, ..., n; one global minimum: x* = (1, ..., 1); Rn(x*) = 0.

Zakharov (Zn) (n variables):

Zn(x) = Σ_{j=1..n} xj² + (Σ_{j=1..n} 0.5 j xj)² + (Σ_{j=1..n} 0.5 j xj)⁴;

three functions were studied: Z2, Z5, and Z10; search domain: −5 < xj < 10, j = 1, ..., n; one global minimum: x* = (0, ..., 0); Zn(x*) = 0.

De Joung (DJ) (three variables):

DJ(x1, x2, x3) = x1² + x2² + x3²;

search domain: −5.12 < xj < 5.12, j = 1, 2, 3; one global minimum: (x1, x2, x3)* = (0, 0, 0); DJ((x1, x2, x3)*) = 0.

Hartmann (H3,4) (three variables):

H3,4(x) = −Σ_{i=1..4} ci exp(−Σ_{j=1..3} aij (xj − pij)²);

search domain: 0 < xj < 1, j = 1, 2, 3; one global minimum: x* = (0.11, 0.555, 0.855); H3,4(x*) = −3.86343.

TABLE A1. COEFFICIENTS OF THE H3,4 FUNCTION

| i | ai1 | ai2 | ai3 | ci | pi1 | pi2 | pi3 |
|---|-----|-----|-----|----|-----|-----|-----|
| 1 | 3.0 | 10.0 | 30.0 | 1.0 | 0.3689 | 0.1170 | 0.2673 |
| 2 | 0.1 | 10.0 | 35.0 | 1.2 | 0.4699 | 0.4387 | 0.7470 |
| 3 | 3.0 | 10.0 | 30.0 | 3.0 | 0.1091 | 0.8732 | 0.5547 |
| 4 | 0.1 | 10.0 | 35.0 | 3.2 | 0.0381 | 0.5743 | 0.8827 |

Shekel (S4,n) (four variables):

S4,n(x) = −Σ_{i=1..n} [(x − ai)ᵀ(x − ai) + ci]⁻¹;

x = (x1, x2, x3, x4)ᵀ; ai = (ai1, ai2, ai3, ai4)ᵀ; three functions S4,5, S4,7, and S4,10 were studied; search domain: 0 < xj < 10, j = 1, ..., 4;
S4,5: one global minimum, S4,5(x*) = −10.1532;
S4,7: one global minimum, S4,7(x*) = −10.40294;
S4,10: one global minimum, S4,10(x*) = −10.53641.

TABLE A2. COEFFICIENTS OF THE S4,n FUNCTION

| i | aiᵀ | ci |
|---|-----|----|
| 1 | 4.0 4.0 4.0 4.0 | 0.1 |
| 2 | 1.0 1.0 1.0 1.0 | 0.2 |
| 3 | 8.0 8.0 8.0 8.0 | 0.2 |
| 4 | 6.0 6.0 6.0 6.0 | 0.4 |
| 5 | 3.0 7.0 3.0 7.0 | 0.4 |
| 6 | 2.0 9.0 2.0 9.0 | 0.6 |
| 7 | 5.0 5.0 3.0 3.0 | 0.3 |
| 8 | 8.0 1.0 8.0 1.0 | 0.7 |
| 9 | 6.0 2.0 6.0 2.0 | 0.5 |
| 10 | 7.0 3.6 7.0 3.6 | 0.5 |

Hartmann (H6,4) (six variables):

H6,4(x) = −Σ_{i=1..4} ci exp(−Σ_{j=1..6} aij (xj − pij)²);

search domain: 0 < xj < 1, j = 1, ..., 6; one global minimum: x* = (0.20169, 0.150011, 0.47687, 0.275332, 0.311652, 0.6573); H6,4(x*) = −3.32237.

TABLE A3. COEFFICIENTS OF THE H6,4 FUNCTION

| i | ai1 | ai2 | ai3 | ai4 | ai5 | ai6 | ci | pi1 | pi2 | pi3 | pi4 | pi5 | pi6 |
|---|-----|-----|-----|-----|-----|-----|----|-----|-----|-----|-----|-----|-----|
| 1 | 10.0 | 3.0 | 17.0 | 3.5 | 1.7 | 8.0 | 1.0 | 0.1312 | 0.1696 | 0.5569 | 0.0124 | 0.8283 | 0.5886 |
| 2 | 0.05 | 10.0 | 17.0 | 0.1 | 8.0 | 14.0 | 1.2 | 0.2329 | 0.4135 | 0.8307 | 0.3736 | 0.1004 | 0.9991 |
| 3 | 3.0 | 3.5 | 1.7 | 10.0 | 17.0 | 8.0 | 3.0 | 0.2348 | 0.1451 | 0.3522 | 0.2883 | 0.3047 | 0.6650 |
| 4 | 17.0 | 8.0 | 0.05 | 10.0 | 0.1 | 14.0 | 3.2 | 0.4047 | 0.8828 | 0.8732 | 0.5743 | 0.1091 | 0.0381 |
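As a sanity check on the benchmark definitions, a few of them can be coded directly. This is the present editor's sketch of the standard formulas, not code from the study.

```python
import math

def branin(x1, x2):
    """Branin RCOC; three global minima with value 0.397887."""
    return (x2 - 5.1 * x1**2 / (4 * math.pi**2) + 5 * x1 / math.pi - 6) ** 2 \
           + 10 * (1 - 1 / (8 * math.pi)) * math.cos(x1) + 10

def rosenbrock(x):
    """R_n; global minimum 0 at (1, ..., 1)."""
    return sum(100 * (x[j]**2 - x[j + 1])**2 + (x[j] - 1)**2
               for j in range(len(x) - 1))

def zakharov(x):
    """Z_n; global minimum 0 at the origin."""
    s1 = sum(xj**2 for xj in x)
    s2 = sum(0.5 * (j + 1) * xj for j, xj in enumerate(x))
    return s1 + s2**2 + s2**4
```

Evaluating each function at its listed minimizer reproduces the stated optimal value, e.g. `branin(math.pi, 2.275)` returns approximately 0.397887.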
REFERENCES

[1] J. H. Holland, Adaptation in Natural and Artificial Systems, The University of Michigan Press, 1975.
[2] D. E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning, Addison-Wesley, New York, 1989.
[3] D. E. Goldberg, Real-coded genetic algorithms, virtual alphabets, and blocking, Complex Systems 5 (1991) 139–167.
[4] G. Berthiau, P. Siarry, A genetic algorithm for globally minimizing
functions of several continuous variables, in: Second InternationalConference on Meta-heuristics, Sophia-Antipolis, France, July 1997.
[5] R. Chelouah, P. Siarry, A continuous genetic algorithm designed forthe global optimization of multimodal functions, Journal ofHeuristics 6 (2000) 191�213.
[6] Z. Michalewicz, Genetic Algorithms + Data Structures = EvolutionPrograms, Springer-Verlag, Heidelberg, 1996.
[7] H. Mühlenbein, Evolution in time and space: the parallel genetic algorithm, in: G. Rawlins (Ed.), Foundations of Genetic Algorithms, Morgan Kaufmann, 1991, pp. 316–337.
[8] H. Mühlenbein, M. Schomisch, and J. Born, The parallel genetic algorithm as function optimizer, in: Proceedings of the Fourth International Conference on Genetic Algorithms, San Diego, CA, USA, 1991, pp. 271–278.
[9] H. Mühlenbein, D. Schlierkamp-Voosen, Analysis of selection,mutation and recombination in genetic algorithms, Technical Report93/94, GMD, 1993.
[10] J.A. Nelder, R. Mead, A simplex method for function minimization,The Computer Journal 7 (1965) 308�313.
[11] W.H. Press, B.P. Flannery, S.A. Teukolsky, W.T. Vetterling, Numerical Recipes in C: The Art of Scientific Computing, Cambridge University Press, New York, 1988.
[12] N. Durand, J.M. Alliot, A combined Nelder–Mead simplex and genetic algorithm, in: W. Banzhaf, J. Daida, A.E. Eiben, M.H. Garzon, V. Honavar, M. Jakiela, R.E. Smith (Eds.), Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 1999), Morgan Kaufmann, Orlando, FL, USA, 1999, pp. 1–7.
[13] J.A. Joines, M.G. Kay, R.E. King, A hybrid-genetic algorithm formanufacturing cell design. Technical Report NCSU-IE, Departmentof Industrial Engineering, North Carolina State University, Box 7906Raleigh, NC 27695-7906, February 1997.
[14] J.M. Renders, S.P. Flasse, Hybrid method using genetic algorithmsfor the global optimization, IEEE Transactions on Systems, Man, andCybernetics 26 (2) (1996) 243�258.
[15] R. Chelouah, P. Siarry, Genetic and Nelder�Mead algorithmshybridized for a more accurate global optimization of continuousmultiminima functions, European Journal of Operational Research148 (2003) 335�348.
[16] R.C. Eberhart, J. Kennedy, A new optimizer using particle swarmtheory, in: Proceedings of the Sixth International Symposium onMicro Machine and Human Science, Nagoya, Japan, 1995, pp. 39�43.
[17] J. Kennedy, R.C. Eberhart, Particle swarm optimization, in:Proceedings of the IEEE International Conference on NeuralNetworks, Piscataway, NJ, USA, 1995, pp. 1942�1948.
[18] S. Smith, The simplex method and evolutionary algorithms, in: Proceedings of the IEEE International Conference on Evolutionary Computation, Anchorage, AK, USA, 1998, pp. 799–804.
[19] R. Hooke, T.A. Jeeves, Direct search solution of numerical and statistical problems, Journal of the Association for Computing Machinery 8 (1961) 212–221.
[20] A. Banks, J. Vincent, and C. Anyakoha, A review of particle swarmoptimization. Part II: hybridisation, combinatorial, multicriteria andconstrained optimization, and indicative applications, NaturalComputing 7 (2008) 109�124.
[21] Y. Kao, E. Zahara, A hybrid genetic algorithm and particle swarmoptimization for multimodal functions, Applied Soft Computing 8(2008) 849�857.
[22] W.F. Abd-El-Wahed, A.A. Mousa, and M.A. El-Shorbagy, Integrating particle swarm optimization with genetic algorithms for solving nonlinear optimization problems, Journal of Computational and Applied Mathematics 235 (2011) 1446–1453.
[23] S.K. Fan, Y.C. Liang, and E. Zahara, A genetic algorithm and aparticle swarm optimizer hybridized with Nelder�Mead simplexsearch, Computers and Industrial Engineering 50 (2006) 401�425.
[24] S.K. Fan, E. Zahara, Hybrid simplex search and particle swarmoptimization for unconstrained optimization problems, EuropeanJournal of Operational Research 181 (2007) 527�548.
[25] L.J. Eshelman, J.D. Shaffer, Real-coded genetic algorithms andinterval schemata, Foundations of Genetic Algorithms 2, L. DarrellWhitley (Ed.), Morgan Kaufmann, San Mateo, 1993, pp.187-202.