

An Expandable Genetic Cell System for Solving Global Optimization Problem on Continuous Multimodal Functions

Ting-Hua Chang
Department of Information Management
Ling-Tung University
Taichung, Taiwan
e-mail: [email protected]

Abstract—This study presents a simple, fast, accurate, and expandable algorithm with very few parameters for solving the global optimization problem of continuous multimodal functions. A calculation unit called a cell, based on the Genetic Algorithm and Particle Swarm, is designed. The cell consists of only three chromosomes, of which two apply the crossover operation while the third performs a Particle Swarm search as the mutation operation. Characteristics of this new method are compared with other hybrid methods. Experimental results on eight benchmark functions show that the proposed calculation cell can find the optimal solution in fewer function calls than the published GA-PSO hybrid method. Results of multi-cell experiments are presented, and the possibility of incorporating many cells in a large search space is discussed.

Keywords-global optimization; genetic algorithm; particle swarm optimization; G3A; cellular automata

I. INTRODUCTION

Genetic Algorithms (GAs) are general-purpose search algorithms that use principles inspired by natural genetic populations to evolve solutions to problems [1, 2]. Globally optimizing a continuous-variable function f in a given search domain consists in finding its global minima without being trapped in one of its local minima. Although fixed-length, binary-coded strings for representing solutions have dominated GA research, the global optimization problem of continuous multimodal functions is better handled by real-coded GAs [3]. The use of real parameters makes it possible to use large domains for the variables, which is difficult to achieve in binary implementations, where enlarging the domain would mean sacrificing precision, assuming a fixed chromosome length.

Adaptations of GAs to continuous optimization problems have been proposed in [4-9]. However, the slow convergence of GAs before finding an accurate solution is a well-known problem, closely related to their lack of exploitation of any local information. To overcome that drawback, hybrid methods have been proposed in the literature [12-15], combining a GA with a local search method: the role of the GA is to detect the most promising regions of the search space, within which the local optimization algorithm can reach the best solution quickly. One of the most common local search algorithms in hybrid methods is the Nelder-Mead (NM) simplex method [10, 11]. One reason for its popularity is that it does not need the derivatives of the function under exploration, an important feature in many applications where gradient information is not available. However, one has to be careful when using this method, since it is sensitive to the choice of initial points and is not guaranteed to attain the global optimum.

Another popular algorithm often used in diversified searching is Particle Swarm Optimization (PSO) [16, 17]. The original form of PSO has the drawbacks of flying over the optimal solution or being easily trapped in local extremes. Many delicate and complicated variations have been proposed to improve PSO [20]. Although the modified PSO does eventually locate the desired solution, the convergence rate of PSO [18] is also typically slower than those of local direct search techniques (e.g., the Hooke and Jeeves method [19] and the Nelder-Mead simplex search method), as PSO does not digest all the information obtained locally to determine the most promising search direction.

In general, hybrid methods [22, 23] may perform better than GA or PSO alone, but they have some weaknesses: (1) The architectures of hybrid methods are often complicated and require many internal parameters. Tuning those parameters for optimal performance is challenging; in fact, the optimal setting of the parameters is problem-dependent. (2) The criteria for when to start the local search are arbitrary, and the definition of a "promising" area is not rigorous. When applied to larger solution spaces, performance can be affected by the outcome of the search for the promising area. For example, it is well known that simplex-based methods are easily trapped in local optima.

A novel approach, denoted G3A, combining GA and PSO is presented in this study. Typically, researchers do not hybridize PSO with GA, since PSO and GA are both better suited to long-range searching, and because of their slow convergence

2012 Sixth International Conference on Genetic and Evolutionary Computing

978-0-7695-4763-3/12 $26.00 © 2012 IEEE

DOI 10.1109/ICGEC.2012.51


near the solution. However, with affordable extra computing effort, local convergence can still be achieved by carefully choosing the velocity terms in PSO. The advantage of hybridizing PSO and GA is that there is no need for a hard distinction between the global-search and local-search stages. The proposed G3A algorithm operates with at least three chromosomes; the two chromosomes with the better fitness values are used for the genetic-algorithm crossover, and the remaining chromosome undergoes swarm search. The author refers to a three-chromosome unit as a basic "cell". A cell is an irreducible, fundamental unit in a G3A calculation. A running system is expandable by operating many cells independently or interactively. Besides the high effectiveness of chromosome usage, one of the attractive features of G3A is that very few parameters are involved. In this paper the author tested single-cell and multi-cell runs, and the results are presented along with results from another method.

II. G3A ALGORITHM

Other authors have attempted such efforts to weld GA and PSO. Kao and Zahara [21] denoted their method GA-PSO; this hybrid technique incorporates concepts from GA and PSO and creates individuals in a new generation not only by the crossover and mutation operations found in GA but also by mechanisms of PSO. The results of various experimental studies using a suite of 17 multimodal test functions taken from the literature demonstrated the superiority of the hybrid GA-PSO approach over some other search techniques in terms of solution quality and convergence rate. However, their method is quite complicated, and for some functions many evaluations are needed to achieve the desired accuracy.

In the proposed G3A algorithm, among the three

chromosomes, the two with the better fitness values undergo crossover using the BLX-a crossover operator [24]: each offspring gene Xi is a random value chosen from the interval [Gmin - I*a, Gmax + I*a], where Gmin = min(Gi1, Gi2) and Gmax = max(Gi1, Gi2) are the minimum and maximum of the i-th genes of the two parents, I = Gmax - Gmin, i is the gene identifier, and a = 0.5. Three children are generated by the parent pair via crossover, and the better two are kept for the next generation. The other chromosome, not participating in crossover, undergoes the swarm algorithm to update its "position vector" X, i.e., the values of its genes, using the following formulas:

V(t+1) = (0.5 + r0/2) V(t) + C1 r1 (Pcell - X(t)) + C2 r2 (Pglobal - X(t))    (1)

X(t+1) = X(t) + V(t+1)    (2)

where r0, r1, and r2 are random numbers drawn from Uniform[0, 1]. For each function, the value of X is confined to the search domain of that function, while V is unconstrained. Note that Pcell in the second term represents the best position found by this cell; its value is the same as Pglobal when only one cell is used. C1 and C2 are constants known as the Personal Factor and the Social Factor, respectively; their optimal values are problem-dependent. From the GA's point of view, the swarm operation in G3A is an effective substitute for the mutation operation: updating the position vector toward the global best position is in fact a guided mutation toward the optimal solution. After the crossover and swarm operations, the three chromosomes are compared by fitness value, and the cycle repeats until the termination criteria are satisfied; see Figure 1.
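As a concrete illustration, one G3A cell cycle (BLX-0.5 crossover on the best two chromosomes, swarm update of the third per equations (1) and (2)) can be sketched as follows. This is a minimal sketch, not the author's implementation: the sphere objective, the bounds, the clamping of offspring to the domain, and the iteration budget are assumptions made for illustration.

```python
import random

ALPHA = 0.5        # BLX-alpha parameter (alpha = 0.5 in the text)
C1, C2 = 2.0, 0.0  # Personal/Social Factors; (2.0, 0) is the independent setting

def sphere(x):
    # Assumed test objective (global minimum 0 at the origin);
    # not one of the paper's eight benchmarks.
    return sum(xi * xi for xi in x)

def clamp(x, lo, hi):
    # X is confined to the search domain (assumed here to apply to offspring too).
    return [min(max(xi, l), h) for xi, l, h in zip(x, lo, hi)]

def blx_crossover(p1, p2):
    # BLX-0.5: each offspring gene is uniform in [Gmin - I*alpha, Gmax + I*alpha].
    child = []
    for g1, g2 in zip(p1, p2):
        gmin, gmax = min(g1, g2), max(g1, g2)
        i_len = gmax - gmin
        child.append(random.uniform(gmin - i_len * ALPHA, gmax + i_len * ALPHA))
    return child

def pso_update(x, v, p_cell, p_global):
    # Equations (1) and (2); r0, r1, r2 drawn once per update, V unconstrained.
    r0, r1, r2 = random.random(), random.random(), random.random()
    v = [(0.5 + r0 / 2) * vi + C1 * r1 * (pc - xi) + C2 * r2 * (pg - xi)
         for xi, vi, pc, pg in zip(x, v, p_cell, p_global)]
    return [xi + vi for xi, vi in zip(x, v)], v

def run_cell(f, lo, hi, max_iters=2000, tol=1e-4):
    chroms = [[random.uniform(l, h) for l, h in zip(lo, hi)] for _ in range(3)]
    vel = [0.0] * len(lo)
    best = list(min(chroms, key=f))
    for _ in range(max_iters):
        chroms.sort(key=f)                        # best two become the parents
        kids = sorted((clamp(blx_crossover(chroms[0], chroms[1]), lo, hi)
                       for _ in range(3)), key=f)
        chroms[0], chroms[1] = kids[0], kids[1]   # keep the better 2 of 3 children
        # With a single cell, Pcell coincides with Pglobal.
        x, vel = pso_update(chroms[2], vel, best, best)
        chroms[2] = clamp(x, lo, hi)
        cand = min(chroms, key=f)
        if f(cand) < f(best):
            best = list(cand)
        if f(best) < tol:
            break
    return best, f(best)
```

Seeding `random` makes a run reproducible; no claim is made that this toy reproduces the evaluation counts reported in Table I.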

III. EXPERIMENTAL RESULTS

Eight benchmark functions were used to test the performance of the G3A cells; their definitions can be found in [15, 21]. To compare directly with the results from GA-PSO [21], the termination criterion was set to |fobj - fanal| < error, with the error values taken from [21]. The experiment was conducted with 1-, 2-, and 3-cell settings for each function and two sets of (C1, C2) values: (2.0, 0) and (1.0, 1.0). In the former case each cell runs independently (C2 = 0), while in the latter (interactive, C2 != 0) case the cells share the information of the global best position. Each function was tested 50 times, and a run was considered successful if it fulfilled the termination criterion within 500,000 loops. The average(1) number of calls to the fitness function over the 50 runs and the success rate are compared in Table I. Results from the interactive case are quoted in parentheses. Note that the average numbers of function evaluations shown are obtained from the successful runs only, and that the success rates for GA-PSO [21] are all 100 %.

Figure 1. The G3A algorithm for one cell.

1 In this study the median number is used instead because the distribution is asymmetric.

Initialization: create 3 chromosomes.
REPEAT
  Evaluation: evaluate the fitness of each chromosome.
  GA: apply BLX-0.5 crossover to the best two chromosomes; select 2 from 3 children.
  PSO: update the other chromosome by the PSO algorithm.
UNTIL stopping criteria are reached.
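The pseudocode above describes one cell; when several cells run interactively (C2 != 0), the only coupling between them is the shared global best position Pglobal. The toy sketch below shows only that coordination pattern. It is a hedged illustration, not the paper's method: the cell internals are replaced by a stand-in local step, and all names and the sphere objective are assumptions.

```python
import random

def make_cell(lo, hi):
    # A cell reduced to its current position; the real GA/PSO cycle is
    # replaced by the stand-in step in cell_step below.
    return {"x": [random.uniform(l, h) for l, h in zip(lo, hi)]}

def cell_step(cell, f, p_global):
    # Stand-in for one cell cycle: optional drift toward the shared best
    # (the social term), plus a small random perturbation; keep improvements.
    x = cell["x"]
    if p_global is not None:                      # interactive: read Pglobal
        x = [xi + random.random() * (pg - xi) for xi, pg in zip(x, p_global)]
    x = [xi + random.gauss(0.0, 0.1) for xi in x]
    if f(x) < f(cell["x"]):
        cell["x"] = x
    return cell["x"]

def run_cells(f, lo, hi, n_cells=3, interactive=True, iters=500):
    cells = [make_cell(lo, hi) for _ in range(n_cells)]
    p_global = min((c["x"] for c in cells), key=f)
    for _ in range(iters):
        for c in cells:
            x = cell_step(c, f, p_global if interactive else None)
            if f(x) < f(p_global):
                p_global = list(x)   # interactive cells see this immediately
    return p_global

def sphere(x):
    # assumed toy objective, minimum 0 at the origin
    return sum(xi * xi for xi in x)
```

Setting `interactive=False` reproduces the independent (C2 = 0) arrangement, in which each cell searches on its own and Pglobal is only used to report the final answer.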


IV. DISCUSSION

A few remarks on these results are given below:

(1) In the (2.0, 0) setting, for most functions the rate of success approximately obeys the probability law (rate of success for n cells) = 1 - (rate of failure for 1 cell)^n, where n is the number of cells, indicating that each cell indeed finds the solution independently. This implies that the optimal solution can eventually be found by increasing the number of cells, although in the current study some functions did not reach a 100 % success rate with only 3 cells.

(2) For functions R2, R5, H6,4, and R10, the 2-cell and 3-cell interactive (C2 = 1.0) settings found the optimal solution in fewer evaluations than the 1-cell setting. These observations suggest that the cells work cooperatively by sharing the global best position with each other. The performance enhancement is quite dramatic, indicating that the G3A algorithm may have good potential for larger-dimensional problems.

(3) The values of the personal factor C1 and the social factor C2 in PSO have a significant impact on the success rate and the number of function calls. Regarding the number of function evaluations, the (C1 = 1, C2 = 1) setting performed better in general. However, that setting did poorly in the success rate on the H6,4 function; apparently, for this function the cell needs a larger velocity boost (C1 = 2) to escape the local-minima traps. For some functions the interactive success rates are lower than in the independent-run case. A better algorithm to coordinate the searching among cells would be an interesting topic for future research.

V. CONCLUSION AND FUTURE PROSPECTS

A different approach, denoted G3A, combining GA and PSO was presented in this study. For most of the test functions investigated, the proposed algorithm performs better than the published GA-PSO method in three aspects: (1) fewer function evaluations, (2) smaller population size, and (3) fewer internal parameters. A G3A cell can be treated as a cellular automaton for finding the global optimal solution of continuous multimodal functions, and the system can be run with an arbitrary number of cells. In future development, a coordinated many-cell system seems promising for large-scale searching; so far the cells exhibit only moderate success rates on some benchmark functions, and a controller node and algorithm might help coordinate the calculations effectively. Next, an integer version would have wider potential in real-world applications. The swarm part could also be replaced by some other mutation method, e.g., a controlled mutation from the best chromosome. Finally, designing multi-objective cells following the G3A methodology is of great interest.

REFERENCES

[1] J. H. Holland, Adaptation in Natural and Artificial Systems, The University of Michigan Press, 1975.
[2] D. E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning, Addison-Wesley, New York, 1989.
[3] D. E. Goldberg, Real-coded genetic algorithms, virtual alphabets, and blocking, Complex Systems 5 (1991) 139-167.
[4] G. Berthiau, P. Siarry, A genetic algorithm for globally minimizing functions of several continuous variables, in: Second International Conference on Meta-heuristics, Sophia-Antipolis, France, July 1997.
[5] R. Chelouah, P. Siarry, A continuous genetic algorithm designed for the global optimization of multimodal functions, Journal of Heuristics 6 (2000) 191-213.
[6] Z. Michalewicz, Genetic Algorithms + Data Structures = Evolution Programs, Springer-Verlag, Heidelberg, 1996.
[7] H. Mühlenbein, Evolution in time and space - the parallel genetic algorithm, in: G. Rawlins (Ed.), Foundations of Genetic Algorithms, Morgan Kaufmann, 1991, pp. 316-337.
[8] H. Mühlenbein, M. Schomish, and J. Born, The parallel genetic algorithm as function optimizer, in: Proceedings of the Fourth International Conference on Genetic Algorithms, San Diego, CA, USA, 1991, pp. 271-278.
[9] H. Mühlenbein, D. Schlierkamp-Voosen, Analysis of selection, mutation and recombination in genetic algorithms, Technical Report 93/94, GMD, 1993.
[10] J. A. Nelder, R. Mead, A simplex method for function minimization, The Computer Journal 7 (1965) 308-313.
[11] W. H. Press, B. P. Flannery, S. A. Teukolsky, W. T. Vetterling, Numerical Recipes in C: The Art of Scientific Computing, Cambridge University Press, New York, 1988.
[12] N. Durand, J. M. Alliot, A combined Nelder-Mead simplex and genetic algorithm, in: W. Banzhaf, J. Daida, A. E. Eiben, M. H. Garzon, V. Honavar, M. Jakiela, R. E. Smith (Eds.), Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 1999), Morgan Kaufmann, Orlando, FL, USA, 1999, pp. 1-7.
[13] J. A. Joines, M. G. Kay, and R. E. King, A hybrid genetic algorithm for manufacturing cell design, Technical Report NCSU-IE, Department of Industrial Engineering, North Carolina State University, Box 7906, Raleigh, NC 27695-7906, February 1997.
[14] J. M. Renders, S. P. Flasse, Hybrid method using genetic algorithms for the global optimization, IEEE Transactions on Systems, Man, and Cybernetics 26 (2) (1996) 243-258.
[15] R. Chelouah, P. Siarry, Genetic and Nelder-Mead algorithms hybridized for a more accurate global optimization of continuous multiminima functions, European Journal of Operational Research 148 (2003) 335-348.
[16] R. C. Eberhart, J. Kennedy, A new optimizer using particle swarm theory, in: Proceedings of the Sixth International Symposium on Micro Machine and Human Science, Nagoya, Japan, 1995, pp. 39-43.
[17] J. Kennedy, R. C. Eberhart, Particle swarm optimization, in: Proceedings of the IEEE International Conference on Neural Networks, Piscataway, NJ, USA, 1995, pp. 1942-1948.
[18] S. Smith, The simplex method and evolutionary algorithms, in: Proceedings of the IEEE International Conference on Evolutionary Computation, Anchorage, AK, USA, 1998, pp. 799-804.
[19] R. Hooke, T. A. Jeeves, Direct search solution of numerical and statistical problems, Journal of the Association for Computing Machinery 8 (1961) 212-221.
[20] A. Banks, J. Vincent, and C. Anyakoha, A review of particle swarm optimization. Part II: hybridisation, combinatorial, multicriteria and constrained optimization, and indicative applications, Natural Computing 7 (2008) 109-124.
[21] Y. Kao, E. Zahara, A hybrid genetic algorithm and particle swarm optimization for multimodal functions, Applied Soft Computing 8 (2008) 849-857.
[22] S. K. Fan, Y. C. Liang, and E. Zahara, A genetic algorithm and a particle swarm optimizer hybridized with Nelder-Mead simplex search, Computers and Industrial Engineering 50 (2006) 401-425.
[23] S. K. Fan, E. Zahara, Hybrid simplex search and particle swarm optimization for unconstrained optimization problems, European Journal of Operational Research 181 (2007) 527-548.
[24] L. J. Eshelman, J. D. Schaffer, Real-coded genetic algorithms and interval schemata, in: L. Darrell Whitley (Ed.), Foundations of Genetic Algorithms 2, Morgan Kaufmann, San Mateo, 1993, pp. 187-202.

TABLE I. EXPERIMENTAL RESULTS FROM 8 TEST FUNCTIONS
(Interactive-setting results in parentheses; evaluation counts are from successful runs only.)

Function  Error    Success rate (%): 1 / 2 / 3 cells   Function evaluations: GA-PSO [21] / 1 / 2 / 3 cells
RC        0.00009  92(90) / 100(94) / 100(100)         8254 / 927(735) / 1620(930) / 1971(1134)
GP        0.00012  58(54) / 82(80) / 96(86)            25706 / 1323(765) / 2322(1272) / 2844(1755)
SH        0.00007  66(48) / 68(86) / 90(90)            96211 / 1863(1047) / 3390(2118) / 4653(3177)
R2        0.00064  100(100) / 100(100) / 100(100)      104894 / 17457(7731) / 16428(3288) / 19413(3105)
H3,4      0.0002   72(74) / 80(92) / 98(100)           2117 / 1161(801) / 2136(1068) / 2115(1107)
R5        0.00013  72(74) / 88(72) / 96(82)            1358064 / 1177884(701661) / 2202708(58680) / 3185658(46539)
H6,4      0.00024  44(38) / 72(60) / 80(70)            12568 / 2655(2493) / 4836(1344) / 6885(1791)
R10       0.00005  58(82) / 90(70) / 94(88)            5319160 / 3738201(2434185) / 7036302(163002) / 10068741(102078)
