Research Article
Gradient-Based Cuckoo Search for Global Optimization

Seif-Eddeen K. Fateen¹ and Adrián Bonilla-Petriciolet²

¹ Department of Chemical Engineering, Cairo University, Giza 12316, Egypt
² Department of Chemical Engineering, Aguascalientes Institute of Technology, 20256 Aguascalientes, AGS, Mexico

Correspondence should be addressed to Seif-Eddeen K. Fateen; [email protected]

Received 30 December 2013; Accepted 7 April 2014; Published 8 May 2014

Academic Editor: P. Karthigaikumar

Hindawi Publishing Corporation, Mathematical Problems in Engineering, Volume 2014, Article ID 493740, 12 pages. http://dx.doi.org/10.1155/2014/493740

Copyright © 2014 S.-E. K. Fateen and A. Bonilla-Petriciolet. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

One of the major advantages of stochastic global optimization methods is the lack of the need for the gradient of the objective function. However, in some cases this gradient is readily available and can be used to improve the numerical performance of stochastic optimization methods, especially the quality and precision of the global optimal solution. In this study, we proposed a gradient-based modification to the cuckoo search algorithm, which is a nature-inspired swarm-based stochastic global optimization method. We introduced the gradient-based cuckoo search (GBCS) and evaluated its performance vis-à-vis the original algorithm in solving twenty-four benchmark functions. The use of GBCS improved the reliability and effectiveness of the algorithm in all but four of the tested benchmark problems. GBCS proved to be a strong candidate for solving difficult optimization problems for which the gradient of the objective function is readily available.

1. Introduction

The use of stochastic global optimization methods has gained popularity in a wide variety of scientific and engineering applications, as those methods have some advantages over deterministic optimization methods [1].
Those advantages include the lack of the need for a good initial guess and the ability to handle multimodal and nonconvex objective functions without the assumptions of continuity and differentiability. In addition, one of the important advantages is the lack of the need for information about the gradient of the objective function. Gradient-free optimization methods can be either deterministic or stochastic, and their applications can be found in many disciplines [2]. In many applications, however, the gradient of the objective function is already available or easily obtainable. Yet this valuable piece of information is entirely ignored by traditional stochastic optimization methods. For functions whose gradient is available, the use of the gradient may improve the reliability and efficiency of the stochastic search algorithm. In particular, the quality and precision of solutions obtained with gradient-based optimization methods outperform those obtained with traditional stochastic optimization methods. A wide variety of engineering applications require solutions of high precision, a requirement that conventional stochastic optimization methods may fail to satisfy.

Until now, several stochastic methods have been proposed and investigated for challenging optimization problems with continuous variables; they include simulated annealing, genetic algorithms, differential evolution, particle swarm optimization, harmony search, and ant colony optimization. In general, these methods may show different numerical performances and, consequently, the search for more effective and reliable stochastic global optimization methods is currently an active area of research. In particular, cuckoo search (CS) [3] is a novel nature-inspired stochastic optimization method. This relatively new method is gaining popularity in finding the global minimum of diverse science and engineering application problems [4–8].
For example, it was recently used for the design of integrated power systems [5] and for solving reliability-redundancy allocation [6], phase equilibrium [7], and mobile-robot navigation [8] problems. CS was selected to test the concept of using the




gradient as a source of new information that guides cuckoos in their search.

Therefore, in this study, a simple modification was made to the CS algorithm to make use of the gradient information and enhance the reliability and efficiency of the algorithm. The aim of this work is to present a modification to the existing CS algorithm based on the gradient of the objective function and to evaluate its performance in comparison with the original algorithm. The remainder of this paper is organized as follows. Section 2 introduces the cuckoo search algorithm. Section 3 introduces the proposed modification and our new gradient-based cuckoo search (GBCS) algorithm. The numerical experiments performed to evaluate the modification are described in Section 4. The results of the numerical experiments are presented and discussed in Section 5. Section 6 summarizes the conclusions of the work.

2. Cuckoo Search (CS) Algorithm

CS is a nature-inspired stochastic global optimization method that was developed by Yang and Deb [3, 9]. Its concept comes from the brood parasitism behavior of the cuckoo bird. Specifically, brood parasitism is a reproductive strategy followed by cuckoos in which they lay their eggs in the nests of other birds, which are usually of other species. If these eggs are discovered by the host bird, it may abandon the nest completely or throw away the alien eggs. This natural phenomenon has led to the evolution of cuckoo eggs to mimic the egg appearance of local host birds. The following rules have been employed in the search algorithm to implement those concepts: (1) one egg is laid by each cuckoo in a random nest, and it represents a set of solution coordinates; (2) the best eggs (i.e., solutions) are contained in a fraction of the nests and will carry over to the next generation; and (3) the number of nests is fixed, and a host bird can find an alien egg with a specified probability $p_a \in [0, 1]$. If this condition occurs, the host bird can discard the egg or abandon the nest, and a new nest will be built elsewhere. For algorithm simplicity, this condition has been implemented in CS by assuming that a fraction $p_a$ of the $n$ nests is replaced by new nests. The pseudocode of CS is reported in Algorithm 1, and details of this metaheuristic are reported in [3]. Note that Lévy flights are used in CS for performing effectively both local and global searches in the solution domain. A Lévy flight is a random walk (i.e., a trajectory that consists of taking successive random steps), characterized by a sequence of sudden jumps chosen from a probability density function that has a power-law tail. In fact, the Lévy flight is considered an optimum random search pattern and has been useful in stochastic simulations of random natural phenomena, including applications in astronomy, physics, and biology. To generate a new egg in CS, a Lévy flight is performed using the coordinates of a randomly selected egg. This step can be represented by

$$x_i^{t+1} = x_i^{t} + \alpha \oplus \text{Lévy}(\lambda), \tag{1}$$

where $\oplus$ denotes entry-wise multiplication, $\alpha$ is the step size, and $\text{Lévy}(\lambda)$ is the Lévy distribution. The egg is displaced to this new position if its objective function value is found to be better than that of another randomly selected egg. The step size $\alpha$ controls the scale of the random search and depends on the scales of the optimization problem under analysis.

A fraction $(1 - p_a)$ of the nests, selected at random, is abandoned and replaced by new ones at new locations via local random walks. The local random walk can be written as

$$x_i^{t+1} = x_i^{t} + \alpha \left(x_j^{t} - x_k^{t}\right), \tag{2}$$

where $x_j^{t}$ and $x_k^{t}$ are two different solutions selected by random permutation and $\alpha$ is a random number drawn from a uniform distribution. An advantage of CS over genetic algorithms, particle swarm optimization, and other stochastic optimization methods is that there is only one parameter to be tuned, namely, the fraction of nests to be abandoned, $(1 - p_a)$. However, Yang and Deb [3, 9] found that the results obtained for a variety of optimization problems were not very dependent on the value of $p_a$ and suggested using $p_a = 0.25$.
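As a concrete illustration of the Lévy-flight move in (1), here is a minimal Python sketch that draws the heavy-tailed step with Mantegna's algorithm, a standard way of generating Lévy-stable steps; the exponent `beta = 1.5` and scale `alpha = 0.01` are illustrative choices, not values prescribed by the paper:

```python
import math

import numpy as np

def levy_step(dim, beta=1.5, rng=None):
    """Draw one Levy-distributed step per coordinate via Mantegna's algorithm."""
    if rng is None:
        rng = np.random.default_rng()
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
               / (math.gamma((1 + beta) / 2) * beta
                  * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma_u, dim)  # wider Gaussian in the numerator
    v = rng.normal(0.0, 1.0, dim)      # standard Gaussian in the denominator
    return u / np.abs(v) ** (1 / beta)

def levy_flight(x, alpha=0.01, rng=None):
    """Eq. (1): x^{t+1} = x^t + alpha (entry-wise) Levy(lambda)."""
    x = np.asarray(x, dtype=float)
    return x + alpha * levy_step(x.size, rng=rng)
```

An egg generated this way would then replace a randomly chosen nest only if it improves on that nest's objective value, as described above.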

3. Gradient-Based Cuckoo Search (GBCS) Algorithm

The purpose of this section is to introduce the simple modification of the original CS algorithm that incorporates information about the gradient of the objective function. Any modification to the algorithm should not change its stochastic nature, so as not to negatively affect its performance. A modification was made to the local random walk of (2), in which a fraction $(1 - p_a)$ of the nests is replaced. In the original algorithm, when new nests are generated from the replaced nests via a random step, the magnitude and the direction of the step are both random. In the modified algorithm, the randomness of the magnitude of the step is preserved. However, the direction is determined based on the sign of the gradient of the function: if the gradient is negative, the step direction is made positive; if the gradient is positive, the step direction is made negative. Thus, new nests are generated randomly from the worse nests but in the direction of the minimum as seen from the point of view of the old nests. Thus, (2) is replaced by

$$\text{step}_i = \alpha \left(x_j^{t} - x_k^{t}\right),$$
$$x_i^{t+1} = x_i^{t} + \text{step}_i \otimes \text{sign}\left(-\frac{\text{step}_i}{df_i}\right), \tag{3}$$

where the sign function obtains the sign of its argument and $df_i$ is the gradient of the objective function with respect to each variable, that is, $\partial f / \partial x_i$.

This simple modification does not change the structure of the CS algorithm but makes important use of the available information about the gradient of the objective function. No additional parameter is needed to implement this change.
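The modified walk in (3) is easy to express in vectorized form. The sketch below is our reading of the rule; the array shapes, the uniform draw for $\alpha$, and the zero-gradient guard are implementation choices, not part of the paper:

```python
import numpy as np

def gbcs_local_walk(nests, grads, p_a=0.25, rng=None):
    """Eq. (3): keep the random step magnitude of Eq. (2), but flip each
    component so the move points downhill according to the gradient sign."""
    if rng is None:
        rng = np.random.default_rng()
    n, dim = nests.shape
    # step_i = alpha (x_j - x_k), with alpha uniform in [0, 1)
    alpha = rng.random((n, 1))
    step = alpha * (nests[rng.permutation(n)] - nests[rng.permutation(n)])
    # Guard against division by an exactly zero gradient component
    df = np.where(grads == 0.0, 1e-30, grads)
    direction = np.sign(-step / df)
    # Only a fraction (1 - p_a) of the nests is replaced
    replace = rng.random((n, 1)) < (1 - p_a)
    return np.where(replace, nests + step * direction, nests)
```

With a positive gradient in every component, every replaced nest moves in the non-increasing direction of each coordinate, which is exactly the bias the modification intends.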

4. Numerical Experiments

Twenty-four classical benchmark functions were used to evaluate the performance of GBCS as compared to the


begin
    Objective function f(x)
    Generate initial population of n host nests x_i
    while (t < MaxGeneration)
        Get a cuckoo randomly by Lévy flights
        Evaluate its quality/fitness F_i
        Choose a nest among n (say, j) randomly
        if (F_i > F_j)
            Replace j by the new solution
        end
        A fraction (1 − p_a) of nests are abandoned at random and new ones are built via random walk
        Keep the best solutions
        Rank the solutions and find the current best
    end while
    Postprocess results
end

Algorithm 1: Simplified algorithm of the cuckoo search algorithm.
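For readers who want to experiment, Algorithm 1 can be condensed into a short runnable sketch (minimization). A standard Cauchy draw stands in for the Lévy flight here, and all parameter values, bounds, and the random-seed handling are illustrative choices of ours, not the paper's settings:

```python
import numpy as np

def cuckoo_search(f, dim, n=15, p_a=0.25, alpha=0.1, max_gen=2000,
                  lb=-5.0, ub=5.0, seed=0):
    """Bare-bones cuckoo search loop following Algorithm 1."""
    rng = np.random.default_rng(seed)
    nests = rng.uniform(lb, ub, (n, dim))
    fit = np.array([f(x) for x in nests])
    for _ in range(max_gen):
        # Get a cuckoo randomly by a heavy-tailed flight and compare it
        # against a randomly chosen nest
        i, j = rng.integers(n, size=2)
        cand = np.clip(nests[i] + alpha * rng.standard_cauchy(dim), lb, ub)
        fc = f(cand)
        if fc < fit[j]:
            nests[j], fit[j] = cand, fc
        # Abandon a fraction (1 - p_a) of nests at random, keep the best,
        # and rebuild the abandoned ones via local random walks
        abandon = rng.random(n) < (1 - p_a)
        abandon[np.argmin(fit)] = False
        idx = np.flatnonzero(abandon)
        if idx.size:
            r = rng.random((idx.size, 1))
            walk = r * (nests[rng.permutation(n)[:idx.size]]
                        - nests[rng.permutation(n)[:idx.size]])
            nests[idx] = np.clip(nests[idx] + walk, lb, ub)
            fit[idx] = np.array([f(x) for x in nests[idx]])
    best = np.argmin(fit)
    return nests[best], fit[best]
```

On a smooth unimodal objective such as the sphere function, this sketch drives the best nest steadily toward the minimum because the best solution is never abandoned.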

original CS. These problems were chosen from among a list of forty-one functions. All forty-one functions were first screened by performing five optimization runs with both CS and GBCS. Functions for which both CS and GBCS performed extremely well, with no significant difference in the results, were deemed unsuitable for comparison and were excluded from the evaluation. Since CS is already a high-performance global optimizer, the excluded functions were not suitable for showing the differences between the two algorithms. Table 1 lists the twenty-four benchmark functions used for the evaluation of the two algorithms, along with their derivatives and search domains. The number of variables, the number of iterations used, and the value at the global minimum for each problem are shown with the results in Table 2.

Note that the benchmark functions used include three stochastic test functions. Most deterministic algorithms, such as the Nelder-Mead downhill simplex method, would fail on those stochastic functions [9]. However, those stochastic functions were included in the present evaluation to ensure that the modification of the stochastic method not only preserves its stochastic nature and its ability to solve stochastic functions but also improves the performance of the minimization algorithm. The stochastic functions were developed by turning three deterministic functions (Cube, Rosenbrock, and Griewank) into stochastic test functions by adding a stochastic vector $\varepsilon$ drawn from a uniform distribution in $[0, 1]$ [9].
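The construction is straightforward to reproduce. For instance, a stochastic Rosenbrock along the lines described would look like the sketch below (our naming; the $\varepsilon$ vector is redrawn on every call, which is what makes the objective noisy):

```python
import numpy as np

def stochastic_rosenbrock(x, rng=None):
    """Rosenbrock with each 100 (x_{i+1} - x_i^2)^2 term scaled by a fresh
    uniform epsilon_i in [0, 1], turning the benchmark into a noisy one."""
    if rng is None:
        rng = np.random.default_rng()
    x = np.asarray(x, dtype=float)
    eps = rng.random(x.size - 1)
    return float(np.sum(100.0 * eps * (x[1:] - x[:-1] ** 2) ** 2
                        + (x[:-1] - 1.0) ** 2))
```

Note that the global minimizer (all coordinates equal to one) is unaffected by the noise, so the function value there is zero on every draw.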

The twenty-four problems constitute a comprehensive test of the reliability and effectiveness of the suggested modification to the original CS. Eight functions have two variables only, and their surface plots are shown in Figure 1.

Each test function was run thirty times in MATLAB on an HP Pavilion dv6 laptop with a Core i7 processor. The function value was recorded at every iteration, and the average and standard deviation over the thirty runs were calculated at every iteration. The maximum number of iterations was 1000 for functions with 2 variables and 5000 for functions with 50 variables. The population size for the cuckoo swarm was 100. The only parameter used in the original CS, $p_a$, was kept constant at a value of 0.25. The MATLAB code for all the functions and their derivatives is available online at http://dx.doi.org/10.1155/2014/493740 as supplementary material to this paper.

5. Results and Discussion

As stated, each of the numerical experiments was repeated 30 times with different random seeds for GBCS and for the original CS algorithm. The value at each iteration for each trial was recorded. The mean and the standard deviation of the function values were calculated at each iteration. The progress of the mean values is presented in Figures 2–4 for each function, and a brief discussion of those results follows. The total CPU time for running the two algorithms 30 times on each problem varied depending on the problem's number of variables and the number of iterations used. CPU time ranged from 38.27 min for two-variable problems to 91.03 min for fifty-variable problems.

The Ackley function has one minimum only, which was obtained using the two methods as shown in Figure 2(a). However, GBCS is clearly more effective than CS in reaching the minimum in a smaller number of iterations. The improvement in performance was also clear with the Beale function (Figure 2(b)). Beale has one minimum only, which was obtained satisfactorily by the two methods. This pattern of behavior was also observed for the Booth function: GBCS was significantly more effective than CS, as it reached the global optimum within a tolerance of $10^{-32}$ in less than half the number of iterations of CS, as shown in Figure 2(c). The first three functions are relatively easy to optimize, and the global optima were easily obtained.

The cross-leg table function is a difficult one to minimize; its value at the global minimum is −1. Neither algorithm was able to reach the global minimum. Yet,


Table 1: Test functions used for testing the performance of CS and GBCS, with their search domains. GBCS additionally uses the analytical derivative $\partial f / \partial x_k$ of each function.

| No. | Name | Function | Search domain |
|----|------|----------|---------------|
| 1 | Ackley | $f_1 = 20\left(1 - e^{-0.2\sqrt{0.5(x_1^2 + x_2^2)}}\right) - e^{0.5(\cos 2\pi x_1 + \cos 2\pi x_2)} + e$ | $[-35, 35]$ |
| 2 | Beale | $f_2 = (1.5 - x_1 + x_1 x_2)^2 + (2.25 - x_1 + x_1 x_2^2)^2 + (2.625 - x_1 + x_1 x_2^3)^2$ | $[-4.5, 4.5]$ |
| 3 | Booth | $f_3 = (x_1 + 2x_2 - 7)^2 + (2x_1 + x_2 - 5)^2$ | $[-10, 10]$ |
| 4 | Cross-leg table | $f_4 = -\left[\left|\sin(x_1)\sin(x_2)\, e^{\left|100 - \sqrt{x_1^2 + x_2^2}/\pi\right|}\right| + 1\right]^{-0.1}$ | $[-10, 10]$ |
| 5 | Himmelblau | $f_5 = (x_1^2 + x_2 - 11)^2 + (x_2^2 + x_1 - 7)^2$ | $[-5, 5]$ |
| 6 | Levy 13 | $f_6 = \sin^2(3\pi x_1) + (x_1 - 1)^2\left[1 + \sin^2(3\pi x_2)\right] + (x_2 - 1)^2\left[1 + \sin^2(2\pi x_2)\right]$ | $[-10, 10]$ |
| 7 | Matyas | $f_7 = 0.26(x_1^2 + x_2^2) - 0.48 x_1 x_2$ | $[-10, 10]$ |
| 8 | Schaffer | $f_8 = 0.5 + \dfrac{\sin^2\sqrt{x_1^2 + x_2^2} - 0.5}{\left[0.001(x_1^2 + x_2^2) + 1\right]^2}$ | $[-100, 100]$ |
| 9 | Powell | $f_9 = (x_1 + 10x_2)^2 + 5(x_3 - x_4)^2 + (x_2 - 2x_3)^4 + 10(x_1 - x_4)^4$ | $[-1000, 1000]$ |
| 10 | Power sum | $f_{10} = \sum_{i=1}^{m}\left[\left(\sum_{j=1}^{m} x_j^i\right) - b_i\right]^2$, $b = [8, 18, 42, 114]$ | $[0, 4]$ |
| 11 | Shekel 5 | $f_{11} = -\sum_{i=1}^{m}\left[c_i + \sum_{j=1}^{n}(x_j - a_{ij})^2\right]^{-1}$, $m = 10$ | $[0, 10]$ |
| 12 | Wood | $f_{12} = 100(x_1^2 - x_2)^2 + (x_1 - 1)^2 + (x_3 - 1)^2 + 90(x_3^2 - x_4)^2 + 10.1\left[(x_2 - 1)^2 + (x_4 - 1)^2\right] + 19.8(x_2 - 1)(x_4 - 1)$ | $[-1000, 1000]$ |
| 13 | Cube | $f_{13} = \sum_{i=1}^{m-1} 100(x_{i+1} - x_i^3)^2 + (1 - x_i)^2$ | $[-100, 100]$ |
| 14 | Stochastic Cube | $f_{14} = \sum_{i=1}^{m-1} 100\,\varepsilon_i (x_{i+1} - x_i^3)^2 + (1 - x_i)^2$ | $[-100, 100]$ |
| 15 | Sphere | $f_{15} = \sum_{i=1}^{m} x_i^2$ | $[-100, 100]$ |
| 16 | Hartmann | $f_{16} = -\sum_{i=1}^{4} \alpha_i \exp\left(-\sum_{j=1}^{6} A_{ij}(x_j - P_{ij})^2\right)$ | $[0, 1]$ |
| 17 | Dixon-Price | $f_{17} = (x_1 - 1)^2 + \sum_{i=2}^{m} i\left(2x_i^2 - x_{i-1}\right)^2$ | $[-10, 10]$ |
| 18 | Griewank | $f_{18} = \dfrac{1}{4000}\sum_{i=1}^{m}(x_i - 100)^2 - \prod_{i=1}^{m}\cos\left(\dfrac{x_i - 100}{\sqrt{i}}\right) + 1$ | $[-600, 600]$ |
| 19 | Stochastic Griewank | $f_{19} = \dfrac{1}{4000}\sum_{i=1}^{m}\varepsilon_i(x_i - 100)^2 - \prod_{i=1}^{m}\cos\left(\dfrac{x_i - 100}{\sqrt{i}}\right) + 1$ | $[-600, 600]$ |
| 20 | Michalewicz | $f_{20} = -\sum_{i=1}^{m}\sin(x_i)\sin^{20}\left(\dfrac{i x_i^2}{\pi}\right)$ | $[0, \pi]$ |
| 21 | Rosenbrock | $f_{21} = \sum_{i=1}^{m-1} 100(x_{i+1} - x_i^2)^2 + (x_i - 1)^2$ | $[-50, 50]$ |
| 22 | Stochastic Rosenbrock | $f_{22} = \sum_{i=1}^{m-1} 100\,\varepsilon_i(x_{i+1} - x_i^2)^2 + (x_i - 1)^2$ | $[-50, 50]$ |
| 23 | Trigonometric | $f_{23} = \sum_{i=1}^{m}\left[m + i(1 - \cos x_i) - \sin x_i - \sum_{j=1}^{m}\cos x_j\right]^2$ | $[-1000, 1000]$ |
| 24 | Zacharov | $f_{24} = \sum_{i=1}^{m} x_i^2 + \left(\sum_{i=1}^{m} 0.5 i x_i\right)^2 + \left(\sum_{i=1}^{m} 0.5 i x_i\right)^4$ | $[-5, 10]$ |

Here $\varepsilon_i$ denotes a random number drawn from a uniform distribution in $[0, 1]$.
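To make the function/derivative pairing of Table 1 concrete, here is the Booth function (entry 3) with its analytical gradient in runnable form (a sketch with our own naming):

```python
import numpy as np

def booth(x):
    """Booth function, entry 3 of Table 1."""
    return (x[0] + 2 * x[1] - 7) ** 2 + (2 * x[0] + x[1] - 5) ** 2

def booth_grad(x):
    """Analytical gradient of the Booth function: the extra
    information that GBCS feeds into its local random walk."""
    return np.array([10 * x[0] + 8 * x[1] - 34,
                     8 * x[0] + 10 * x[1] - 38])
```

At the global minimum (1, 3) both the function value and the two gradient components vanish, which is a quick sanity check for any hand-derived gradient in the table.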


Figure 1: Surface plots of the two-variable benchmark functions used in this study: (a) Ackley, (b) Beale, (c) Booth, (d) cross-leg table, (e) Himmelblau, (f) Levy 13, (g) Matyas, and (h) Schaffer functions.


Table 2: Values of the mean minima and standard deviations obtained by the CS and GBCS algorithms, compared with the value of the global minima of the twenty-four benchmark problems.

| No. | Benchmark function | Variables | Global min | Iterations | GBCS mean | GBCS std dev | CS mean | CS std dev |
|----|--------------------|-----------|------------|------------|-----------|--------------|---------|------------|
| 1 | Ackley | 2 | 0 | 1000 | 0 | 0 | 2.2204E-16 | 6.7752E-16 |
| 2 | Beale | 2 | 0 | 1000 | 0 | 0 | 5.7891E-30 | 1.2165E-29 |
| 3 | Booth | 2 | 0 | 1000 | 0 | 0 | 0 | 0 |
| 4 | Cross-leg table | 2 | −1 | 1000 | −1.1463E-2 | 7.672E-3 | −6.2704E-3 | 3.6529E-3 |
| 5 | Himmelblau | 2 | 0 | 1000 | 1.7058E-28 | 2.836E-28 | 2.5958E-19 | 5.3451E-19 |
| 6 | Levy 13 | 2 | 0 | 1000 | 1.3498E-31 | 6.6809E-47 | 1.3498E-31 | 6.6809E-47 |
| 7 | Matyas | 2 | 0 | 1000 | 2.7691E-54 | 4.728E-54 | 2.0407E-38 | 5.0616E-38 |
| 8 | Schaffer | 2 | 0 | 3000 | 0 | 0 | 7.4015E-18 | 1.9193E-17 |
| 9 | Powell | 4 | 0 | 1000 | 1.8694E-8 | 3.5848E-8 | 1.6296E-13 | 3.4802E-13 |
| 10 | Power sum | 4 | 0 | 1000 | 1.8328E-4 | 1.6761E-4 | 2.5432E-4 | 1.8167E-4 |
| 11 | Shekel 5 | 4 | −10.536 | 200 | −10.536 | 1.6289E-5 | −10.536 | 1.8421E-2 |
| 12 | Wood | 4 | 0 | 1000 | 2.3726 | 2.2208 | 0.40838 | 0.337 |
| 13 | Cube | 5 | 0 | 5000 | 1.2567 | 0.86542 | 5.782E-8 | 2.5596E-7 |
| 14 | Stochastic Cube | 5 | 0 | 5000 | 7.7438 | 6.9815 | 6.4369 | 5.0292 |
| 15 | Sphere | 5 | 0 | 1000 | 2.5147E-38 | 5.1577E-38 | 1.1371E-21 | 1.2967E-21 |
| 16 | Hartmann | 6 | −3.3224 | 200 | −3.3224 | 4.3959E-10 | −3.3215 | 6.0711E-4 |
| 17 | Dixon-Price | 50 | 0 | 5000 | 4.7094E-2 | 1.6904E-1 | 6.6667E-1 | 2.6103E-6 |
| 18 | Griewank | 50 | 0 | 5000 | 0 | 0 | 3.3651E-10 | 9.4382E-10 |
| 19 | Stochastic Griewank | 50 | 0 | 5000 | 7.2758E-13 | 2.8579E-12 | 6.9263 | 2.0451 |
| 20 | Michalewicz | 50 | — | 5000 | −32.263 | 1.3729 | −27.383 | 1.3551 |
| 21 | Rosenbrock | 50 | 0 | 5000 | 0.97368 | 0.5885 | 3.5286 | 3.7012 |
| 22 | Stochastic Rosenbrock | 50 | 0 | 5000 | 5.8599 | 1.9122 | 4.8944 | 4.6783 |
| 23 | Trigonometric | 50 | 0 | 5000 | 5.3560 | 4.5360 | 1.9435 | 4.0337 |
| 24 | Zacharov | 50 | 0 | 5000 | 6.5769 | 1.3288 | 2.7031 | 5.2748 |

GBCS performed significantly better than CS, as shown in Figure 2(d). On the other hand, both algorithms were able to identify the global minimum of the Himmelblau function. Figure 2(e) shows the evolution of the mean best values; GBCS performed more effectively than CS. Both algorithms were also able to identify the minimum of the Levy 13 function (Figure 2(f)); however, GBCS was significantly more effective than CS. This pattern was repeated with the Matyas function, as shown in Figure 2(g).

The Schaffer function is multimodal. Both GBCS and CS failed to converge to the global minimum within the 10^-10 tolerance at 1000 iterations. Running both algorithms to 3000 iterations resulted in GBCS reaching the global optimum while CS did not, as shown in Figure 2(h). The Schaffer function concludes the two-variable functions; GBCS performed better than CS on all of them.

The Powell function has four variables. The evolution of the mean best values of CS and GBCS, plotted in Figure 3(a), shows that the performance of CS was better than that of GBCS, although both came close to the global minimum. Powell is one of the few test functions for which the use of the gradient did not improve the performance of the CS algorithm.

Minor improvements were observed with the power sum and the Shekel 5 functions. The power sum's global optimum was not obtained by either algorithm, as shown in Figure 3(b); both appear to be trapped in a local minimum. For the Shekel 5 function (Figure 3(c)), the global optimum was easily obtained, with a minor improvement in performance for the GBCS algorithm. The global optimum of the Wood function, which also has four variables, was not achieved by either algorithm within 1000 iterations, as shown in Figure 3(d). When they were run for 5000 iterations, GBCS appeared to be trapped in a local minimum and was not able to reach the global optimum, which was obtained by CS. The Wood function is the only function for which GBCS performance was much worse than that of the original CS algorithm.

The performance of both algorithms on the Cube and the Stochastic Cube functions was peculiar. In both cases, GBCS outperformed CS in the early iterations, while CS did better at later iterations. For the two functions, as depicted in Figures 3(e) and 3(f), the global minimum was not obtained within 1000 iterations, so both problems were run for 5000. CS was able to come close to the global minimum of the Cube function, while GBCS appeared to be trapped in a local minimum. For

Mathematical Problems in Engineering 9

Figure 2: Evolution of mean best values for GBCS and the original CS algorithm for (a) Ackley, (b) Beale, (c) Booth, (d) Cross-leg table, (e) Himmelblau, (f) Levy 13, (g) Matyas, and (h) Schaffer functions.

the Stochastic Cube function, CS did slightly better at later iterations, but both algorithms failed to reach the global minimum after 5000 iterations.

The Sphere function has five variables, like the Cube function, but it is easier to solve. Both algorithms were able to identify the global optimum, as depicted in Figure 3(g), but GBCS considerably outperformed CS in terms of efficiency. The Hartmann function, which has six variables, was relatively easy for GBCS to solve, as shown in Figure 3(h). GBCS arrived at the global minimum, but CS could not do so within the tolerance. GBCS also outperformed CS in the efficiency with which it reached the global optimum of the Hartmann function.

Figure 4 shows the performance on the test functions with 50 variables. These are the most challenging problems because of their large domain spaces. Figure 4(a) depicts the performance of

Figure 3: Evolution of mean best values for GBCS and the original CS algorithm for (a) Powell, (b) Power sum, (c) Shekel 5, (d) Wood, (e) Cube, (f) Stochastic Cube, (g) Sphere, and (h) Hartmann functions.

both algorithms for the Dixon-Price function. Clearly, GBCS performed better than CS in the early iterations. Even though neither algorithm attained the global minimum, the minimum reached by GBCS in the early iterations is orders of magnitude lower than that reached by CS.

GBCS outperformed CS on both the Griewank (Figure 4(b)) and the Stochastic Griewank (Figure 4(c)) functions. For the Griewank function, both algorithms were able to identify the global minimum; however, GBCS arrived at the global minimum in less than half the number of iterations required by CS. For the Stochastic Griewank, CS was not able to identify the global minimum even after 5000 iterations; the result obtained by CS was more than 10 orders of magnitude higher than that obtained by GBCS.

Figure 4(d) shows the results for the Michaelwicz function. To the best of the authors' knowledge, the global minimum of the 50-variable Michaelwicz function is not known, since

Figure 4: Evolution of mean best values for GBCS and the original CS algorithm for (a) Dixon-Price, (b) Griewank, (c) Stochastic Griewank, (d) Michaelwicz, (e) Rosenbrock, (f) Stochastic Rosenbrock, (g) Trigonometric, and (h) Zacharov functions.

there is no published literature identifying it. GBCS identified a minimum that is lower than that identified by CS, as depicted in Figure 4(d).

Although neither algorithm was able to identify the global optima of the Rosenbrock and the Stochastic Rosenbrock functions, as depicted in Figures 4(e) and 4(f), respectively, the results obtained by GBCS were orders of magnitude lower than those obtained by CS. The superiority of GBCS is clearly demonstrated by those two functions.

Figure 4(g) shows the results for the Trigonometric function. For this difficult problem, neither algorithm was able to reach the global minimum within 5000 iterations. Nonetheless, as in most of the cases studied, GBCS achieved a lower result than CS.

The last of the 50-variable functions tested was the Zacharov function. The results in Figure 4(h) also show the superiority of GBCS in reaching results that are orders of magnitude lower than those obtained by the original CS algorithm.


Table 2 shows a summary of the evaluation results for the twenty-four benchmark problems. GBCS was able to provide better solutions to the challenging problems: it achieved better performance on eighteen problems, equivalent performance on two problems, and worse performance on four of the twenty-four problems. In one case, the Stochastic Griewank function, the global optimum of this 50-variable problem was successfully obtained by GBCS, while CS failed to find it even after 5000 iterations. On the other hand, the four functions for which CS performed better were the Powell, Wood, Cube, and Stochastic Cube functions.

It is interesting to note that GBCS outperformed CS in handling two of the three stochastic functions tested in this study. This result confirms that, despite the use of the gradient as a guide for some of the moves, the stochastic nature of the algorithm remained unaffected. In fact, the additional information was used quite subtly: cuckoos still search for nests randomly, but with the help of some intelligence.

6. Conclusions

In this study, we used the gradient of the objective function, which is available or easily obtainable for many objective functions in engineering calculations, to improve the performance of one of the most promising stochastic algorithms, the cuckoo search. The proposed modification was implemented subtly in the algorithm by redirecting the local random walk of the cuckoos toward the direction of the minimum as seen from each cuckoo's location. We evaluated this modification by attempting to find the global optima of twenty-four benchmark functions. The newly developed GBCS algorithm improved the reliability and effectiveness of the algorithm in all but four of the tested benchmark problems; the test set included three stochastic functions. In some cases, the global minimum could not be obtained by the original CS algorithm but was obtained by GBCS. Although improved performance was not achieved on the entire set of benchmark problems, GBCS proved to be more reliable and efficient on the majority of the tested problems.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

References

[1] C. A. Floudas and C. E. Gounaris, "A review of recent advances in global optimization," Journal of Global Optimization, vol. 45, no. 1, pp. 3-38, 2009.

[2] L. M. Rios and N. V. Sahinidis, "Derivative-free optimization: a review of algorithms and comparison of software implementations," Journal of Global Optimization, vol. 56, pp. 1247-1293, 2012.

[3] X. S. Yang and S. Deb, "Cuckoo search via Lévy flights," in Proceedings of the World Congress on Nature & Biologically Inspired Computing (NaBIC '09), pp. 210-214, IEEE, Coimbatore, India, December 2009.

[4] X.-S. Yang and S. Deb, "Cuckoo search: recent advances and applications," Neural Computing and Applications, vol. 24, no. 1, pp. 169-174, 2014.

[5] J. Piechocki, D. Ambroziak, A. Palkowski, and G. Redlarski, "Use of modified cuckoo search algorithm in the design process of integrated power systems for modern and energy self-sufficient farms," Applied Energy, vol. 114, pp. 901-908, 2014.

[6] G. Kanagaraj, S. Ponnambalam, and N. Jawahar, "A hybrid cuckoo search and genetic algorithm for reliability-redundancy allocation problems," Computers & Industrial Engineering, vol. 66, no. 4, pp. 1115-1124, 2013.

[7] V. Bhargava, S. Fateen, and A. Bonilla-Petriciolet, "Cuckoo search: a new nature-inspired optimization method for phase equilibrium calculations," Fluid Phase Equilibria, vol. 337, pp. 191-200, 2013.

[8] P. K. Mohanty and D. R. Parhi, "Cuckoo search algorithm for the mobile robot navigation," in Swarm, Evolutionary, and Memetic Computing, pp. 527-536, Springer, New York, NY, USA, 2013.

[9] X. S. Yang and S. Deb, "Engineering optimisation by cuckoo search," International Journal of Mathematical Modelling and Numerical Optimisation, vol. 1, no. 4, pp. 330-343, 2010.



gradient as a source of new information that guides the cuckoos in their search.

Therefore, in this study, a simple modification was made to the CS algorithm to make use of the gradient information and enhance the reliability and efficiency of the algorithm. The aim of this work is to present a modification to the existing CS algorithm based on the gradient of the objective function and to evaluate its performance in comparison with the original algorithm. The remainder of this paper is organized as follows. Section 2 introduces the cuckoo search algorithm. Section 3 introduces the proposed modification and our new gradient-based cuckoo search (GBCS) algorithm. The numerical experiments performed to evaluate the modification are presented in Section 4. The results of the numerical experiments are presented and discussed in Section 5. Section 6 summarizes the conclusions of the work.

2. Cuckoo Search (CS) Algorithm

CS is a nature-inspired stochastic global optimization method that was developed by Yang and Deb [3, 9]. Its concept comes from the brood parasitism behavior of the cuckoo bird. Specifically, brood parasitism is a reproductive strategy in which cuckoos lay their eggs in the nests of other birds, usually of other species. If these eggs are discovered by the host bird, it may abandon the nest completely or throw away the alien eggs. This natural phenomenon has led cuckoo eggs to evolve to mimic the appearance of the eggs of local host birds. The following rules have been employed in the search algorithm to implement these concepts: (1) one egg is laid by each cuckoo in a random nest, and it represents a set of solution coordinates; (2) the best eggs (i.e., solutions) are contained in a fraction of the nests and will carry over to the next generation; and (3) the number of nests is fixed, and a host bird can find an alien egg with a specified probability p_a ∈ [0, 1]. If this occurs, the host bird can discard the egg or abandon the nest, and a new nest will be built elsewhere. For algorithmic simplicity, this condition has been implemented in CS by assuming that a fraction p_a of the n nests is replaced by new nests. The pseudocode of CS is given in Algorithm 1, and details of this metaheuristic are reported in [3]. Note that Lévy flights are used in CS to perform effective local and global searches in the solution domain. A Lévy flight is a random walk (i.e., a trajectory consisting of successive random steps) characterized by a sequence of sudden jumps drawn from a probability density function with a power-law tail. In fact, the Lévy flight is considered an optimal random-search pattern and has been useful in stochastic simulations of random natural phenomena, including applications in astronomy, physics, and biology. To generate a new egg in CS, a Lévy flight is performed from the coordinates of a randomly selected egg. This step can be represented by

x_i^(t+1) = x_i^t + α ⊕ Lévy(λ),  (1)

where ⊕ denotes entry-wise multiplication, α is the step size, and Lévy(λ) is the Lévy distribution. The egg is displaced to this new position if its objective function value is found to be better than that of another randomly selected egg. The step size α controls the scale of the random search and depends on the scales of the optimization problem under analysis.
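The paper does not spell out how the Lévy-distributed step in (1) is drawn; a common choice in CS implementations is Mantegna's algorithm. The sketch below is illustrative only and is written in Python rather than the authors' MATLAB; the exponent `beta = 1.5` and the 0.01 scale factor are conventional assumptions, not values taken from the paper, and the function names are hypothetical.

```python
import math
import random

def levy_step(beta=1.5):
    """One Levy-distributed random step via Mantegna's algorithm:
    u / |v|^(1/beta) with u ~ N(0, sigma_u^2) and v ~ N(0, 1)."""
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma_u = (num / den) ** (1 / beta)
    u = random.gauss(0, sigma_u)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def levy_move(x, alpha=0.01):
    """Eq. (1): x^(t+1) = x^t + alpha (entry-wise) Levy(lambda),
    with an independent Levy step per coordinate."""
    return [xi + alpha * levy_step() for xi in x]
```

Most of the resulting jumps are small, but the power-law tail occasionally produces a very large one, which is what lets a cuckoo escape a local basin.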

A fraction (1 − p_a) of the nests, selected at random, is abandoned and replaced by new ones at new locations via local random walks. The local random walk can be written as

x_i^(t+1) = x_i^t + α (x_j^t − x_k^t),  (2)

where x_j^t and x_k^t are two different solutions selected by random permutation and α is a random number drawn from a uniform distribution. An advantage of CS over genetic algorithms, particle swarm optimization, and other stochastic optimization methods is that there is only one parameter to be tuned, namely the fraction of nests to be abandoned, (1 − p_a). However, Yang and Deb [3, 9] found that the results obtained for a variety of optimization problems were not very dependent on the value of p_a and suggested using p_a = 0.25.
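The abandonment step in (2) can be sketched as follows. This is a minimal illustration, not the authors' code: it follows the paper's convention that the fraction (1 − p_a) is replaced, draws α ~ U(0, 1) afresh per nest, and uses two random permutations to pick x_j and x_k; the function name is hypothetical.

```python
import random

def local_random_walk(nests, pa=0.25):
    """Eq. (2): rebuild each abandoned nest as
    x_i^(t+1) = x_i^t + alpha * (x_j^t - x_k^t), where alpha ~ U(0, 1)
    and j, k come from two random permutations of the population."""
    n = len(nests)
    perm_j = random.sample(range(n), n)   # random permutation giving x_j
    perm_k = random.sample(range(n), n)   # random permutation giving x_k
    out = []
    for i, x in enumerate(nests):
        if random.random() < 1 - pa:      # nest i is abandoned
            alpha = random.random()
            xj, xk = nests[perm_j[i]], nests[perm_k[i]]
            out.append([xi + alpha * (a - b) for xi, a, b in zip(x, xj, xk)])
        else:                             # nest i survives unchanged
            out.append(list(x))
    return out
```

Because the step is a difference of two current solutions, its scale shrinks automatically as the population converges.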

3. Gradient-Based Cuckoo Search (GBCS) Algorithm

The purpose of this section is to introduce the simple modification of the original CS algorithm that incorporates information about the gradient of the objective function. Any modification to the algorithm should not change its stochastic nature, so as not to affect its performance negatively. The modification was made to the local random walk in which a fraction (1 − p_a) of the nests is replaced (eq. (2)). In the original algorithm, when new nests are generated from the replaced nests via a random step, the magnitude and the direction of the step are both random. In the modified algorithm, the randomness of the magnitude of the step is preserved; however, the direction is determined from the sign of the gradient of the function. If the gradient is negative, the step direction is made positive; if the gradient is positive, the step direction is made negative. Thus, new nests are generated randomly from the worse nests but in the direction of the minimum as seen from the point of view of the old nests. Thus, (2) is replaced by

step_i = α (x_j^t − x_k^t),
x_i^(t+1) = x_i^t + step_i ⊗ sign(−step_i / df_i),  (3)

where the sign function returns the sign of its argument and df_i is the gradient of the objective function with respect to each variable, that is, ∂f/∂x_i.

This simple modification does not change the structure of the CS algorithm but makes important use of the available information about the gradient of the objective function. No additional parameter is needed to implement this change.
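The sign trick in (3) can be sketched as below. Note the identity step_i · sign(−step_i / df_i) = −|step_i| · sign(df_i): the step keeps its random magnitude but always points downhill. This is an illustrative Python sketch, not the authors' MATLAB code, and the handling of a zero gradient component (keep the original random direction) is our assumption.

```python
import math
import random

def gbcs_walk(x, x_j, x_k, grad):
    """Eq. (3): keep the random magnitude alpha * (x_j - x_k) of the CS
    local walk, but flip each component to oppose the gradient sign."""
    alpha = random.random()
    out = []
    for xi, a, b, g in zip(x, x_j, x_k, grad):
        step = alpha * (a - b)
        if step != 0 and g != 0:
            # step * sign(-step / g) = -|step| * sign(g): always downhill
            step *= math.copysign(1.0, -step / g)
        out.append(xi + step)
    return out
```

With grad[i] > 0 the update can only decrease x_i, and vice versa, which is exactly the behavior described in the text.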

4. Numerical Experiments

Twenty-four classical benchmark functions were used toevaluate the performance of GBCS as compared to the


begin
    Objective function f(x)
    Generate an initial population of n host nests x_i
    while (t < MaxGeneration)
        Get a cuckoo randomly by Lévy flights
        Evaluate its quality/fitness F_i
        Choose a nest among n (say, j) randomly
        if (F_i > F_j)
            Replace j by the new solution
        end
        A fraction (1 − p_a) of nests are abandoned at random and new ones are built via random walks
        Keep the best solutions
        Rank the solutions and find the current best
    end while
    Postprocess results
end

Algorithm 1: Simplified algorithm of the cuckoo search.
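Algorithm 1 can be turned into a compact, runnable sketch. This is not the authors' MATLAB implementation: the population size, the 0.01 Lévy scale, the scaling of the Lévy step by the distance to the current best, and the greedy acceptance in the abandonment phase ("keep the best solutions") are common textbook choices layered on the pseudocode above, and all names are ours.

```python
import math
import random

def sphere(x):
    """f15 in Table 1: global minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def levy():
    """Levy-stable step via Mantegna's algorithm (beta = 1.5)."""
    beta = 1.5
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma = (num / den) ** (1 / beta)
    return random.gauss(0, sigma) / abs(random.gauss(0, 1)) ** (1 / beta)

def cuckoo_search(f, dim, n=25, pa=0.25, iters=1000, lo=-100.0, hi=100.0):
    clip = lambda v: max(lo, min(hi, v))
    nests = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    fit = [f(x) for x in nests]
    best = min(range(n), key=fit.__getitem__)
    for _ in range(iters):
        # Get a cuckoo by a Levy flight around a random nest ...
        i = random.randrange(n)
        cand = [clip(xi + 0.01 * levy() * (xi - bi))
                for xi, bi in zip(nests[i], nests[best])]
        fc = f(cand)
        j = random.randrange(n)               # ... and compare with nest j
        if fc < fit[j]:
            nests[j], fit[j] = cand, fc
        # Abandon a fraction (1 - pa) of nests via local random walks
        for i in range(n):
            if random.random() < 1 - pa:
                alpha = random.random()
                xj, xk = random.choice(nests), random.choice(nests)
                cand = [clip(xi + alpha * (a - b))
                        for xi, a, b in zip(nests[i], xj, xk)]
                fc = f(cand)
                if fc < fit[i]:               # keep the best solutions
                    nests[i], fit[i] = cand, fc
        best = min(range(n), key=fit.__getitem__)
    return nests[best], fit[best]
```

Running it on the two-variable Sphere function drives the best value toward zero within a few hundred generations.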

original CS. These problems were chosen from among a list of forty-one functions. All forty-one functions were screened first by performing five optimization runs with both CS and GBCS. Functions for which both CS and GBCS performed extremely well, with no significant difference in the results, were deemed unsuitable for comparison and were excluded from the evaluation. Since CS is already a high-performance global optimizer, the excluded functions were not suitable for showing the differences between the two algorithms. Table 1 lists the twenty-four benchmark functions used for the evaluation of the two algorithms, along with their derivatives and search domains. The number of variables, the number of iterations used, and the value at the global minimum for each problem are shown with the results in Table 2.

Note that the benchmark functions used include three stochastic test functions. Most deterministic algorithms, such as the Nelder-Mead downhill simplex method, would fail on those stochastic functions [9]. However, those stochastic functions were included in the present evaluation to ensure that the modification of the stochastic method not only preserves its stochastic nature and its ability to solve stochastic functions but also improves the performance of the minimization algorithm. The stochastic functions were developed by turning three deterministic functions (Cube, Rosenbrock, and Griewank) into stochastic test functions through the addition of a stochastic vector ε drawn from a uniform distribution, ε ∈ [0, 1] [9].
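This construction can be illustrated for the Rosenbrock pair (f21 and f22 in Table 1). The sketch below is ours; following [9], the ε values are drawn fresh on every evaluation, so the stochastic variant returns a different value on each call at the same point.

```python
import random

def rosenbrock(x):
    """f21: sum over i of 100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2."""
    return sum(100 * (x[i + 1] - x[i] ** 2) ** 2 + (x[i] - 1) ** 2
               for i in range(len(x) - 1))

def stochastic_rosenbrock(x):
    """f22: each squared term is scaled by a fresh epsilon ~ U(0, 1),
    so repeated evaluations at the same point generally differ."""
    return sum(100 * random.random() * (x[i + 1] - x[i] ** 2) ** 2
               + (x[i] - 1) ** 2 for i in range(len(x) - 1))
```

Both versions keep the same global minimum of 0 at x = (1, ..., 1), since every term vanishes there regardless of ε.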

The twenty-four problems constitute a comprehensive test of the reliability and effectiveness of the suggested modification to the original CS. Eight functions have two variables only, and their surface plots are shown in Figure 1.

Each test function was run thirty times in MATLAB on an HP Pavilion dv6 laptop with a Core i7 processor. The function value was recorded at every iteration, and the average and standard deviation over the thirty runs were calculated at every iteration. The maximum number of iterations was 1000 for functions with 2 variables and 5000 for functions with 50 variables. The population size of the cuckoo swarm was 100. The only parameter used in the original CS, p_a, was kept constant at a value of 0.25. The MATLAB code for all the functions and their derivatives is available online at http://dx.doi.org/10.1155/2014/493740 as supplementary material to this paper.

5. Results and Discussion

As stated, each of the numerical experiments was repeated 30 times with different random seeds for GBCS and for the original CS algorithm. The value at each iteration of each trial was recorded, and the mean and the standard deviation of the function values were calculated at each iteration. The progress of the mean values is presented in Figures 2-4 for each function, and a brief discussion of those results follows. The total CPU time for running the two algorithms 30 times on each problem varied with the problem's number of variables and the number of iterations used; CPU times ranged from 38.27 min for the two-variable problems to 91.03 min for the fifty-variable problems.
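The per-iteration statistics described above can be computed in a few lines. This is a sketch with a hypothetical name; `histories` holds one best-value trace per run, all of equal length.

```python
import statistics

def summarize_runs(histories):
    """Per-iteration mean and standard deviation of the best function
    value across repeated runs; histories is a list of equal-length
    traces, one per run."""
    iters = len(histories[0])
    means = [statistics.mean(h[t] for h in histories) for t in range(iters)]
    stds = [statistics.stdev(h[t] for h in histories) for t in range(iters)]
    return means, stds
```

The mean trace is what Figures 2-4 plot on a logarithmic axis.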

The Ackley function has only one minimum, which was obtained by both methods, as shown in Figure 2(a). However, GBCS is clearly more effective than CS, reaching the minimum in fewer iterations. The improvement in performance was also clear with the Beale function (Figure 2(b)). Beale has one minimum only, which was obtained satisfactorily by the two methods. This pattern of behavior was also observed for the Booth function: GBCS was significantly more effective than CS, as it reached the global optimum within a tolerance of 10^-32 in less than half the number of iterations required by CS, as shown in Figure 2(c). The first three functions are relatively easy to optimize, and their global optima were easily obtained.

The cross-leg table function is a difficult one to minimize; its value at the global minimum is -1. Neither algorithm was able to reach the global minimum. Yet


Table 1: Test functions used for testing the performance of CS and GBCS.

1. Ackley (search domain [-35, 35]):
f1 = 20(1 - exp(-0.2 √(0.5(x1² + x2²)))) - exp(0.5(cos 2πx1 + cos 2πx2)) + e
∂f1/∂x_k = 2x_k exp(-0.2 √(0.5(x1² + x2²))) / √(0.5(x1² + x2²)) + π exp(0.5(cos(2πx1) + cos(2πx2))) sin(2πx_k)

2. Beale ([-4.5, 4.5]):
f2 = (1.5 - x1 + x1x2)² + (2.25 - x1 + x1x2²)² + (2.625 - x1 + x1x2³)²
∂f2/∂x1 = 2(x2² - 1)(x1x2² - x1 + 2.25) + 2(x2³ - 1)(x1x2³ - x1 + 2.625) + 2(x2 - 1)(x1x2 - x1 + 1.5)
∂f2/∂x2 = 2x1(x1x2 - x1 + 1.5) + 4x1x2(x1x2² - x1 + 2.25) + 6x1x2²(x1x2³ - x1 + 2.625)

3. Booth ([-10, 10]):
f3 = (x1 + 2x2 - 7)² + (2x1 + x2 - 5)²
∂f3/∂x1 = 10x1 + 8x2 - 34
∂f3/∂x2 = 8x1 + 10x2 - 38

4. Cross-leg table ([-10, 10]):
f4 = -[|sin(x1) sin(x2) exp(|100 - √(x1² + x2²)/π|)| + 1]^(-0.1)
∂f4/∂x1 = 0.1 σ1 (sign[sin(x1) sin(x2)] cos(x1) sin(x2) + x1 sign(σ2/π - 100) |sin(x1) sin(x2)| / (π σ2)) / (σ1 |sin(x1) sin(x2)| + 1)^1.1
∂f4/∂x2 = 0.1 σ1 (sign[sin(x1) sin(x2)] cos(x2) sin(x1) + x2 sign(σ2/π - 100) |sin(x1) sin(x2)| / (π σ2)) / (σ1 |sin(x1) sin(x2)| + 1)^1.1
where σ1 = exp(|σ2/π - 100|) and σ2 = √(x1² + x2²)

5. Himmelblau ([-5, 5]):
f5 = (x1² + x2 - 11)² + (x2² + x1 - 7)²
∂f5/∂x1 = 2x1 + 4x1(x1² + x2 - 11) + 2x2² - 14
∂f5/∂x2 = 2x2 + 4x2(x2² + x1 - 7) + 2x1² - 22

6. Levy 13 ([-10, 10]):
f6 = sin²(3πx1) + (x1 - 1)²[1 + sin²(3πx2)] + (x2 - 1)²[1 + sin²(2πx2)]
∂f6/∂x1 = (2x1 - 2)(sin²(3πx2) + 1) + 6π cos(3πx1) sin(3πx1)
∂f6/∂x2 = (2x2 - 2)(sin²(2πx2) + 1) + 4π cos(2πx2) sin(2πx2)(x2 - 1)² + 6π cos(3πx2) sin(3πx2)(x1 - 1)²

7. Matyas ([-10, 10]):
f7 = 0.26x1² - 0.48x1x2 + 0.26x2²
∂f7/∂x1 = 0.52x1 - 0.48x2
∂f7/∂x2 = 0.52x2 - 0.48x1

8. Schaffer ([-100, 100]):
f8 = 0.5 + (sin²(√(x1² + x2²)) - 0.5) / [0.001(x1² + x2²) + 1]²
∂f8/∂x_k = -x_k ((0.004 sin²σ - 0.002) / (0.001(x1² + x2²) + 1)³ - 2 cos σ sin σ / (σ (0.001(x1² + x2²) + 1)²)), where σ = √(x1² + x2²)

9. Powell ([-1000, 1000]):
f9 = (x1 + 10x2)² + 5(x3 - x4)² + (x2 - 2x3)⁴ + 10(x1 - x4)⁴
∂f9/∂x1 = 2x1 + 20x2 + 40(x1 - x4)³
∂f9/∂x2 = 20x1 + 200x2 + 4(x2 - 2x3)³
∂f9/∂x3 = 10x3 - 10x4 - 8(x2 - 2x3)³
∂f9/∂x4 = 10x4 - 10x3 - 40(x1 - x4)³

10. Power sum ([0, 4]):
f10 = Σ_{i=1}^{m} [(Σ_{j=1}^{m} x_j^i) - b_i]², with b = [8, 18, 44, 114]
∂f10/∂x_k = 2 Σ_{i=1}^{m} i x_k^(i-1) [(Σ_{j=1}^{m} x_j^i) - b_i]

11. Shekel 5 ([0, 10]):
f11 = -Σ_{i=1}^{m} 1 / (c_i + Σ_{j=1}^{n} (x_j - a_ij)²), m = 10
∂f11/∂x_k = Σ_{i=1}^{m} 2(x_k - a_ik) / [c_i + Σ_{j=1}^{n} (x_j - a_ij)²]²

12. Wood ([-1000, 1000]):
f12 = 100(x1² - x2)² + (x1 - 1)² + (x3 - 1)² + 90(x3² - x4)² + 10.1[(x2 - 1)² + (x4 - 1)²] + 19.8(x2 - 1)(x4 - 1)
∂f12/∂x1 = 2x1 - 400x1(x2 - x1²) - 2
∂f12/∂x2 = -200x1² + 220.2x2 + 19.8x4 - 40
∂f12/∂x3 = 2x3 - 360x3(x4 - x3²) - 2
∂f12/∂x4 = -180x3² + 19.8x2 + 200.2x4 - 40

13. Cube ([-100, 100]):
f13 = Σ_{i=1}^{m-1} 100(x_{i+1} - x_i³)² + (1 - x_i)²
∂f13/∂x1 = 2x1 - 600x1²(x2 - x1³) - 2
∂f13/∂x_k = 202x_k - 600x_k²(x_{k+1} - x_k³) - 200x_{k-1}³ - 2
∂f13/∂x_m = 200(x_m - x_{m-1}³)

14. Stochastic Cube ([-100, 100]):
f14 = Σ_{i=1}^{m-1} 100 ε_i (x_{i+1} - x_i³)² + (1 - x_i)²
∂f14/∂x1 = 2x1 - 600x1² ε_1 (x2 - x1³) - 2
∂f14/∂x_k = 2x_k - 600 ε_k x_k²(x_{k+1} - x_k³) + 200 ε_{k-1}(x_k - x_{k-1}³) - 2
∂f14/∂x_m = 200 ε_{m-1}(x_m - x_{m-1}³)

15. Sphere ([-100, 100]):
f15 = Σ_{i=1}^{m} x_i²
∂f15/∂x_k = 2x_k

16. Hartmann ([0, 1]):
f16 = -Σ_{i=1}^{4} α_i exp(-Σ_{j=1}^{6} A_ij (x_j - P_ij)²)
∂f16/∂x_k = Σ_{i=1}^{4} 2 α_i A_ik (x_k - P_ik) exp(-Σ_{j=1}^{6} A_ij (x_j - P_ij)²)

17. Dixon-Price ([-10, 10]):
f17 = (x1 - 1)² + Σ_{i=2}^{m} i (2x_i² - x_{i-1})²
∂f17/∂x1 = -8x2² + 6x1 - 2
∂f17/∂x_k = (k + 1)(2x_k - 4x_{k+1}²) - 8k x_k (x_{k-1} - 2x_k²)
∂f17/∂x_m = -8m x_m (x_{m-1} - 2x_m²)

18. Griewank ([-600, 600]):
f18 = (1/4000) Σ_{i=1}^{m} (x_i - 100)² - Π_{i=1}^{m} cos((x_i - 100)/√i) + 1
∂f18/∂x_k = (x_k - 100)/2000 + (1/√k) tan((x_k - 100)/√k) Π_{i=1}^{m} cos((x_i - 100)/√i)

19. Stochastic Griewank ([-600, 600]):
f19 = (1/4000) Σ_{i=1}^{m} ε_i (x_i - 100)² - Π_{i=1}^{m} cos((x_i - 100)/√i) + 1
∂f19/∂x_k = ε_k (x_k - 100)/2000 + (1/√k) tan((x_k - 100)/√k) Π_{i=1}^{m} cos((x_i - 100)/√i)

20. Michaelwicz ([0, π]):
f20 = -Σ_{i=1}^{m} sin(x_i) sin^20(i x_i²/π)
∂f20/∂x_k = -cos(x_k) sin^20(k x_k²/π) - (40 k x_k/π) cos(k x_k²/π) sin^19(k x_k²/π) sin(x_k)

21. Rosenbrock ([-50, 50]):
f21 = Σ_{i=1}^{m-1} 100(x_{i+1} - x_i²)² + (x_i - 1)²
∂f21/∂x1 = 2(x1 - 1) - 400x1(x2 - x1²)
∂f21/∂x_k = -200x_{k-1}² + 202x_k - 400x_k(x_{k+1} - x_k²) - 2
∂f21/∂x_m = 200(x_m - x_{m-1}²)

22. Stochastic Rosenbrock ([-50, 50]):
f22 = Σ_{i=1}^{m-1} 100 ε_i (x_{i+1} - x_i²)² + (x_i - 1)²
∂f22/∂x1 = 2(x1 - 1) - 400 ε_1 x1(x2 - x1²)
∂f22/∂x_k = 2x_k + 200 ε_{k-1}(x_k - x_{k-1}²) - 400 ε_k x_k(x_{k+1} - x_k²) - 2
∂f22/∂x_m = 200 ε_{m-1}(x_m - x_{m-1}²)

23. Trigonometric ([-1000, 1000]):
f23 = Σ_{i=1}^{m} [m + i(1 - cos x_i) - sin x_i - Σ_{j=1}^{m} cos x_j]²
∂f23/∂x_k = 2[m + k(1 - cos x_k) - sin x_k - Σ_{j=1}^{m} cos x_j][(k + 1) sin x_k - cos x_k]

24. Zacharov:
f24 = Σ_{i=1}^{m} x_i² + (Σ_{i=1}^{m} 0.5 i x_i)² + (Σ_{i=1}^{m} 0.5 i x_i)⁴
∂f24/∂x_k = 2x_k + k Σ_{i=1}^{m} 0.5 i x_i + 2k (Σ_{i=1}^{m} 0.5 i x_i)³

05119894119909119894)

3

[minus510]

Mathematical Problems in Engineering 7

Figure 1: Surface plots of the two-variable benchmark functions used in this study: (a) Ackley, (b) Beale, (c) Booth, (d) cross-leg table, (e) Himmelblau, (f) Levy 13, (g) Matyas, and (h) Schaffer.


Table 2: Values of the mean minima and standard deviations obtained by the CS and GBCS algorithms, compared with the values of the global minima of the twenty-four benchmark problems.

Number | Benchmark function | Variables | Global min | Iterations | GBCS mean | GBCS std. dev. | CS mean | CS std. dev.
1 | Ackley | 2 | 0 | 1000 | 0 | 0 | 2.2204E-16 | 6.7752E-16
2 | Beale | 2 | 0 | 1000 | 0 | 0 | 5.7891E-30 | 1.2165E-29
3 | Booth | 2 | 0 | 1000 | 0 | 0 | 0 | 0
4 | Cross-leg table | 2 | -1 | 1000 | -1.1463E-2 | 7.672E-3 | -6.2704E-3 | 3.6529E-3
5 | Himmelblau | 2 | 0 | 1000 | 1.7058E-28 | 2.836E-28 | 2.5958E-19 | 5.3451E-19
6 | Levy 13 | 2 | 0 | 1000 | 1.3498E-31 | 6.6809E-47 | 1.3498E-31 | 6.6809E-47
7 | Matyas | 2 | 0 | 1000 | 2.7691E-54 | 4.728E-54 | 2.0407E-38 | 5.0616E-38
8 | Schaffer | 2 | 0 | 3000 | 0 | 0 | 7.4015E-18 | 1.9193E-17
9 | Powell | 4 | 0 | 1000 | 1.8694E-8 | 3.5848E-8 | 1.6296E-13 | 3.4802E-13
10 | Power sum | 4 | 0 | 1000 | 1.8328E-4 | 1.6761E-4 | 2.5432E-4 | 1.8167E-4
11 | Shekel 5 | 4 | -10.536 | 200 | -10.536 | 1.6289E-5 | -10.536 | 1.8421E-2
12 | Wood | 4 | 0 | 1000 | 2.3726 | 2.2208 | 0.40838 | 0.337
13 | Cube | 5 | 0 | 5000 | 1.2567 | 0.86542 | 5.782E-8 | 2.5596E-7
14 | Stochastic Cube | 5 | 0 | 5000 | 7.7438 | 6.9815 | 6.4369 | 5.0292
15 | Sphere | 5 | 0 | 1000 | 2.5147E-38 | 5.1577E-38 | 1.1371E-21 | 1.2967E-21
16 | Hartmann | 6 | -3.3224 | 200 | -3.3224 | 4.3959E-10 | -3.3215 | 6.0711E-4
17 | Dixon-Price | 50 | 0 | 5000 | 4.7094E-2 | 1.6904E-1 | 6.6667E-1 | 2.6103E-6
18 | Griewank | 50 | 0 | 5000 | 0 | 0 | 3.3651E-10 | 9.4382E-10
19 | Stochastic Griewank | 50 | 0 | 5000 | 7.2758E-13 | 2.8579E-12 | 6.9263 | 2.0451
20 | Michalewicz | 50 | n/a | 5000 | -32.263 | 1.3729 | -27.383 | 1.3551
21 | Rosenbrock | 50 | 0 | 5000 | 0.97368 | 0.5885 | 35.286 | 37.012
22 | Stochastic Rosenbrock | 50 | 0 | 5000 | 58.599 | 19.122 | 489.44 | 467.83
23 | Trigonometric | 50 | 0 | 5000 | 5356.0 | 4536.0 | 19435 | 40337
24 | Zakharov | 50 | 0 | 5000 | 65.769 | 13.288 | 2703.1 | 5274.8

GBCS performed significantly better than CS, as shown in Figure 2(d). On the other hand, both algorithms were able to identify the global minimum of the Himmelblau function. Figure 2(e) shows the evolution of the mean best values; GBCS performed more effectively than CS. Both algorithms were also able to identify the minimum of the Levy 13 function (Figure 2(f)), but again GBCS was significantly more effective than CS. This pattern was repeated with the Matyas function, as shown in Figure 2(g).

The Schaffer function is multimodal. Both GBCS and CS failed to converge to the global minimum within the 10^-10 tolerance at 1000 iterations. Running both algorithms to 3000 iterations resulted in GBCS reaching the global optimum while CS did not, as shown in Figure 2(h). The Schaffer function concludes the two-variable functions; GBCS performed better than CS in all of them.

The Powell function has four variables. The evolution of the mean best values of CS and GBCS, plotted in Figure 3(a), shows that the performance of CS was better than that of GBCS, although both came close to the global minimum. Powell is one of the few test functions for which the use of the gradient did not improve the performance of the CS algorithm.

Minor improvements were observed with the power sum and the Shekel 5 functions. The power sum's global optimum was not obtained by either algorithm, as shown in Figure 3(b); both appear to be trapped in a local minimum. For the Shekel 5 function (Figure 3(c)), the global optimum was easily obtained, with a minor improvement in performance for the GBCS algorithm. The global optimum of the Wood function, which also has four variables, was not achieved by either algorithm within 1000 iterations, as shown in Figure 3(d). When they were run for 5000 iterations, GBCS seemed to be trapped in a local minimum and was not able to reach the global optimum, which was obtained by CS. The Wood function is the only function for which GBCS performed much worse than the original CS algorithm.

The performance of both algorithms on the Cube and Stochastic Cube functions was peculiar. In both cases, GBCS outperformed CS in the early iterations, while CS did better at later iterations. As depicted in Figures 3(e) and 3(f), the global minimum was not obtained within 1000 iterations for either function, so both problems were run for 5000 iterations. CS was able to come close to the global minimum of the Cube function, while GBCS seemed to be trapped in a local minimum. For


Figure 2: Evolution of mean best values for GBCS and the original CS algorithm for (a) Ackley, (b) Beale, (c) Booth, (d) cross-leg table, (e) Himmelblau, (f) Levy 13, (g) Matyas, and (h) Schaffer functions.

the Stochastic Cube function, CS did slightly better at later iterations, but both algorithms failed to reach the global minimum after 5000 iterations.

The Sphere function has five variables, like the Cube function, but it is easier to solve. Both algorithms were able to identify the global optimum, as depicted in Figure 3(g), but GBCS considerably outperformed CS in terms of efficiency. The Hartmann function, which has six

variables, was relatively easy to solve for the GBCS algorithm, as shown in Figure 3(h). GBCS arrived at the global minimum, but CS could not do so within the tolerance. GBCS also outperformed CS in efficiency in reaching the global optimum of the Hartmann function.

Figure 4 shows the performance on the test functions with 50 variables. These are the most challenging problems due to the large domain space. Figure 4(a) depicts the performance of


Figure 3: Evolution of mean best values for GBCS and the original CS algorithm for (a) Powell, (b) Power sum, (c) Shekel 5, (d) Wood, (e) Cube, (f) Stochastic Cube, (g) Sphere, and (h) Hartmann functions.

both algorithms for the Dixon-Price function. Clearly, GBCS performed better than CS in the early iterations. Even though neither algorithm was able to attain the global minimum, the minimum reached by GBCS in the early iterations is orders of magnitude lower than that reached by CS.

GBCS outperformed CS on both the Griewank (Figure 4(b)) and the Stochastic Griewank (Figure 4(c)) functions. For the Griewank function, both algorithms were able to identify the global minimum; however, GBCS arrived at the

global minimum in less than half the number of iterations required by CS. For the Stochastic Griewank, CS was not able to identify the global minimum even after 5000 iterations; the result it produced was more than 10 orders of magnitude higher than the result produced by GBCS.

Figure 4(d) shows the results for the Michalewicz function. To the best of the authors' knowledge, the global minimum of the 50-variable Michalewicz function is not known, since


Figure 4: Evolution of mean best values for GBCS and the original CS algorithm for (a) Dixon-Price, (b) Griewank, (c) Stochastic Griewank, (d) Michalewicz, (e) Rosenbrock, (f) Stochastic Rosenbrock, (g) Trigonometric, and (h) Zakharov functions.

there is no published literature identifying it. GBCS identified a minimum that is lower than the one identified by CS, as depicted in Figure 4(d).

Although neither algorithm was able to identify the global optima of the Rosenbrock and Stochastic Rosenbrock functions, as depicted in Figures 4(e) and 4(f), respectively, the results obtained by GBCS were orders of magnitude lower than those obtained by CS. The superiority of GBCS is clearly demonstrated on these two functions.

Figure 4(g) shows the results for the Trigonometric function. For this difficult problem, neither algorithm was able to reach the global minimum within 5000 iterations. Nonetheless, as in most cases studied, GBCS achieved a lower result than CS.

The last of the 50-variable functions tested was the Zakharov function. The results in Figure 4(h) also show the superiority of GBCS, which reached results that are orders of magnitude lower than those obtained by the original CS algorithm.


Table 2 shows a summary of the evaluation results for the twenty-four benchmark problems. GBCS was able to provide better solutions to all the challenging problems: it achieved better performance in eighteen problems, equivalent performance in two problems, and worse performance in four of the twenty-four problems. In one case, the 50-variable Stochastic Griewank function, GBCS successfully obtained the global optimum while CS failed to find it even after 5000 iterations. The four functions for which CS performed better were the Powell, Wood, Cube, and Stochastic Cube functions.

It is interesting to note that GBCS outperformed CS in handling two of the three stochastic functions tested in this study. This result confirms that, despite the use of the gradient as a guide for some moves, the stochastic nature of the algorithm remained unaffected. In fact, the additional information was used quite subtly: cuckoos still search for nests randomly, but with the help of some intelligence.

6. Conclusions

In this study, we used the gradient of the objective function, which is available or easily obtainable for many objective functions in engineering calculations, to improve the performance of one of the most promising stochastic algorithms, the cuckoo search. The proposed modification was subtly implemented in the algorithm by changing the direction of the local random walk of cuckoos towards the direction of the minimum value as seen from the cuckoo's location. We evaluated this modification by attempting to find the global optima of twenty-four benchmark functions. The newly developed GBCS algorithm improved the reliability and effectiveness of the algorithm in all but four of the tested benchmark problems, which included three stochastic functions. In some cases, the global minimum could not be obtained with the original CS algorithm but was obtained with GBCS. Although improved performance was not achieved on the entire set of benchmark problems, GBCS proved more reliable and efficient on the majority of the tested problems.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

References

[1] C. A. Floudas and C. E. Gounaris, "A review of recent advances in global optimization," Journal of Global Optimization, vol. 45, no. 1, pp. 3-38, 2009.

[2] L. M. Rios and N. V. Sahinidis, "Derivative-free optimization: a review of algorithms and comparison of software implementations," Journal of Global Optimization, vol. 56, pp. 1247-1293, 2012.

[3] X.-S. Yang and S. Deb, "Cuckoo search via Lévy flights," in Proceedings of the World Congress on Nature & Biologically Inspired Computing (NaBIC '09), pp. 210-214, IEEE, Coimbatore, India, December 2009.

[4] X.-S. Yang and S. Deb, "Cuckoo search: recent advances and applications," Neural Computing and Applications, vol. 24, no. 1, pp. 169-174, 2014.

[5] J. Piechocki, D. Ambroziak, A. Palkowski, and G. Redlarski, "Use of modified cuckoo search algorithm in the design process of integrated power systems for modern and energy self-sufficient farms," Applied Energy, vol. 114, pp. 901-908, 2014.

[6] G. Kanagaraj, S. Ponnambalam, and N. Jawahar, "A hybrid cuckoo search and genetic algorithm for reliability-redundancy allocation problems," Computers & Industrial Engineering, vol. 66, no. 4, pp. 1115-1124, 2013.

[7] V. Bhargava, S. Fateen, and A. Bonilla-Petriciolet, "Cuckoo search: a new nature-inspired optimization method for phase equilibrium calculations," Fluid Phase Equilibria, vol. 337, pp. 191-200, 2013.

[8] P. K. Mohanty and D. R. Parhi, "Cuckoo search algorithm for the mobile robot navigation," in Swarm, Evolutionary, and Memetic Computing, pp. 527-536, Springer, New York, NY, USA, 2013.

[9] X.-S. Yang and S. Deb, "Engineering optimisation by cuckoo search," International Journal of Mathematical Modelling and Numerical Optimisation, vol. 1, no. 4, pp. 330-343, 2010.



begin
    Objective function f(x)
    Generate initial population of n host nests x_i
    while (t < MaxGeneration)
        Get a cuckoo randomly by Lévy flights
        Evaluate its quality/fitness F_i
        Choose a nest among n (say, j) randomly
        if (F_i > F_j)
            Replace j by the new solution
        end
        A fraction (1 - p_a) of the nests are abandoned at random and new ones are built via random walk
        Keep the best solutions
        Rank the solutions and find the current best
    end while
    Postprocess results
end

Algorithm 1: Simplified algorithm of the cuckoo search.
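The loop above, combined with the gradient-based modification evaluated in this paper (steering the local random walk of abandoned nests along the descent direction indicated by the gradient), can be sketched in Python. This is a minimal illustration, not the authors' MATLAB implementation: the population size, step-size constants eta and alpha, the Lévy exponent, and the Sphere test problem are assumptions of the sketch, and the walk here scales the step by the gradient rather than only taking its sign.

```python
import math
import random


def levy_step(beta=1.5):
    """One Levy-distributed step length (Mantegna's algorithm)."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = random.gauss(0, sigma)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)


def gbcs(f, grad, lo, hi, dim=2, n=25, p_a=0.25, iters=1000, alpha=0.01, eta=0.2):
    """Simplified gradient-based cuckoo search on the box [lo, hi]^dim."""
    clip = lambda v: min(hi, max(lo, v))
    nests = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    best = min(nests, key=f)
    for _ in range(iters):
        # Global move: a random cuckoo takes a Levy flight biased toward the best nest.
        i = random.randrange(n)
        cand = [clip(x + alpha * levy_step() * (x - b))
                for x, b in zip(nests[i], best)]
        j = random.randrange(n)
        if f(cand) < f(nests[j]):  # greedy replacement of a random nest
            nests[j] = cand
        # GBCS-style twist: abandoned nests walk downhill as seen from the nest,
        # using the gradient to orient the otherwise random step.
        for k in range(n):
            if random.random() < p_a:
                g = grad(nests[k])
                nests[k] = [clip(x - eta * random.random() * gk)
                            for x, gk in zip(nests[k], g)]
        best = min(nests + [best], key=f)
    return best


if __name__ == "__main__":
    random.seed(42)
    sphere = lambda x: sum(v * v for v in x)       # f15 from Table 1
    sphere_grad = lambda x: [2 * v for v in x]
    b = gbcs(sphere, sphere_grad, -100, 100)
    print(sphere(b))  # very close to 0
```

With the analytic Sphere gradient, every abandoned nest drifts toward the origin, which is exactly the kind of cheap extra guidance GBCS exploits when the gradient is available.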

original CS. These problems were chosen from a list of forty-one functions. All forty-one functions were screened first by performing five optimization runs with both CS and GBCS. Functions for which both CS and GBCS performed extremely well, with no significant difference in the results, were deemed unsuitable for comparison and were excluded from the evaluation; since CS is already a high-performance global optimizer, the excluded functions could not discriminate between the two algorithms. Table 1 lists the twenty-four benchmark functions used for the evaluation, along with their derivatives and search domains. The number of variables, the number of iterations used, and the value at the global minimum for each problem are shown with the results in Table 2.

Note that the benchmark functions used include three stochastic test functions. Most deterministic algorithms, such as the Nelder-Mead downhill simplex method, fail on such stochastic functions [9]. These stochastic functions are included in the present evaluation to ensure that the modification of the stochastic method not only preserves its stochastic nature and its ability to solve stochastic functions but also improves the performance of the minimization algorithm. The stochastic functions were developed by turning three deterministic functions (Cube, Rosenbrock, and Griewank) into stochastic test functions through the addition of a stochastic vector ε drawn from a uniform distribution in [0, 1] [9].
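As an illustration of this construction, the stochastic Rosenbrock function (f_22 in Table 1) is the deterministic Rosenbrock function with each squared-difference term weighted by a uniform random ε_i. Redrawing the ε vector on every call, as in this sketch, is an assumption; it makes the objective value itself noisy while leaving the minimizer at (1, ..., 1) unchanged.

```python
import random


def rosenbrock(x):
    # f21 from Table 1 (deterministic)
    return sum(100 * (x[i + 1] - x[i] ** 2) ** 2 + (x[i] - 1) ** 2
               for i in range(len(x) - 1))


def stochastic_rosenbrock(x, eps=None):
    # f22: each squared-difference term is weighted by epsilon_i ~ U[0, 1];
    # a fresh epsilon vector is drawn per call unless one is supplied.
    if eps is None:
        eps = [random.random() for _ in range(len(x) - 1)]
    return sum(100 * eps[i] * (x[i + 1] - x[i] ** 2) ** 2 + (x[i] - 1) ** 2
               for i in range(len(x) - 1))


print(stochastic_rosenbrock([1.0] * 5))  # 0.0 for any noise draw
```

Setting every ε_i to 1 recovers the deterministic function, which is a convenient sanity check.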

The twenty-four problems constitute a comprehensive test of the reliability and effectiveness of the suggested modification to the original CS. Eight functions have only two variables; their surface plots are shown in Figure 1.

Each test function was run thirty times in MATLAB on an HP Pavilion dv6 laptop with a Core i7 processor. The function value was recorded at every iteration, and the average and standard deviation over the thirty runs were calculated at every iteration. The maximum number of iterations was 1000 for functions with two variables and 5000 for functions with fifty variables. The population size of the cuckoo swarm was 100. The only parameter of the original CS, p_a, was kept constant at a value of 0.25. The MATLAB code for all the functions and their derivatives is available online at http://dx.doi.org/10.1155/2014/493740 as supplementary material to this paper.

5. Results and Discussion

As stated, each of the numerical experiments was repeated 30 times with different random seeds for GBCS and for the original CS algorithm. The value at each iteration of each trial was recorded, and the mean and standard deviation of the function values were calculated at each iteration. The progress of the mean values is presented in Figures 2-4 for each function, and a brief discussion of those results follows. The total CPU time for running the two algorithms 30 times on each problem varied with the problem's number of variables and the number of iterations used, ranging from 38.27 min for two-variable problems to 91.03 min for fifty-variable problems.
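The per-iteration statistics described here (mean and sample standard deviation of the best value across the repeated runs, at every iteration) amount to a column-wise aggregation of the recorded histories. A small sketch with made-up run data:

```python
import math


def mean_std_curves(runs):
    """Column-wise mean and sample standard deviation of best-value histories.

    runs: list of equal-length lists, one best-value-per-iteration history per run.
    """
    n = len(runs)
    means, stds = [], []
    for t in range(len(runs[0])):
        col = [r[t] for r in runs]
        m = sum(col) / n
        var = sum((v - m) ** 2 for v in col) / (n - 1)  # sample variance
        means.append(m)
        stds.append(math.sqrt(var))
    return means, stds


runs = [[4.0, 2.0, 1.0], [6.0, 4.0, 3.0]]  # two toy runs, three iterations each
means, stds = mean_std_curves(runs)
print(means)  # [5.0, 3.0, 2.0]
```

Plotting such mean curves on a logarithmic axis gives exactly the convergence figures discussed below.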

The Ackley function has only one minimum, which was obtained by both methods, as shown in Figure 2(a); however, GBCS was clearly more effective than CS, reaching the minimum in fewer iterations. The improvement in performance was also clear with the Beale function (Figure 2(b)), whose single minimum was obtained satisfactorily by both methods. This pattern of behavior was also observed for the Booth function: GBCS was significantly more effective than CS, reaching the global optimum within a tolerance of 10^-32 in less than half the number of iterations of CS, as shown in Figure 2(c). The first three functions are relatively easy to optimize, and their global optima were easily obtained.

The cross-leg table function is a difficult one to minimize; its value at the global minimum is -1. Neither algorithm was able to reach the global minimum. Yet

Table 1: Test functions used for testing the performance of CS and GBCS (number, name, function and its derivative, and search domain).

1. Ackley; search domain [-35, 35]:
f_1 = 20(1 - e^{-0.2\sqrt{0.5(x_1^2+x_2^2)}}) - e^{0.5(\cos 2\pi x_1 + \cos 2\pi x_2)} + e^1
\partial f_1/\partial x_k = 2 x_k e^{-0.2\sqrt{0.5(x_1^2+x_2^2)}} / \sqrt{0.5(x_1^2+x_2^2)} + \pi e^{0.5(\cos 2\pi x_1 + \cos 2\pi x_2)} \sin(2\pi x_k)

2. Beale; search domain [-4.5, 4.5]:
f_2 = (1.5 - x_1 + x_1 x_2)^2 + (2.25 - x_1 + x_1 x_2^2)^2 + (2.625 - x_1 + x_1 x_2^3)^2
\partial f_2/\partial x_1 = 2(x_2^2 - 1)(x_1 x_2^2 - x_1 + 2.25) + 2(x_2^3 - 1)(x_1 x_2^3 - x_1 + 2.625) + 2(x_2 - 1)(x_1 x_2 - x_1 + 1.5)
\partial f_2/\partial x_2 = 2 x_1 (x_1 x_2 - x_1 + 1.5) + 4 x_1 x_2 (x_1 x_2^2 - x_1 + 2.25) + 6 x_1 x_2^2 (x_1 x_2^3 - x_1 + 2.625)

3. Booth; search domain [-10, 10]:
f_3 = (x_1 + 2 x_2 - 7)^2 + (2 x_1 + x_2 - 5)^2
\partial f_3/\partial x_1 = 10 x_1 + 8 x_2 - 34
\partial f_3/\partial x_2 = 8 x_1 + 10 x_2 - 38

4. Cross-leg table; search domain [-10, 10]:
f_4 = -[|\sin(x_1)\sin(x_2) e^{|100 - \sqrt{x_1^2+x_2^2}/\pi|}| + 1]^{-0.1}
\partial f_4/\partial x_1 = 0.1 \sigma_1 (\mathrm{sign}[\sin(x_1)\sin(x_2)] \cos(x_1)\sin(x_2) + x_1 \mathrm{sign}(\sigma_2/\pi - 100) |\sin(x_1)\sin(x_2)| / (\pi \sigma_2)) / (\sigma_1 |\sin(x_1)\sin(x_2)| + 1)^{1.1}
\partial f_4/\partial x_2 = 0.1 \sigma_1 (\mathrm{sign}[\sin(x_1)\sin(x_2)] \cos(x_2)\sin(x_1) + x_2 \mathrm{sign}(\sigma_2/\pi - 100) |\sin(x_1)\sin(x_2)| / (\pi \sigma_2)) / (\sigma_1 |\sin(x_1)\sin(x_2)| + 1)^{1.1}
where \sigma_1 = e^{|\sigma_2/\pi - 100|} and \sigma_2 = \sqrt{x_1^2 + x_2^2}

5. Himmelblau; search domain [-5, 5]:
f_5 = (x_1^2 + x_2 - 11)^2 + (x_2^2 + x_1 - 7)^2
\partial f_5/\partial x_1 = 2 x_1 + 4 x_1 (x_1^2 + x_2 - 11) + 2 x_2^2 - 14
\partial f_5/\partial x_2 = 2 x_2 + 4 x_2 (x_2^2 + x_1 - 7) + 2 x_1^2 - 22

6. Levy 13; search domain [-10, 10]:
f_6 = \sin^2(3\pi x_1) + (x_1 - 1)^2 [1 + \sin^2(3\pi x_2)] + (x_2 - 1)^2 [1 + \sin^2(2\pi x_2)]
\partial f_6/\partial x_1 = (2 x_1 - 2)(\sin^2(3\pi x_2) + 1) + 6\pi \cos(3\pi x_1)\sin(3\pi x_1)
\partial f_6/\partial x_2 = (2 x_2 - 2)(\sin^2(2\pi x_2) + 1) + 4\pi \cos(2\pi x_2)\sin(2\pi x_2)(x_2 - 1)^2 + 6\pi \cos(3\pi x_2)\sin(3\pi x_2)(x_1 - 1)^2

7. Matyas; search domain [-10, 10]:
f_7 = 0.26 x_1^2 - 0.48 x_1 x_2 + 0.26 x_2^2
\partial f_7/\partial x_1 = 0.52 x_1 - 0.48 x_2
\partial f_7/\partial x_2 = 0.52 x_2 - 0.48 x_1

Table 1: Continued.

8. Schaffer; search domain [-100, 100]:
f_8 = 0.5 + (\sin^2\sqrt{x_1^2+x_2^2} - 0.5) / [0.001(x_1^2+x_2^2) + 1]^2
\partial f_8/\partial x_k = -x_k ((0.004 \sin^2\sigma - 0.002) / (0.001(x_1^2+x_2^2) + 1)^3 - 2 \cos\sigma \sin\sigma / (\sigma (0.001(x_1^2+x_2^2) + 1)^2)), where \sigma = \sqrt{x_1^2+x_2^2}

9. Powell; search domain [-1000, 1000]:
f_9 = (x_1 + 10 x_2)^2 + 5(x_3 - x_4)^2 + (x_2 - 2 x_3)^4 + 10(x_1 - x_4)^4
\partial f_9/\partial x_1 = 2 x_1 + 20 x_2 + 40(x_1 - x_4)^3
\partial f_9/\partial x_2 = 20 x_1 + 200 x_2 + 4(x_2 - 2 x_3)^3
\partial f_9/\partial x_3 = 10 x_3 - 10 x_4 - 8(x_2 - 2 x_3)^3
\partial f_9/\partial x_4 = 10 x_4 - 10 x_3 - 40(x_1 - x_4)^3

10. Power sum; search domain [0, 4]:
f_{10} = \sum_{i=1}^{m} [(\sum_{j=1}^{m} x_j^i) - b_i]^2, with b = [8, 18, 44, 114]
\partial f_{10}/\partial x_k = 2 \sum_{i=1}^{m} i x_k^{i-1} [(\sum_{j=1}^{m} x_j^i) - b_i]

11. Shekel 5; search domain [0, 10]:
f_{11} = -\sum_{i=1}^{m} 1 / (c_i + \sum_{j=1}^{n} (x_j - a_{ij})^2), with n = 4 and m = 10
\partial f_{11}/\partial x_k = \sum_{i=1}^{m} 2(x_k - a_{ik}) / [c_i + \sum_{j=1}^{n} (x_j - a_{ij})^2]^2

12. Wood; search domain [-1000, 1000]:
f_{12} = 100(x_1^2 - x_2)^2 + (x_1 - 1)^2 + (x_3 - 1)^2 + 90(x_3^2 - x_4)^2 + 10.1[(x_2 - 1)^2 + (x_4 - 1)^2] + 19.8(x_2 - 1)(x_4 - 1)
\partial f_{12}/\partial x_1 = 2 x_1 - 400 x_1 (x_2 - x_1^2) - 2
\partial f_{12}/\partial x_2 = -200 x_1^2 + 220.2 x_2 + 19.8 x_4 - 40
\partial f_{12}/\partial x_3 = 2 x_3 - 360 x_3 (x_4 - x_3^2) - 2
\partial f_{12}/\partial x_4 = -180 x_3^2 + 19.8 x_2 + 200.2 x_4 - 40

13. Cube; search domain [-100, 100]:
f_{13} = \sum_{i=1}^{m-1} 100(x_{i+1} - x_i^3)^2 + (1 - x_i)^2
\partial f_{13}/\partial x_1 = 2 x_1 - 600 x_1^2 (x_2 - x_1^3) - 2
\partial f_{13}/\partial x_k = 202 x_k - 600 x_k^2 (x_{k+1} - x_k^3) - 200 x_{k-1}^3 - 2
\partial f_{13}/\partial x_m = 200(x_m - x_{m-1}^3)

14. Stochastic Cube; search domain [-100, 100]:
f_{14} = \sum_{i=1}^{m-1} 100 \epsilon_i (x_{i+1} - x_i^3)^2 + (1 - x_i)^2
\partial f_{14}/\partial x_1 = 2 x_1 - 600 \epsilon_1 x_1^2 (x_2 - x_1^3) - 2
\partial f_{14}/\partial x_k = 2 x_k - 600 \epsilon_k x_k^2 (x_{k+1} - x_k^3) + 200 \epsilon_{k-1} (x_k - x_{k-1}^3) - 2
\partial f_{14}/\partial x_m = 200 \epsilon_{m-1} (x_m - x_{m-1}^3)

15. Sphere; search domain [-100, 100]:
f_{15} = \sum_{i=1}^{m} x_i^2
\partial f_{15}/\partial x_k = 2 x_k

16. Hartmann; search domain [0, 1]:
f_{16} = -\sum_{i=1}^{4} \alpha_i \exp(-\sum_{j=1}^{6} A_{ij} (x_j - P_{ij})^2)
\partial f_{16}/\partial x_k = \sum_{i=1}^{4} 2 \alpha_i A_{ik} (x_k - P_{ik}) \exp(-\sum_{j=1}^{6} A_{ij} (x_j - P_{ij})^2)

6 Mathematical Problems in Engineering

Table1Con

tinued

Num

ber

Nam

eFu

nctio

nandits

deriv

ative

Search

Dom

ain

17Dixon

-pric

e11989117=(1199091minus1)2+

119898 sum 119894=2

119894(21199092 119894minus119909119894minus1)2

12059711989117

1205971199091

=minus81199092 2+61199091minus2

12059711989117

120597119909119896

$\frac{\partial f_{17}}{\partial x_{k}}=(2x_{k}-4x_{k+1}^{2})(k+1)-8kx_{k}(x_{k-1}-2x_{k}^{2})$
$\frac{\partial f_{17}}{\partial x_{m}}=-8mx_{m}(x_{m-1}-2x_{m}^{2})$
Search domain: $[-10, 10]$

18. Griewank:
$f_{18}=\frac{1}{4000}\left[\sum_{i=1}^{m}(x_{i}-100)^{2}\right]-\left[\prod_{i=1}^{m}\cos\left(\frac{x_{i}-100}{\sqrt{i}}\right)\right]+1$
$\frac{\partial f_{18}}{\partial x_{k}}=\frac{x_{k}-100}{2000}+\frac{1}{\sqrt{k}}\tan\left(\frac{x_{k}-100}{\sqrt{k}}\right)\left[\prod_{i=1}^{m}\cos\left(\frac{x_{i}-100}{\sqrt{i}}\right)\right]$
Search domain: $[-600, 600]$

19. Stochastic Griewank:
$f_{19}=\frac{1}{4000}\left[\sum_{i=1}^{m}\varepsilon_{i}(x_{i}-100)^{2}\right]-\left[\prod_{i=1}^{m}\cos\left(\frac{x_{i}-100}{\sqrt{i}}\right)\right]+1$
$\frac{\partial f_{19}}{\partial x_{k}}=\frac{\varepsilon_{k}(x_{k}-100)}{2000}+\frac{1}{\sqrt{k}}\tan\left(\frac{x_{k}-100}{\sqrt{k}}\right)\left[\prod_{i=1}^{m}\cos\left(\frac{x_{i}-100}{\sqrt{i}}\right)\right]$
Search domain: $[-600, 600]$

20. Michalewicz:
$f_{20}=-\sum_{i=1}^{m}\sin(x_{i})\sin^{20}\left(\frac{ix_{i}^{2}}{\pi}\right)$
$\frac{\partial f_{20}}{\partial x_{k}}=-\cos(x_{k})\sin^{20}\left(\frac{kx_{k}^{2}}{\pi}\right)-\frac{40kx_{k}}{\pi}\cos\left(\frac{kx_{k}^{2}}{\pi}\right)\sin^{19}\left(\frac{kx_{k}^{2}}{\pi}\right)\sin(x_{k})$
Search domain: $[0, \pi]$

21. Rosenbrock:
$f_{21}=\sum_{i=1}^{m-1}\left[100(x_{i+1}-x_{i}^{2})^{2}+(x_{i}-1)^{2}\right]$
$\frac{\partial f_{21}}{\partial x_{1}}=2(x_{1}-1)-400x_{1}(x_{2}-x_{1}^{2})$
$\frac{\partial f_{21}}{\partial x_{k}}=-200x_{k-1}^{2}+202x_{k}-400x_{k}(x_{k+1}-x_{k}^{2})-2$
$\frac{\partial f_{21}}{\partial x_{m}}=200(x_{m}-x_{m-1}^{2})$
Search domain: $[-50, 50]$

22. Stochastic Rosenbrock:
$f_{22}=\sum_{i=1}^{m-1}\left[100\varepsilon_{i}(x_{i+1}-x_{i}^{2})^{2}+(x_{i}-1)^{2}\right]$
$\frac{\partial f_{22}}{\partial x_{1}}=2(x_{1}-1)-400\varepsilon_{1}x_{1}(x_{2}-x_{1}^{2})$
$\frac{\partial f_{22}}{\partial x_{k}}=2x_{k}+200\varepsilon_{k-1}(x_{k}-x_{k-1}^{2})-400\varepsilon_{k}x_{k}(x_{k+1}-x_{k}^{2})-2$
$\frac{\partial f_{22}}{\partial x_{m}}=200\varepsilon_{m-1}(x_{m}-x_{m-1}^{2})$
Search domain: $[-50, 50]$

23. Trigonometric:
$f_{23}=\sum_{i=1}^{m}\left[m+i(1-\cos x_{i})-\sin x_{i}-\sum_{j=1}^{m}\cos x_{j}\right]^{2}$
$\frac{\partial f_{23}}{\partial x_{k}}=2\left[m+k(1-\cos x_{k})-\sin x_{k}-\sum_{j=1}^{m}\cos x_{j}\right]\left[(k+1)\sin x_{k}-\cos x_{k}\right]$
Search domain: $[-1000, 1000]$

24. Zakharov:
$f_{24}=\sum_{i=1}^{m}x_{i}^{2}+\left(\sum_{i=1}^{m}0.5ix_{i}\right)^{2}+\left(\sum_{i=1}^{m}0.5ix_{i}\right)^{4}$
$\frac{\partial f_{24}}{\partial x_{k}}=2x_{k}+k\sum_{i=1}^{m}0.5ix_{i}+2k\left(\sum_{i=1}^{m}0.5ix_{i}\right)^{3}$
Search domain: $[-5, 10]$
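The analytic gradients listed in Table 1 can be checked numerically. The sketch below implements the Rosenbrock function $f_{21}$ and its gradient as given in the table and verifies the gradient against a central finite difference; the helper names are ours, not from the paper:

```python
import numpy as np

def rosenbrock(x):
    """f21 from Table 1: sum_{i=1}^{m-1} 100(x_{i+1} - x_i^2)^2 + (x_i - 1)^2."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (x[:-1] - 1.0) ** 2))

def rosenbrock_grad(x):
    """Analytic gradient assembled from the first, interior, and last cases in Table 1."""
    x = np.asarray(x, dtype=float)
    g = np.empty_like(x)
    g[0] = 2.0 * (x[0] - 1.0) - 400.0 * x[0] * (x[1] - x[0] ** 2)
    g[1:-1] = (-200.0 * x[:-2] ** 2 + 202.0 * x[1:-1]
               - 400.0 * x[1:-1] * (x[2:] - x[1:-1] ** 2) - 2.0)
    g[-1] = 200.0 * (x[-1] - x[-2] ** 2)
    return g

def central_diff_grad(f, x, h=1e-6):
    """Central finite-difference gradient, used only to validate the formulas."""
    x = np.asarray(x, dtype=float)
    g = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g
```

At the global minimum $x = (1, \ldots, 1)$ both the function value and every gradient component vanish, which is a quick sanity check on the signs in the table.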

Mathematical Problems in Engineering

Figure 1: Surface plots of the two-variable benchmark functions used in this study: (a) Ackley, (b) Beale, (c) Booth, (d) cross-leg table, (e) Himmelblau, (f) Levy 13, (g) Matyas, and (h) Schaffer functions.


Table 2: Mean minima and standard deviations obtained by the CS and GBCS algorithms, compared with the values of the global minima of the twenty-four benchmark problems.

No. | Benchmark function | Variables | Global min | Iterations | GBCS mean | GBCS std. dev. | CS mean | CS std. dev.
1 | Ackley | 2 | 0 | 1000 | 0 | 0 | 2.2204E-16 | 6.7752E-16
2 | Beale | 2 | 0 | 1000 | 0 | 0 | 5.7891E-30 | 1.2165E-29
3 | Booth | 2 | 0 | 1000 | 0 | 0 | 0 | 0
4 | Cross-leg table | 2 | -1 | 1000 | -1.1463E-2 | 7.672E-3 | -6.2704E-3 | 3.6529E-3
5 | Himmelblau | 2 | 0 | 1000 | 1.7058E-28 | 2.836E-28 | 2.5958E-19 | 5.3451E-19
6 | Levy 13 | 2 | 0 | 1000 | 1.3498E-31 | 6.6809E-47 | 1.3498E-31 | 6.6809E-47
7 | Matyas | 2 | 0 | 1000 | 2.7691E-54 | 4.728E-54 | 2.0407E-38 | 5.0616E-38
8 | Schaffer | 2 | 0 | 3000 | 0 | 0 | 7.4015E-18 | 1.9193E-17
9 | Powell | 4 | 0 | 1000 | 1.8694E-8 | 3.5848E-8 | 1.6296E-13 | 3.4802E-13
10 | Power sum | 4 | 0 | 1000 | 1.8328E-4 | 1.6761E-4 | 2.5432E-4 | 1.8167E-4
11 | Shekel 5 | 4 | -10.536 | 200 | -10.536 | 1.6289E-5 | -10.536 | 1.8421E-2
12 | Wood | 4 | 0 | 1000 | 2.3726 | 2.2208 | 0.40838 | 0.337
13 | Cube | 5 | 0 | 5000 | 1.2567 | 0.86542 | 5.782E-8 | 2.5596E-7
14 | Stochastic Cube | 5 | 0 | 5000 | 7.7438 | 6.9815 | 6.4369 | 5.0292
15 | Sphere | 5 | 0 | 1000 | 2.5147E-38 | 5.1577E-38 | 1.1371E-21 | 1.2967E-21
16 | Hartmann | 6 | -3.3224 | 200 | -3.3224 | 4.3959E-10 | -3.3215 | 6.0711E-4
17 | Dixon-Price | 50 | 0 | 5000 | 4.7094E-2 | 1.6904E-1 | 6.6667E-1 | 2.6103E-6
18 | Griewank | 50 | 0 | 5000 | 0 | 0 | 3.3651E-10 | 9.4382E-10
19 | Stochastic Griewank | 50 | 0 | 5000 | 7.2758E-13 | 2.8579E-12 | 6.9263 | 2.0451
20 | Michalewicz | 50 | unknown | 5000 | -32.263 | 1.3729 | -27.383 | 1.3551
21 | Rosenbrock | 50 | 0 | 5000 | 0.97368 | 0.5885 | 35.286 | 37.012
22 | Stochastic Rosenbrock | 50 | 0 | 5000 | 58.599 | 19.122 | 48944 | 46783
23 | Trigonometric | 50 | 0 | 5000 | 5356.0 | 4536.0 | 19435 | 40337
24 | Zakharov | 50 | 0 | 5000 | 65.769 | 13.288 | 27031 | 52748

GBCS performed significantly better than CS, as shown in Figure 2(d). On the other hand, both algorithms were able to identify the global minimum of the Himmelblau function. Figure 2(e) shows the evolution of the mean best values; GBCS performed more effectively than CS. Both algorithms were also able to identify the minimum of the Levy 13 function (Figure 2(f)), but again GBCS was significantly more effective than CS. This pattern was repeated with the Matyas function, as shown in Figure 2(g).

The Schaffer function is multimodal. Both GBCS and CS failed to converge to the global minimum within the 10^-10 tolerance at 1000 iterations. Running both algorithms for 3000 iterations resulted in GBCS reaching the global optimum while CS did not, as shown in Figure 2(h). The Schaffer function concludes the two-variable functions; GBCS performed better than CS in all of them.

The Powell function has four variables. The evolution of the mean best values of CS and GBCS, plotted in Figure 3(a), shows that the performance of CS was better than that of GBCS, although both came close to the global minimum. Powell is one of the few test functions for which the use of the gradient did not improve the performance of the CS algorithm.

Minor improvements were observed with the power sum and Shekel 5 functions. The global optimum of the power sum function was not obtained by either algorithm, as shown in Figure 3(b); both appear to be trapped in a local minimum. For the Shekel 5 function (Figure 3(c)), the global optimum was easily found, with a minor improvement in performance for the GBCS algorithm. The global optimum of the Wood function, which also has four variables, was not reached by either algorithm within 1000 iterations, as shown in Figure 3(d). When they were run for 5000 iterations, GBCS seemed to be trapped in a local minimum and was not able to reach the global optimum, which was obtained by CS. The Wood function is the only function for which GBCS performed much worse than the original CS algorithm.

The performance of both algorithms on the Cube and Stochastic Cube functions was peculiar. In both cases GBCS outperformed CS in the early iterations, while CS did better at larger iterations. For the two functions, as depicted in Figures 3(e) and 3(f), the global minimum was not obtained within 1000 iterations, so both problems were run for 5000. CS was able to come close to the global minimum of the Cube function, while GBCS seemed to be trapped in a local minimum. For


Figure 2: Evolution of mean best values for GBCS and the original CS algorithm for (a) Ackley, (b) Beale, (c) Booth, (d) cross-leg table, (e) Himmelblau, (f) Levy 13, (g) Matyas, and (h) Schaffer functions.

the Stochastic Cube function, CS did slightly better at larger iterations, but both algorithms failed to reach the global minimum after 5000 iterations.

The Sphere function has five variables, like the Cube function, but it is easier to solve. Both algorithms were able to identify the global optimum, as depicted in Figure 3(g), but GBCS considerably outperformed CS in terms of efficiency. The Hartmann function, which has six variables, was relatively easy for the GBCS algorithm, as shown in Figure 3(h). GBCS arrived at the global minimum, but CS could not do so within the tolerance. GBCS also outperformed CS in efficiency in reaching the global optimum of the Hartmann function.

Figure 4 shows the performance on the test functions with 50 variables. These are the most challenging problems due to the large domain space. Figure 4(a) depicts the performance of


Figure 3: Evolution of mean best values for GBCS and the original CS algorithm for (a) Powell, (b) Power sum, (c) Shekel 5, (d) Wood, (e) Cube, (f) Stochastic Cube, (g) Sphere, and (h) Hartmann functions.

both algorithms for the Dixon-Price function. Clearly, GBCS performed better than CS in the early iterations. Even though neither algorithm was able to attain the global minimum, the minimum arrived at by GBCS in the early iterations is orders of magnitude lower than that arrived at by CS.

GBCS outperformed CS on both the Griewank (Figure 4(b)) and the Stochastic Griewank (Figure 4(c)) functions. For the Griewank function, both algorithms were able to identify the global minimum; however, GBCS arrived at the global minimum in less than half the number of iterations required by CS. For the Stochastic Griewank, CS was not able to identify the global minimum even after 5000 iterations; the result obtained by CS was more than ten orders of magnitude higher than that obtained by GBCS.
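The Stochastic Griewank function $f_{19}$ is cheap to reproduce for experimentation. In the sketch below the noise terms $\varepsilon_i$ are redrawn on every evaluation; their distribution is not specified in this excerpt, so uniform on $[0, 1]$ is an assumption:

```python
import numpy as np

def stochastic_griewank(x, rng):
    """f19 from Table 1 with fresh noise eps_i drawn per evaluation.
    The eps_i ~ U(0, 1) noise model is an assumption, not stated in the excerpt."""
    x = np.asarray(x, dtype=float)
    i = np.arange(1, x.size + 1)
    eps = rng.random(x.size)  # assumed noise model
    return float(np.sum(eps * (x - 100.0) ** 2) / 4000.0
                 - np.prod(np.cos((x - 100.0) / np.sqrt(i))) + 1.0)
```

Whatever the noise draw, the value at $x_i = 100$ is exactly zero, since the noisy quadratic term vanishes and the cosine product equals one; this is why a mean best value of 7.2758E-13 indicates a genuine near-optimum.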

Figure 4(d) shows the results for the Michalewicz function. To the best of the authors' knowledge, the global minimum of the 50-variable Michalewicz function is not known, since


Figure 4: Evolution of mean best values for GBCS and the original CS algorithm for (a) Dixon-Price, (b) Griewank, (c) Stochastic Griewank, (d) Michalewicz, (e) Rosenbrock, (f) Stochastic Rosenbrock, (g) Trigonometric, and (h) Zakharov functions.

there is no published literature identifying it. GBCS identified a minimum that is lower than that identified by CS, as depicted in Figure 4(d).

Although neither algorithm was able to identify the global optima of the Rosenbrock and Stochastic Rosenbrock functions, as depicted in Figures 4(e) and 4(f), respectively, the results obtained by GBCS were orders of magnitude lower than those obtained by CS. The superiority of GBCS is clearly demonstrated with those two functions.

Figure 4(g) shows the results for the Trigonometric function. For this difficult problem, neither algorithm was able to reach the global minimum within 5000 iterations. Nonetheless, as in most of the cases studied, GBCS achieved a lower result than CS.

The last of the 50-variable functions tested was the Zakharov function. The results in Figure 4(h) also show the superiority of GBCS, which reached results that are orders of magnitude lower than those obtained by the original CS algorithm.


Table 2 summarizes the evaluation results for the twenty-four benchmark problems. GBCS provided better solutions to all of the challenging problems: it achieved better performance in eighteen problems, equivalent performance in two, and worse performance in four of the twenty-four problems. In one case, the 50-variable Stochastic Griewank function, GBCS successfully obtained the global optimum while CS failed to find it even after 5000 iterations. On the other hand, the four functions for which CS performed better were the Powell, Wood, Cube, and Stochastic Cube functions.

It is interesting to note that GBCS outperformed CS in handling two of the three stochastic functions tested in this study. This result confirms that, despite the use of the gradient as a guide upon which some moves are based, the stochastic nature of the algorithm remained unaffected. In fact, the additional information was used quite subtly: cuckoos still search for nests randomly, but with the help of some intelligence.
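This idea can be sketched in a few lines. The fragment below is our illustration of a gradient-guided random walk, not the authors' exact update rule: the step length stays random, but each component of the step is pointed against the sign of the gradient, that is, downhill as seen from the cuckoo's current position:

```python
import numpy as np

def guided_local_step(x, grad, rng, step_size=0.05):
    """Take a random-length step whose per-component direction is downhill.
    The walk stays stochastic while being biased toward lower function values
    (an illustrative sketch of the GBCS idea, not the published update)."""
    random_step = step_size * rng.standard_normal(x.shape)
    return x - np.abs(random_step) * np.sign(grad)

# Demonstration on the Sphere function f15, whose gradient is 2x.
rng = np.random.default_rng(0)
x = np.array([3.0, -2.0])
for _ in range(300):
    x = guided_local_step(x, 2.0 * x, rng)
```

Because only the sign of the gradient is used, the magnitude of the random move, and hence the stochastic character of the search, is unchanged.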

6. Conclusions

In this study, we used the gradient of the objective function, which is available or easily obtainable for many objective functions in engineering calculations, to improve the performance of one of the most promising stochastic algorithms, the cuckoo search. The proposed modification was subtly implemented in the algorithm by changing the direction of the local random walk of cuckoos toward the minimum value as seen from the cuckoo's location. We evaluated this modification by attempting to find the global optima of twenty-four benchmark functions. The newly developed GBCS algorithm improved the reliability and effectiveness of the algorithm in all but four of the tested benchmark problems, which included three stochastic functions. In some cases, the global minimum could not be obtained with the original CS algorithm but was obtained with GBCS. Although the improvement in performance was not achieved over the entire set of benchmark problems, GBCS proved to be more reliable and efficient for the majority of the tested problems.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

References

[1] C. A. Floudas and C. E. Gounaris, "A review of recent advances in global optimization," Journal of Global Optimization, vol. 45, no. 1, pp. 3-38, 2009.
[2] L. M. Rios and N. V. Sahinidis, "Derivative-free optimization: a review of algorithms and comparison of software implementations," Journal of Global Optimization, vol. 56, pp. 1247-1293, 2012.
[3] X. S. Yang and S. Deb, "Cuckoo search via Levy flights," in Proceedings of the World Congress on Nature & Biologically Inspired Computing (NaBIC '09), pp. 210-214, IEEE, Coimbatore, India, December 2009.
[4] X.-S. Yang and S. Deb, "Cuckoo search: recent advances and applications," Neural Computing and Applications, vol. 24, no. 1, pp. 169-174, 2014.
[5] J. Piechocki, D. Ambroziak, A. Palkowski, and G. Redlarski, "Use of modified cuckoo search algorithm in the design process of integrated power systems for modern and energy self-sufficient farms," Applied Energy, vol. 114, pp. 901-908, 2014.
[6] G. Kanagaraj, S. Ponnambalam, and N. Jawahar, "A hybrid cuckoo search and genetic algorithm for reliability-redundancy allocation problems," Computers & Industrial Engineering, vol. 66, no. 4, pp. 1115-1124, 2013.
[7] V. Bhargava, S. Fateen, and A. Bonilla-Petriciolet, "Cuckoo search: a new nature-inspired optimization method for phase equilibrium calculations," Fluid Phase Equilibria, vol. 337, pp. 191-200, 2013.
[8] P. K. Mohanty and D. R. Parhi, "Cuckoo search algorithm for the mobile robot navigation," in Swarm, Evolutionary, and Memetic Computing, pp. 527-536, Springer, New York, NY, USA, 2013.
[9] X. S. Yang and S. Deb, "Engineering optimisation by cuckoo search," International Journal of Mathematical Modelling and Numerical Optimisation, vol. 1, no. 4, pp. 330-343, 2010.


Table 1: Test functions used for testing the performance of CS and GBCS (number, name, function and its derivative, and search domain).

1. Ackley:
$f_{1}=20\left(1-e^{-0.2\sqrt{0.5(x_{1}^{2}+x_{2}^{2})}}\right)-e^{0.5(\cos 2\pi x_{1}+\cos 2\pi x_{2})}+e^{1}$
$\frac{\partial f_{1}}{\partial x_{k}}=\frac{2x_{k}e^{-0.2\sqrt{0.5(x_{1}^{2}+x_{2}^{2})}}}{\sqrt{0.5(x_{1}^{2}+x_{2}^{2})}}+\pi e^{0.5(\cos 2\pi x_{1}+\cos 2\pi x_{2})}\sin(2\pi x_{k})$
Search domain: $[-35, 35]$

2. Beale:
$f_{2}=(1.5-x_{1}+x_{1}x_{2})^{2}+(2.25-x_{1}+x_{1}x_{2}^{2})^{2}+(2.625-x_{1}+x_{1}x_{2}^{3})^{2}$
$\frac{\partial f_{2}}{\partial x_{1}}=2(x_{2}^{2}-1)(x_{1}x_{2}^{2}-x_{1}+2.25)+2(x_{2}^{3}-1)(x_{1}x_{2}^{3}-x_{1}+2.625)+2(x_{2}-1)(x_{1}x_{2}-x_{1}+1.5)$
$\frac{\partial f_{2}}{\partial x_{2}}=2x_{1}(x_{1}x_{2}-x_{1}+1.5)+4x_{1}x_{2}(x_{1}x_{2}^{2}-x_{1}+2.25)+6x_{1}x_{2}^{2}(x_{1}x_{2}^{3}-x_{1}+2.625)$
Search domain: $[-4.5, 4.5]$

3. Booth:
$f_{3}=(x_{1}+2x_{2}-7)^{2}+(2x_{1}+x_{2}-5)^{2}$
$\frac{\partial f_{3}}{\partial x_{1}}=10x_{1}+8x_{2}-34$, $\frac{\partial f_{3}}{\partial x_{2}}=8x_{1}+10x_{2}-38$
Search domain: $[-10, 10]$

4. Cross-leg table:
$f_{4}=-\left[\left|\sin(x_{1})\sin(x_{2})e^{\left|100-\sqrt{x_{1}^{2}+x_{2}^{2}}/\pi\right|}\right|+1\right]^{-0.1}$
$\frac{\partial f_{4}}{\partial x_{1}}=\frac{0.1\sigma_{1}\left(\operatorname{sign}\left[\sin(x_{1})\sin(x_{2})\right]\cos(x_{1})\sin(x_{2})+x_{1}\operatorname{sign}(\sigma_{2}/\pi-100)\left|\sin(x_{1})\sin(x_{2})\right|/(\pi\sigma_{2})\right)}{\left(\sigma_{1}\left|\sin(x_{1})\sin(x_{2})\right|+1\right)^{1.1}}$
$\frac{\partial f_{4}}{\partial x_{2}}=\frac{0.1\sigma_{1}\left(\operatorname{sign}\left[\sin(x_{1})\sin(x_{2})\right]\cos(x_{2})\sin(x_{1})+x_{2}\operatorname{sign}(\sigma_{2}/\pi-100)\left|\sin(x_{1})\sin(x_{2})\right|/(\pi\sigma_{2})\right)}{\left(\sigma_{1}\left|\sin(x_{1})\sin(x_{2})\right|+1\right)^{1.1}}$
where $\sigma_{1}=e^{\left|\sigma_{2}/\pi-100\right|}$ and $\sigma_{2}=\sqrt{x_{1}^{2}+x_{2}^{2}}$.
Search domain: $[-10, 10]$

5. Himmelblau:
$f_{5}=(x_{1}^{2}+x_{2}-11)^{2}+(x_{2}^{2}+x_{1}-7)^{2}$
$\frac{\partial f_{5}}{\partial x_{1}}=2x_{1}+4x_{1}(x_{1}^{2}+x_{2}-11)+2x_{2}^{2}-14$
$\frac{\partial f_{5}}{\partial x_{2}}=2x_{2}+4x_{2}(x_{2}^{2}+x_{1}-7)+2x_{1}^{2}-22$
Search domain: $[-5, 5]$

6. Levy 13:
$f_{6}=\sin^{2}(3\pi x_{1})+(x_{1}-1)^{2}\left[1+\sin^{2}(3\pi x_{2})\right]+(x_{2}-1)^{2}\left[1+\sin^{2}(2\pi x_{2})\right]$
$\frac{\partial f_{6}}{\partial x_{1}}=(2x_{1}-2)\left(\sin^{2}(3\pi x_{2})+1\right)+6\pi\cos(3\pi x_{1})\sin(3\pi x_{1})$
$\frac{\partial f_{6}}{\partial x_{2}}=(2x_{2}-2)\left(\sin^{2}(2\pi x_{2})+1\right)+4\pi\cos(2\pi x_{2})\sin(2\pi x_{2})(x_{2}-1)^{2}+6\pi\cos(3\pi x_{2})\sin(3\pi x_{2})(x_{1}-1)^{2}$
Search domain: $[-10, 10]$

7. Matyas:
$f_{7}=0.26x_{1}^{2}-0.48x_{1}x_{2}+0.26x_{2}^{2}$
$\frac{\partial f_{7}}{\partial x_{1}}=0.52x_{1}-0.48x_{2}$, $\frac{\partial f_{7}}{\partial x_{2}}=0.52x_{2}-0.48x_{1}$
Search domain: $[-10, 10]$

8. Schaffer:
$f_{8}=0.5+\frac{\sin^{2}\sqrt{x_{1}^{2}+x_{2}^{2}}-0.5}{\left[0.001(x_{1}^{2}+x_{2}^{2})+1\right]^{2}}$
$\frac{\partial f_{8}}{\partial x_{k}}=-x_{k}\left(\frac{0.004\sin^{2}\sigma-0.002}{\left(0.001(x_{1}^{2}+x_{2}^{2})+1\right)^{3}}-\frac{2\cos\sigma\sin\sigma}{\sigma\left(0.001(x_{1}^{2}+x_{2}^{2})+1\right)^{2}}\right)$, where $\sigma=\sqrt{x_{1}^{2}+x_{2}^{2}}$
Search domain: $[-100, 100]$

9. Powell:
$f_{9}=(x_{1}+10x_{2})^{2}+5(x_{3}-x_{4})^{2}+(x_{2}-2x_{3})^{4}+10(x_{1}-x_{4})^{4}$
$\frac{\partial f_{9}}{\partial x_{1}}=2x_{1}+20x_{2}+40(x_{1}-x_{4})^{3}$, $\frac{\partial f_{9}}{\partial x_{2}}=20x_{1}+200x_{2}+4(x_{2}-2x_{3})^{3}$
$\frac{\partial f_{9}}{\partial x_{3}}=10x_{3}-10x_{4}-8(x_{2}-2x_{3})^{3}$, $\frac{\partial f_{9}}{\partial x_{4}}=10x_{4}-10x_{3}-40(x_{1}-x_{4})^{3}$
Search domain: $[-1000, 1000]$

10. Power sum:
$f_{10}=\sum_{i=1}^{m}\left[\left(\sum_{j=1}^{m}x_{j}^{i}\right)-b_{i}\right]^{2}$, $b=[8, 18, 44, 114]$
$\frac{\partial f_{10}}{\partial x_{k}}=2\sum_{i=1}^{m}ix_{k}^{i-1}\left[\left(\sum_{j=1}^{m}x_{j}^{i}\right)-b_{i}\right]$
Search domain: $[0, 4]$

11. Shekel 5:
$f_{11}=-\sum_{i=1}^{m}\frac{1}{c_{i}+\sum_{j=1}^{n}(x_{j}-a_{ij})^{2}}$, $m=10$
$\frac{\partial f_{11}}{\partial x_{k}}=\sum_{i=1}^{m}\frac{2(x_{k}-a_{ik})}{\left[c_{i}+\sum_{j=1}^{n}(x_{j}-a_{ij})^{2}\right]^{2}}$
Search domain: $[0, 10]$

12. Wood:
$f_{12}=100(x_{1}^{2}-x_{2})^{2}+(x_{1}-1)^{2}+(x_{3}-1)^{2}+90(x_{3}^{2}-x_{4})^{2}+10.1\left[(x_{2}-1)^{2}+(x_{4}-1)^{2}\right]+19.8(x_{2}-1)(x_{4}-1)$
$\frac{\partial f_{12}}{\partial x_{1}}=2x_{1}-400x_{1}(x_{2}-x_{1}^{2})-2$, $\frac{\partial f_{12}}{\partial x_{2}}=-200x_{1}^{2}+220.2x_{2}+19.8x_{4}-40$
$\frac{\partial f_{12}}{\partial x_{3}}=2x_{3}-360x_{3}(x_{4}-x_{3}^{2})-2$, $\frac{\partial f_{12}}{\partial x_{4}}=-180x_{3}^{2}+19.8x_{2}+200.2x_{4}-40$
Search domain: $[-1000, 1000]$

13. Cube:
$f_{13}=\sum_{i=1}^{m-1}\left[100(x_{i+1}-x_{i}^{3})^{2}+(1-x_{i})^{2}\right]$
$\frac{\partial f_{13}}{\partial x_{1}}=2x_{1}-600x_{1}^{2}(x_{2}-x_{1}^{3})-2$
$\frac{\partial f_{13}}{\partial x_{k}}=202x_{k}-600x_{k}^{2}(x_{k+1}-x_{k}^{3})-200x_{k-1}^{3}-2$
$\frac{\partial f_{13}}{\partial x_{m}}=200(x_{m}-x_{m-1}^{3})$
Search domain: $[-100, 100]$

14. Stochastic Cube:
$f_{14}=\sum_{i=1}^{m-1}\left[100\varepsilon_{i}(x_{i+1}-x_{i}^{3})^{2}+(1-x_{i})^{2}\right]$
$\frac{\partial f_{14}}{\partial x_{1}}=2x_{1}-600\varepsilon_{1}x_{1}^{2}(x_{2}-x_{1}^{3})-2$
$\frac{\partial f_{14}}{\partial x_{k}}=2x_{k}-600\varepsilon_{k}x_{k}^{2}(x_{k+1}-x_{k}^{3})+200\varepsilon_{k-1}(x_{k}-x_{k-1}^{3})-2$
$\frac{\partial f_{14}}{\partial x_{m}}=200\varepsilon_{m-1}(x_{m}-x_{m-1}^{3})$
Search domain: $[-100, 100]$

15. Sphere:
$f_{15}=\sum_{i=1}^{m}x_{i}^{2}$, $\frac{\partial f_{15}}{\partial x_{k}}=2x_{k}$
Search domain: $[-100, 100]$

16. Hartmann:
$f_{16}=-\sum_{i=1}^{4}\alpha_{i}\exp\left(-\sum_{j=1}^{6}A_{ij}(x_{j}-P_{ij})^{2}\right)$
$\frac{\partial f_{16}}{\partial x_{k}}=\sum_{i=1}^{4}2\alpha_{i}A_{ik}(x_{k}-P_{ik})\exp\left(-\sum_{j=1}^{6}A_{ij}(x_{j}-P_{ij})^{2}\right)$
Search domain: $[0, 1]$
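As a concrete check on the two-variable entries in Table 1, the sketch below evaluates the Himmelblau function $f_{5}$ and its tabulated gradient at the known minimizer $(3, 2)$, where both must vanish (a quick verification of ours, not part of the original paper):

```python
import numpy as np

def himmelblau(x1, x2):
    """f5 from Table 1."""
    return (x1 ** 2 + x2 - 11.0) ** 2 + (x2 ** 2 + x1 - 7.0) ** 2

def himmelblau_grad(x1, x2):
    """Analytic gradient exactly as listed in Table 1."""
    g1 = 2.0 * x1 + 4.0 * x1 * (x1 ** 2 + x2 - 11.0) + 2.0 * x2 ** 2 - 14.0
    g2 = 2.0 * x2 + 4.0 * x2 * (x2 ** 2 + x1 - 7.0) + 2.0 * x1 ** 2 - 22.0
    return np.array([g1, g2])
```

At $(3, 2)$ both residuals $x_{1}^{2}+x_{2}-11$ and $x_{2}^{2}+x_{1}-7$ are zero, so $f_{5}=0$ and the remaining gradient terms cancel exactly.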

6 Mathematical Problems in Engineering

Table1Con

tinued

Num

ber

Nam

eFu

nctio

nandits

deriv

ative

Search

Dom

ain

17Dixon

-pric

e11989117=(1199091minus1)2+

119898 sum 119894=2

119894(21199092 119894minus119909119894minus1)2

12059711989117

1205971199091

=minus81199092 2+61199091minus2

12059711989117

120597119909119896

=(2119909119896minus41199092 119896+1)(119896+1)minus8119896119909119896(119909119896minus1minus21199092 119896)12059711989117

120597119909119898

=minus8119898119909119898(119909119898minus1minus21199092 119898)

[minus1010]

18Grie

wank

11989118=

1

4000[

119898 sum 119894=1

(119909119894minus100)2]minus[

119898 prod 119894=1

cos(119909119894minus100

radic119894

)]+1

12059711989118

120597119909119896

=(119909119896minus100)

2000

+1 radic119894tan(119909119896minus100

radic119894

)[

119898 prod 119894=1

cos(119909119894minus100

radic119894

)]

[minus600600]

19Grie

wanksto

chastic

11989119=

1

4000[

119898 sum 119894=1

120576 119894(119909119894minus100)2]minus[

119898 prod 119894=1

cos(119909119894minus100

radic119894

)]+1

12059711989118

120597119909119896

=120576 119896(119909119896minus100)

2000

+1 radic119894tan(119909119896minus100

radic119894

)[

119898 prod 119894=1

cos(119909119894minus100

radic119894

)]

[minus600600]

20Michaelw

icz

11989120=minus

119898 sum 119894=1

sin(119909119894)sin20

(1198941199092 119894

120587)

12059711989120

120597119909119896

=minuscos (119896minus119909)sin(1198961199092 119896

120587)

20

minus40119896119909119896

120587cos(1198961199092 119896

120587)sin(1198961199092 119896

120587)

19

sin(119909119896)

[0120587]

21Ro

senb

rock

11989121=

119898minus1

sum 119894=1

100(119909119894+1minus1199092 119894)2

+(119909119894minus1)212059711989121

1205971199091

=2(1199091minus1)minus4001199091(1199092minus1199092 1)

12059711989121

120597119909119896

=minus2001199092 119896minus1+202119909119896minus400119909119896(119909119896+1minus1199092 119896)minus212059711989121

120597119909119898

=200(119909119898minus1199092 119898minus1)

[minus5050]

22StochasticR

osenbrock

11989122=

119898minus1

sum 119894=1

100120576 119894(119909119894+1minus1199092 119894)2

+(119909119894minus1)212059711989122

1205971199091

=2(1199091minus1)minus400120576 11199091(1199092minus1199092 1)

12059711989122

120597119909119896

=2119909119896+200120576 119896(119909119896minus1199092 119896minus1)minus400120576 119896119909119896(119909119896+1minus1199092 119896)minus212059711989122

120597119909119898

=200120576 119894(119909119898minus1199092 119898minus1)

[minus5050]

23Trigon

ometric

11989123=

119898 sum 119894=1

[119898+119894(1minuscos119909119894)minussin119909119894minus

119898 sum 119895=1

cos119909119895]

2

12059711989123

120597119909119896

=2[119898+119896(1minuscos119909119894)minussin119909119894minus

119898 sum 119895=1

cos119909119895][ (119894+1)sin119909119896minuscos119909119896]

[minus10001000]

24Za

charov

11989124=

119898 sum 119894=1

1199092 119894+(

119898 sum 119894=1

05119894119909119894)

2

+(

119898 sum 119894=1

05119894119909119894)

4

12059711989124

120597119909119896

=2119909119896+119896

119898 sum 119894=1

05119894119909119894+2119896(

119898 sum 119894=1

05119894119909119894)

3

[minus510]

Mathematical Problems in Engineering 7

020

400

20400

5

10

15

20

xy minus20 minus20minus40 minus40

f(xy)

(a)

05

0

50

05

1

15

2

xy

f(xy)

minus5 minus5

times105

(b)

05

100

510

0500

10001500200025003000

xy

f(xy)

minus5 minus5minus10 minus10

(c)

x

0 05

10

yminus5 minus5

minus1

minus08

minus06

minus04

minus02

0

minus10 minus10

f(xy)

510

(d)

0

200

400

600

800

1000

xy

f(xy)

0 0

5

minus5 minus5

5

(e)

xy

0

100

200

300

400

f(xy)

550 0

1010

minus5 minus5minus10 minus10

(f)

xy

0

40

20

60

80

100

f(xy)

0 05

510

10

minus5 minus5minus10 minus10

(g)

y x

550 0

1010

minus5 minus5minus10 minus10

0

0402

0608

1

f(xy)

(h)

Figure 1 Surface plots of the two-variable benchmark functions used in this study (a) Ackley (b) Beale (c) Booth (d) cross-leg table (e)Himmelblau (f) Levy 13 (g) Matyas and (h) Schaffer functions

8 Mathematical Problems in Engineering

Table 2 Values of the meanminima and standard deviations obtained by the CS and GBCS algorithms compared with the value of the globalminima of the twenty-four benchmark problems

Number Benchmarkfunction

Number ofvariables

Globalmin

Number ofiterations

GBCS CSMean Std Dev Mean Std Dev

1 Ackley 2 0 1000 0 0 22204E-16 67752E-162 Beale 2 0 1000 0 0 57891E-30 12165E-293 Booth 2 0 1000 0 0 0 04 Cross-leg table 2 minus1 1000 minus11463E-2 7672E-3 minus62704E-3 36529E-35 Himmelblau 2 0 1000 17058E-28 2836E-28 25958E-19 53451E-196 Levy 13 2 0 1000 13498E-31 66809E-47 13498E-31 66809E-477 Matyas 2 0 1000 27691E-54 4728E-54 20407E-38 50616E-388 Schaffer 2 0 3000 0 0 74015E-18 19193E-179 Powell 4 0 1000 18694E-8 35848E-8 16296E-13 34802E-1310 Power sum 4 0 1000 18328E-4 16761E-4 25432E-4 18167E-411 Shekel 5 4 minus10536 200 minus10536 16289E-5 minus10536 18421E-212 Wood 4 0 1000 23726 22208 040838 033713 Cube 5 0 5000 12567 086542 5782E-8 25596E-714 Stochastic cube 5 0 5000 77438 69815 64369 5029215 Sphere 5 0 1000 25147E-38 51577E-38 11371E-21 12967E-2116 Hartmann 6 minus33224 200 minus33224 43959E-10 minus33215 60711E-417 Dixon-price 50 0 5000 47094E-2 16904E-1 66667E-1 26103E-618 Griewank 50 0 5000 0 0 33651E-10 94382E-10

19 StochasticGriewank 50 0 5000 72758E-13 28579E-12 69263 20451

20 Michaelwicz 50 5000 minus32263 13729 minus27383 1355121 Rosenbrock 50 0 5000 097368 05885 35286 37012

22 StochasticRosenbrock 50 0 5000 58599 19122 48944 46783

23 Trigonometric 50 0 5000 53560 45360 19435 4033724 Zacharov 50 0 5000 65769 13288 27031 52748

GBCS performed significantly better than CS as shown inFigure 2(d) On the other hand both algorithms were ableto identify the global minimum of the Himmelblau functionFigure 2(e) shows the evolution of the mean best valuesGBCS performed more effectively than CS Both algorithmswere also able to identify the minimum of the Levy 13 func-tion (Figure 2(f)) However GBCS was significantly moreeffective than CS This pattern was repeated with the Matyasfunction as shown in Figure 2(g)

The Schaffer function is multimodal Both GBCS and CSfailed to converge to the global minimum within the 10minus10tolerance at 1000 iterations Running both algorithms to 3000iterations resulted in GBCS reaching the global optimumwhile CS not as shown in Figure 2(f) Schaffer functionconcludes the 2-variable functions GBCS performed betterthan CS in all of them

The Powell function has four variables The evolution ofthe mean best values of CS and GBCS plotted in Figure 3(a)showed that performance of CS was better than GBCSalthough they were close to the global minimum Powell isone of the few test functions for which the use of the gradientdid not improve the performance of the CS algorithm

Minor improvements were observed with the power sumand the Shekel 5 functions Power sumrsquos global optimum wasnot obtained by the two algorithms as shown in Figure 3(b)as both algorithms seem to be trapped in a local minimumFor the Shekel 5 function Figure 3(c) the global optimumwas easily optimized with minor improvement in perfor-mance for the GBCS algorithm The global optimum for theWood function which has four variables as well was notachieved by both algorithms within 1000 iterations as shownin Figure 3(d) When they were run for 5000 GBCS seemedto be trapped in a local minimum and was not able to reachthe global optimum which was obtained by CS The Woodfunction is the only function for which GBCS performancewas much worse than that of the original CS algorithm

The performance of both algorithms for the Cube and theStochastic Cube functions was peculiar In both cases GBCSoutperformed CS at the early iterations At larger iterationsCS did better For the two functions as depicted in Figures3(e) and 3(f) the globalminimumwas not obtainedwith 1000iterations so both problems were run for 5000 CS was ableto come close to the global minimum for the Cube functionwhile GBCS seemed to be trapped in a local minimum For

Mathematical Problems in Engineering 9

[Figure 2 appears here: function value (log scale) versus iterations for the GBCS and CS runs, panels (a)-(h).]

Figure 2: Evolution of mean best values for GBCS and the original CS algorithm for the (a) Ackley, (b) Beale, (c) Booth, (d) cross-leg table, (e) Himmelblau, (f) Levy 13, (g) Matyas, and (h) Schaffer functions.

the Stochastic Cube function, CS did slightly better at larger iteration counts, but both algorithms failed to reach the global minimum after 5000 iterations.

The Sphere function has five variables, like the Cube function, but it is easier to solve. Both algorithms were able to identify the global optimum, as depicted in Figure 3(g), but GBCS considerably outperformed CS in efficiency. The Hartmann function, which has six

variables, was relatively easy to solve for the GBCS algorithm, as shown in Figure 3(h). GBCS arrived at the global minimum, but CS could not do so within the tolerance. GBCS also outperformed CS in efficiency in reaching the global optimum of the Hartmann function.

Figure 4 shows the performance on the test functions with 50 variables. These are the most challenging problems due to the large domain space. Figure 4(a) depicts the performance of

[Figure 3 appears here: function value (log scale) versus iterations for the GBCS and CS runs, panels (a)-(h).]

Figure 3: Evolution of mean best values for GBCS and the original CS algorithm for the (a) Powell, (b) power sum, (c) Shekel 5, (d) Wood, (e) Cube, (f) Stochastic Cube, (g) Sphere, and (h) Hartmann functions.

both algorithms for the Dixon-Price function. Clearly, GBCS performed better than CS in the early iterations. Even though neither algorithm was able to attain the global minimum, the minimum reached by GBCS in the early iterations is orders of magnitude lower than that reached by CS.

GBCS outperformed CS on both the Griewank (Figure 4(b)) and the Stochastic Griewank (Figure 4(c)) functions. For the Griewank function, both algorithms were able to identify the global minimum; however, GBCS arrived at the

global minimum in less than half the number of iterations required by CS. For the Stochastic Griewank, CS was not able to identify the global minimum even after 5000 iterations; the result obtained by CS was more than ten orders of magnitude higher than that obtained by GBCS.
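For reference, the stochastic variant can be sketched directly from its definition in Table 1. The noise factors ε_i are redrawn on every evaluation, which is what makes the objective noisy; their distribution is assumed uniform on [0, 1] here, since the table does not specify it, so this is an illustrative sketch rather than the authors' exact test code:

```python
import numpy as np

def stochastic_griewank(x, rng):
    # Stochastic Griewank (function 19 in Table 1):
    # f = (1/4000) * sum_i eps_i (x_i - 100)^2 - prod_i cos((x_i - 100)/sqrt(i)) + 1
    # eps_i are resampled on every call, so repeated evaluations at the
    # same x generally return different values (assumed U[0, 1] noise).
    i = np.arange(1, x.size + 1)
    eps = rng.random(x.size)                      # fresh noise each call
    quad = np.sum(eps * (x - 100.0) ** 2) / 4000.0
    prod = np.prod(np.cos((x - 100.0) / np.sqrt(i)))
    return quad - prod + 1.0

rng = np.random.default_rng(1)
x_star = np.full(50, 100.0)  # minimizer of the noise-free part
# at x = 100*ones the quadratic term vanishes for any noise draw, so f = 0
```

At the shifted optimum the noise multiplies a zero term, which is why both algorithms can in principle still locate it despite the stochasticity.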

Figure 4(d) shows the results for the Michalewicz function. To the best of the authors' knowledge, the global minimum of the 50-variable Michalewicz function is not known, since

[Figure 4 appears here: function value (log scale) versus iterations for the GBCS and CS runs, panels (a)-(h).]

Figure 4: Evolution of mean best values for GBCS and the original CS algorithm for the (a) Dixon-Price, (b) Griewank, (c) Stochastic Griewank, (d) Michalewicz, (e) Rosenbrock, (f) Stochastic Rosenbrock, (g) Trigonometric, and (h) Zakharov functions.

there is no published literature identifying it. GBCS identified a minimum that is lower than that identified by CS, as depicted in Figure 4(d).

Although neither algorithm was able to identify the global optima of the Rosenbrock and Stochastic Rosenbrock functions, as depicted in Figures 4(e) and 4(f), respectively, the results obtained by GBCS were orders of magnitude lower than those obtained by CS. The superiority of GBCS is clearly demonstrated by these two functions.

Figure 4(g) shows the results for the Trigonometric function. For this difficult problem, neither algorithm was able to reach the global minimum within 5000 iterations. Nonetheless, as in most cases studied, GBCS achieved a lower result than CS.

The last of the 50-variable functions tested was the Zakharov function. The results in Figure 4(h) also show the superiority of GBCS, which reached results that are orders of magnitude lower than those obtained by the original CS algorithm.


Table 2 summarizes the evaluation results for the twenty-four benchmark problems. GBCS achieved better performance in eighteen problems, equivalent performance in two problems, and worse performance in four of the twenty-four problems. In one case, the 50-variable Stochastic Griewank function, the global optimum was successfully obtained by GBCS, while CS failed to find it even after 5000 iterations. On the other hand, the four functions for which CS performed better were the Powell, Wood, Cube, and Stochastic Cube functions.

It is interesting to note that GBCS outperformed CS in handling two of the three stochastic functions tested in this study. This result confirms that, despite the use of the gradient as a guide for some moves, the stochastic nature of the algorithm remains intact. The additional information is used quite subtly: cuckoos still search for nests randomly, but with the help of some intelligence.
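This interplay between randomness and gradient guidance can be sketched in a few lines. The following is only an illustration of the idea (the step-size distribution, the function and variable names, and the componentwise sign rule are our assumptions, not the authors' exact update rule):

```python
import numpy as np

def gradient_guided_step(x, grad, rng, step_scale=0.01):
    # One local random-walk step whose direction is biased downhill:
    # the magnitude of each component stays random, so the search
    # remains stochastic; the gradient only supplies the sign, so the
    # move points toward decreasing objective values as seen from the
    # cuckoo's current location.  Illustrative sketch only.
    random_step = step_scale * np.abs(rng.standard_normal(x.shape))
    return x - np.sign(grad) * random_step

rng = np.random.default_rng(0)
x = np.array([3.0, -2.0])
grad = np.array([6.0, -4.0])  # gradient of f(x) = x1^2 + x2^2 at x
x_new = gradient_guided_step(x, grad, rng)
# each component of x_new has moved toward the origin, the minimizer
```

The step length is still drawn from a random distribution on every move; only its orientation is deterministic, which is consistent with the observation that the algorithm's stochastic character is preserved.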

6. Conclusions

In this study, we used the gradient of the objective function, which is available or easily obtainable for many objective functions in engineering calculations, to improve the performance of one of the most promising stochastic algorithms, the cuckoo search. The proposed modification was implemented subtly in the algorithm by reorienting the local random walk of cuckoos toward the direction of decreasing objective value as seen from each cuckoo's location. We evaluated this modification by attempting to find the global optima of twenty-four benchmark functions. The newly developed GBCS algorithm improved the reliability and effectiveness of the algorithm in all but four of the tested benchmark problems, which included three stochastic functions. In some cases, the global minimum could not be obtained by the original CS algorithm but was obtained by GBCS. Although the improvement in performance was not achieved for the entire set of benchmark problems, GBCS proved more reliable and efficient for the majority of the tested problems.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

References

[1] C. A. Floudas and C. E. Gounaris, "A review of recent advances in global optimization," Journal of Global Optimization, vol. 45, no. 1, pp. 3-38, 2009.

[2] L. M. Rios and N. V. Sahinidis, "Derivative-free optimization: a review of algorithms and comparison of software implementations," Journal of Global Optimization, vol. 56, pp. 1247-1293, 2012.

[3] X. S. Yang and S. Deb, "Cuckoo search via Levy flights," in Proceedings of the World Congress on Nature & Biologically Inspired Computing (NaBIC '09), pp. 210-214, IEEE, Coimbatore, India, December 2009.

[4] X.-S. Yang and S. Deb, "Cuckoo search: recent advances and applications," Neural Computing and Applications, vol. 24, no. 1, pp. 169-174, 2014.

[5] J. Piechocki, D. Ambroziak, A. Palkowski, and G. Redlarski, "Use of modified cuckoo search algorithm in the design process of integrated power systems for modern and energy self-sufficient farms," Applied Energy, vol. 114, pp. 901-908, 2014.

[6] G. Kanagaraj, S. Ponnambalam, and N. Jawahar, "A hybrid cuckoo search and genetic algorithm for reliability-redundancy allocation problems," Computers & Industrial Engineering, vol. 66, no. 4, pp. 1115-1124, 2013.

[7] V. Bhargava, S. Fateen, and A. Bonilla-Petriciolet, "Cuckoo search: a new nature-inspired optimization method for phase equilibrium calculations," Fluid Phase Equilibria, vol. 337, pp. 191-200, 2013.

[8] P. K. Mohanty and D. R. Parhi, "Cuckoo search algorithm for the mobile robot navigation," in Swarm, Evolutionary, and Memetic Computing, pp. 527-536, Springer, New York, NY, USA, 2013.

[9] X. S. Yang and S. Deb, "Engineering optimisation by cuckoo search," International Journal of Mathematical Modelling and Numerical Optimisation, vol. 1, no. 4, pp. 330-343, 2010.


Table 1: Continued. Each entry lists the function, its analytic derivative, and the search domain.

8. Schaffer (domain [-100, 100]):
f_8 = 0.5 + (sin^2(sqrt(x_1^2 + x_2^2)) - 0.5) / [0.001(x_1^2 + x_2^2) + 1]^2
∂f_8/∂x_k = -x_k [ (0.004 sin^2 σ - 0.002) / (0.001(x_1^2 + x_2^2) + 1)^3 - 2 cos σ sin σ / (σ (0.001(x_1^2 + x_2^2) + 1)^2) ], where σ = sqrt(x_1^2 + x_2^2)

9. Powell (domain [-1000, 1000]):
f_9 = (x_1 + 10x_2)^2 + 5(x_3 - x_4)^2 + (x_2 - 2x_3)^4 + 10(x_1 - x_4)^4
∂f_9/∂x_1 = 2x_1 + 20x_2 + 40(x_1 - x_4)^3
∂f_9/∂x_2 = 20x_1 + 200x_2 + 4(x_2 - 2x_3)^3
∂f_9/∂x_3 = 10x_3 - 10x_4 - 8(x_2 - 2x_3)^3
∂f_9/∂x_4 = 10x_4 - 10x_3 - 40(x_1 - x_4)^3

10. Power sum (domain [0, 4]):
f_10 = Σ_{i=1}^{m} [ (Σ_{j=1}^{m} x_j^i) - b_i ]^2, with b = [8, 18, 44, 114]
∂f_10/∂x_k = 2 Σ_{i=1}^{m} i x_k^{i-1} [ (Σ_{j=1}^{m} x_j^i) - b_i ]

11. Shekel 5 (domain [0, 10]):
f_11 = -Σ_{i=1}^{m} 1 / (c_i + Σ_{j=1}^{n} (x_j - a_{ij})^2), with m = 10
∂f_11/∂x_k = Σ_{i=1}^{m} 2(x_k - a_{ik}) / [c_i + Σ_{j=1}^{n} (x_j - a_{ij})^2]^2

12. Wood (domain [-1000, 1000]):
f_12 = 100(x_1^2 - x_2)^2 + (x_1 - 1)^2 + (x_3 - 1)^2 + 90(x_3^2 - x_4)^2 + 10.1[(x_2 - 1)^2 + (x_4 - 1)^2] + 19.8(x_2 - 1)(x_4 - 1)
∂f_12/∂x_1 = 2x_1 - 400x_1(x_2 - x_1^2) - 2
∂f_12/∂x_2 = -200x_1^2 + 220.2x_2 + 19.8x_4 - 40
∂f_12/∂x_3 = 2x_3 - 360x_3(x_4 - x_3^2) - 2
∂f_12/∂x_4 = -180x_3^2 + 19.8x_2 + 200.2x_4 - 40

13. Cube (domain [-100, 100]):
f_13 = Σ_{i=1}^{m-1} 100(x_{i+1} - x_i^3)^2 + (1 - x_i)^2
∂f_13/∂x_1 = 2x_1 - 600x_1^2(x_2 - x_1^3) - 2
∂f_13/∂x_k = 202x_k - 600x_k^2(x_{k+1} - x_k^3) - 200x_{k-1}^3 - 2
∂f_13/∂x_m = 200(x_m - x_{m-1}^3)

14. Stochastic Cube (domain [-100, 100]):
f_14 = Σ_{i=1}^{m-1} 100 ε_i (x_{i+1} - x_i^3)^2 + (1 - x_i)^2
∂f_14/∂x_1 = 2x_1 - 600 ε_1 x_1^2 (x_2 - x_1^3) - 2
∂f_14/∂x_k = 2x_k - 600 ε_k x_k^2 (x_{k+1} - x_k^3) + 200 ε_{k-1} (x_k - x_{k-1}^3) - 2
∂f_14/∂x_m = 200 ε_{m-1} (x_m - x_{m-1}^3)

15. Sphere (domain [-100, 100]):
f_15 = Σ_{i=1}^{m} x_i^2
∂f_15/∂x_k = 2x_k

16. Hartmann (domain [0, 1]):
f_16 = -Σ_{i=1}^{4} α_i exp(-Σ_{j=1}^{6} A_{ij}(x_j - P_{ij})^2)
∂f_16/∂x_k = Σ_{i=1}^{4} 2 α_i A_{ik} (x_k - P_{ik}) exp(-Σ_{j=1}^{6} A_{ij}(x_j - P_{ij})^2)

17. Dixon-Price (domain [-10, 10]):
f_17 = (x_1 - 1)^2 + Σ_{i=2}^{m} i (2x_i^2 - x_{i-1})^2
∂f_17/∂x_1 = -8x_2^2 + 6x_1 - 2
∂f_17/∂x_k = (k + 1)(2x_k - 4x_{k+1}^2) - 8k x_k (x_{k-1} - 2x_k^2)
∂f_17/∂x_m = -8m x_m (x_{m-1} - 2x_m^2)

18. Griewank (domain [-600, 600]):
f_18 = (1/4000) Σ_{i=1}^{m} (x_i - 100)^2 - Π_{i=1}^{m} cos((x_i - 100)/sqrt(i)) + 1
∂f_18/∂x_k = (x_k - 100)/2000 + (1/sqrt(k)) tan((x_k - 100)/sqrt(k)) Π_{i=1}^{m} cos((x_i - 100)/sqrt(i))

19. Stochastic Griewank (domain [-600, 600]):
f_19 = (1/4000) Σ_{i=1}^{m} ε_i (x_i - 100)^2 - Π_{i=1}^{m} cos((x_i - 100)/sqrt(i)) + 1
∂f_19/∂x_k = ε_k (x_k - 100)/2000 + (1/sqrt(k)) tan((x_k - 100)/sqrt(k)) Π_{i=1}^{m} cos((x_i - 100)/sqrt(i))

20. Michalewicz (domain [0, π]):
f_20 = -Σ_{i=1}^{m} sin(x_i) sin^20(i x_i^2 / π)
∂f_20/∂x_k = -cos(x_k) sin^20(k x_k^2 / π) - (40 k x_k / π) cos(k x_k^2 / π) sin^19(k x_k^2 / π) sin(x_k)

21. Rosenbrock (domain [-50, 50]):
f_21 = Σ_{i=1}^{m-1} 100(x_{i+1} - x_i^2)^2 + (x_i - 1)^2
∂f_21/∂x_1 = 2(x_1 - 1) - 400x_1(x_2 - x_1^2)
∂f_21/∂x_k = 202x_k - 200x_{k-1}^2 - 400x_k(x_{k+1} - x_k^2) - 2
∂f_21/∂x_m = 200(x_m - x_{m-1}^2)

22. Stochastic Rosenbrock (domain [-50, 50]):
f_22 = Σ_{i=1}^{m-1} 100 ε_i (x_{i+1} - x_i^2)^2 + (x_i - 1)^2
∂f_22/∂x_1 = 2(x_1 - 1) - 400 ε_1 x_1 (x_2 - x_1^2)
∂f_22/∂x_k = 2x_k + 200 ε_{k-1} (x_k - x_{k-1}^2) - 400 ε_k x_k (x_{k+1} - x_k^2) - 2
∂f_22/∂x_m = 200 ε_{m-1} (x_m - x_{m-1}^2)

23. Trigonometric (domain [-1000, 1000]):
f_23 = Σ_{i=1}^{m} [m + i(1 - cos x_i) - sin x_i - Σ_{j=1}^{m} cos x_j]^2
∂f_23/∂x_k = 2[m + k(1 - cos x_k) - sin x_k - Σ_{j=1}^{m} cos x_j][(k + 1) sin x_k - cos x_k]

24. Zakharov (domain [-5, 10]):
f_24 = Σ_{i=1}^{m} x_i^2 + (Σ_{i=1}^{m} 0.5 i x_i)^2 + (Σ_{i=1}^{m} 0.5 i x_i)^4
∂f_24/∂x_k = 2x_k + k Σ_{i=1}^{m} 0.5 i x_i + 2k (Σ_{i=1}^{m} 0.5 i x_i)^3
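The analytic derivatives listed in Table 1 can be spot-checked numerically. Below is a small sketch for the Rosenbrock entry (function 21), comparing the closed-form gradient with a central finite-difference approximation; the helper names and the test point are illustrative:

```python
import numpy as np

def rosenbrock(x):
    # f_21 = sum_{i=1}^{m-1} 100(x_{i+1} - x_i^2)^2 + (x_i - 1)^2
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (x[:-1] - 1.0) ** 2)

def rosenbrock_grad(x):
    # Closed-form gradient assembled from the per-component derivatives
    # in Table 1 (first, interior, and last components combined).
    g = np.zeros_like(x)
    g[:-1] = -400.0 * x[:-1] * (x[1:] - x[:-1] ** 2) + 2.0 * (x[:-1] - 1.0)
    g[1:] += 200.0 * (x[1:] - x[:-1] ** 2)
    return g

def fd_grad(f, x, h=1e-6):
    # Central finite-difference gradient, used only for verification.
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

x = np.array([0.5, -1.2, 2.0, 0.3])
# rosenbrock_grad(x) and fd_grad(rosenbrock, x) agree to finite-difference accuracy
```

The same check applies to any other entry with an analytic derivative in the table, which is useful before handing the gradient to GBCS.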

[Figure 1 appears here: surface plots f(x, y) of the two-variable benchmark functions.]

Figure 1: Surface plots of the two-variable benchmark functions used in this study: (a) Ackley, (b) Beale, (c) Booth, (d) cross-leg table, (e) Himmelblau, (f) Levy 13, (g) Matyas, and (h) Schaffer functions.

Table 2: Mean minima and standard deviations obtained by the CS and GBCS algorithms, compared with the global minima of the twenty-four benchmark problems.

| No. | Benchmark function | Variables | Global min | Iterations | GBCS mean | GBCS std dev | CS mean | CS std dev |
|---|---|---|---|---|---|---|---|---|
| 1 | Ackley | 2 | 0 | 1000 | 0 | 0 | 2.2204E-16 | 6.7752E-16 |
| 2 | Beale | 2 | 0 | 1000 | 0 | 0 | 5.7891E-30 | 1.2165E-29 |
| 3 | Booth | 2 | 0 | 1000 | 0 | 0 | 0 | 0 |
| 4 | Cross-leg table | 2 | -1 | 1000 | -1.1463E-2 | 7.672E-3 | -6.2704E-3 | 3.6529E-3 |
| 5 | Himmelblau | 2 | 0 | 1000 | 1.7058E-28 | 2.836E-28 | 2.5958E-19 | 5.3451E-19 |
| 6 | Levy 13 | 2 | 0 | 1000 | 1.3498E-31 | 6.6809E-47 | 1.3498E-31 | 6.6809E-47 |
| 7 | Matyas | 2 | 0 | 1000 | 2.7691E-54 | 4.728E-54 | 2.0407E-38 | 5.0616E-38 |
| 8 | Schaffer | 2 | 0 | 3000 | 0 | 0 | 7.4015E-18 | 1.9193E-17 |
| 9 | Powell | 4 | 0 | 1000 | 1.8694E-8 | 3.5848E-8 | 1.6296E-13 | 3.4802E-13 |
| 10 | Power sum | 4 | 0 | 1000 | 1.8328E-4 | 1.6761E-4 | 2.5432E-4 | 1.8167E-4 |
| 11 | Shekel 5 | 4 | -10.536 | 200 | -10.536 | 1.6289E-5 | -10.536 | 1.8421E-2 |
| 12 | Wood | 4 | 0 | 1000 | 2.3726 | 2.2208 | 0.40838 | 0.337 |
| 13 | Cube | 5 | 0 | 5000 | 1.2567 | 0.86542 | 5.782E-8 | 2.5596E-7 |
| 14 | Stochastic Cube | 5 | 0 | 5000 | 7.7438 | 6.9815 | 6.4369 | 5.0292 |
| 15 | Sphere | 5 | 0 | 1000 | 2.5147E-38 | 5.1577E-38 | 1.1371E-21 | 1.2967E-21 |
| 16 | Hartmann | 6 | -3.3224 | 200 | -3.3224 | 4.3959E-10 | -3.3215 | 6.0711E-4 |
| 17 | Dixon-Price | 50 | 0 | 5000 | 4.7094E-2 | 1.6904E-1 | 6.6667E-1 | 2.6103E-6 |
| 18 | Griewank | 50 | 0 | 5000 | 0 | 0 | 3.3651E-10 | 9.4382E-10 |
| 19 | Stochastic Griewank | 50 | 0 | 5000 | 7.2758E-13 | 2.8579E-12 | 6.9263 | 2.0451 |
| 20 | Michalewicz | 50 | unknown | 5000 | -32.263 | 1.3729 | -27.383 | 1.3551 |
| 21 | Rosenbrock | 50 | 0 | 5000 | 0.97368 | 0.5885 | 3.5286 | 3.7012 |
| 22 | Stochastic Rosenbrock | 50 | 0 | 5000 | 5.8599 | 1.9122 | 48.944 | 46.783 |
| 23 | Trigonometric | 50 | 0 | 5000 | 5356.0 | 4536.0 | 19435 | 40337 |
| 24 | Zakharov | 50 | 0 | 5000 | 6.5769 | 1.3288 | 27031 | 52748 |


Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Mathematical Problems in Engineering

Table 1: Continued. (Number, name, function and its derivative, search domain.)

17. Dixon-Price, search domain $[-10, 10]$:
$f_{17} = (x_1 - 1)^2 + \sum_{i=2}^{m} i\,(2x_i^2 - x_{i-1})^2$
$\partial f_{17}/\partial x_1 = -8x_2^2 + 6x_1 - 2$
$\partial f_{17}/\partial x_k = (2x_k - 4x_{k+1}^2)(k+1) - 8kx_k(x_{k-1} - 2x_k^2)$
$\partial f_{17}/\partial x_m = -8mx_m(x_{m-1} - 2x_m^2)$

18. Griewank, search domain $[-600, 600]$:
$f_{18} = \frac{1}{4000}\sum_{i=1}^{m}(x_i - 100)^2 - \prod_{i=1}^{m}\cos\!\left(\frac{x_i - 100}{\sqrt{i}}\right) + 1$
$\partial f_{18}/\partial x_k = \frac{x_k - 100}{2000} + \frac{1}{\sqrt{k}}\tan\!\left(\frac{x_k - 100}{\sqrt{k}}\right)\prod_{i=1}^{m}\cos\!\left(\frac{x_i - 100}{\sqrt{i}}\right)$

19. Stochastic Griewank, search domain $[-600, 600]$:
$f_{19} = \frac{1}{4000}\sum_{i=1}^{m}\epsilon_i(x_i - 100)^2 - \prod_{i=1}^{m}\cos\!\left(\frac{x_i - 100}{\sqrt{i}}\right) + 1$
$\partial f_{19}/\partial x_k = \frac{\epsilon_k(x_k - 100)}{2000} + \frac{1}{\sqrt{k}}\tan\!\left(\frac{x_k - 100}{\sqrt{k}}\right)\prod_{i=1}^{m}\cos\!\left(\frac{x_i - 100}{\sqrt{i}}\right)$

20. Michalewicz, search domain $[0, \pi]$:
$f_{20} = -\sum_{i=1}^{m}\sin(x_i)\sin^{20}\!\left(\frac{i x_i^2}{\pi}\right)$
$\partial f_{20}/\partial x_k = -\cos(x_k)\sin^{20}\!\left(\frac{k x_k^2}{\pi}\right) - \frac{40 k x_k}{\pi}\sin(x_k)\cos\!\left(\frac{k x_k^2}{\pi}\right)\sin^{19}\!\left(\frac{k x_k^2}{\pi}\right)$

21. Rosenbrock, search domain $[-50, 50]$:
$f_{21} = \sum_{i=1}^{m-1}\left[100(x_{i+1} - x_i^2)^2 + (x_i - 1)^2\right]$
$\partial f_{21}/\partial x_1 = 2(x_1 - 1) - 400x_1(x_2 - x_1^2)$
$\partial f_{21}/\partial x_k = 202x_k - 200x_{k-1}^2 - 400x_k(x_{k+1} - x_k^2) - 2$
$\partial f_{21}/\partial x_m = 200(x_m - x_{m-1}^2)$

22. Stochastic Rosenbrock, search domain $[-50, 50]$:
$f_{22} = \sum_{i=1}^{m-1}\left[100\epsilon_i(x_{i+1} - x_i^2)^2 + (x_i - 1)^2\right]$
$\partial f_{22}/\partial x_1 = 2(x_1 - 1) - 400\epsilon_1 x_1(x_2 - x_1^2)$
$\partial f_{22}/\partial x_k = 2x_k + 200\epsilon_{k-1}(x_k - x_{k-1}^2) - 400\epsilon_k x_k(x_{k+1} - x_k^2) - 2$
$\partial f_{22}/\partial x_m = 200\epsilon_{m-1}(x_m - x_{m-1}^2)$

23. Trigonometric, search domain $[-1000, 1000]$:
$f_{23} = \sum_{i=1}^{m}\left[m + i(1 - \cos x_i) - \sin x_i - \sum_{j=1}^{m}\cos x_j\right]^2$
$\partial f_{23}/\partial x_k = 2\left[m + k(1 - \cos x_k) - \sin x_k - \sum_{j=1}^{m}\cos x_j\right]\left[(k+1)\sin x_k - \cos x_k\right]$

24. Zakharov, search domain $[-5, 10]$:
$f_{24} = \sum_{i=1}^{m}x_i^2 + \left(\sum_{i=1}^{m}0.5\,i\,x_i\right)^2 + \left(\sum_{i=1}^{m}0.5\,i\,x_i\right)^4$
$\partial f_{24}/\partial x_k = 2x_k + k\sum_{i=1}^{m}0.5\,i\,x_i + 2k\left(\sum_{i=1}^{m}0.5\,i\,x_i\right)^3$
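As a check on the entries of Table 1, each function and its analytic derivative can be coded and verified against finite differences. The sketch below does this for the Rosenbrock function ($f_{21}$); Python is an illustrative choice here, since the paper does not state an implementation language.

```python
import numpy as np

def rosenbrock(x):
    """Rosenbrock function f21 from Table 1."""
    return np.sum(100.0 * (x[1:] - x[:-1]**2)**2 + (x[:-1] - 1.0)**2)

def rosenbrock_grad(x):
    """Analytic gradient of f21, matching the partial derivatives in Table 1."""
    g = np.zeros_like(x)
    # components 1..m-1 get the terms from summand i:
    g[:-1] = -400.0 * x[:-1] * (x[1:] - x[:-1]**2) + 2.0 * (x[:-1] - 1.0)
    # components 2..m also get the term from summand i-1:
    g[1:] += 200.0 * (x[1:] - x[:-1]**2)
    return g

# sanity check against central finite differences at a random point
rng = np.random.default_rng(0)
x = rng.uniform(-2.0, 2.0, size=5)
h = 1e-6
fd = np.array([(rosenbrock(x + h * e) - rosenbrock(x - h * e)) / (2.0 * h)
               for e in np.eye(len(x))])
assert np.allclose(fd, rosenbrock_grad(x), rtol=1e-4, atol=1e-3)
```

The same pattern (function, closed-form gradient, finite-difference check) applies to the other benchmarks in the table.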

Figure 1: Surface plots of the two-variable benchmark functions used in this study: (a) Ackley, (b) Beale, (c) Booth, (d) cross-leg table, (e) Himmelblau, (f) Levy 13, (g) Matyas, and (h) Schaffer functions.

Table 2: Mean minima and standard deviations obtained by the CS and GBCS algorithms, compared with the global minima of the twenty-four benchmark problems.

| Number | Benchmark function | Variables | Global min | Iterations | GBCS mean | GBCS std. dev. | CS mean | CS std. dev. |
|---|---|---|---|---|---|---|---|---|
| 1 | Ackley | 2 | 0 | 1000 | 0 | 0 | 2.2204E-16 | 6.7752E-16 |
| 2 | Beale | 2 | 0 | 1000 | 0 | 0 | 5.7891E-30 | 1.2165E-29 |
| 3 | Booth | 2 | 0 | 1000 | 0 | 0 | 0 | 0 |
| 4 | Cross-leg table | 2 | -1 | 1000 | -1.1463E-2 | 7.672E-3 | -6.2704E-3 | 3.6529E-3 |
| 5 | Himmelblau | 2 | 0 | 1000 | 1.7058E-28 | 2.836E-28 | 2.5958E-19 | 5.3451E-19 |
| 6 | Levy 13 | 2 | 0 | 1000 | 1.3498E-31 | 6.6809E-47 | 1.3498E-31 | 6.6809E-47 |
| 7 | Matyas | 2 | 0 | 1000 | 2.7691E-54 | 4.728E-54 | 2.0407E-38 | 5.0616E-38 |
| 8 | Schaffer | 2 | 0 | 3000 | 0 | 0 | 7.4015E-18 | 1.9193E-17 |
| 9 | Powell | 4 | 0 | 1000 | 1.8694E-8 | 3.5848E-8 | 1.6296E-13 | 3.4802E-13 |
| 10 | Power sum | 4 | 0 | 1000 | 1.8328E-4 | 1.6761E-4 | 2.5432E-4 | 1.8167E-4 |
| 11 | Shekel 5 | 4 | -10.536 | 200 | -10.536 | 1.6289E-5 | -10.536 | 1.8421E-2 |
| 12 | Wood | 4 | 0 | 1000 | 2.3726 | 2.2208 | 0.40838 | 0.337 |
| 13 | Cube | 5 | 0 | 5000 | 1.2567 | 0.86542 | 5.782E-8 | 2.5596E-7 |
| 14 | Stochastic Cube | 5 | 0 | 5000 | 7.7438 | 6.9815 | 6.4369 | 5.0292 |
| 15 | Sphere | 5 | 0 | 1000 | 2.5147E-38 | 5.1577E-38 | 1.1371E-21 | 1.2967E-21 |
| 16 | Hartmann | 6 | -3.3224 | 200 | -3.3224 | 4.3959E-10 | -3.3215 | 6.0711E-4 |
| 17 | Dixon-Price | 50 | 0 | 5000 | 4.7094E-2 | 1.6904E-1 | 6.6667E-1 | 2.6103E-6 |
| 18 | Griewank | 50 | 0 | 5000 | 0 | 0 | 3.3651E-10 | 9.4382E-10 |
| 19 | Stochastic Griewank | 50 | 0 | 5000 | 7.2758E-13 | 2.8579E-12 | 6.9263 | 2.0451 |
| 20 | Michalewicz | 50 | unknown | 5000 | -32.263 | 1.3729 | -27.383 | 1.3551 |
| 21 | Rosenbrock | 50 | 0 | 5000 | 0.97368 | 0.5885 | 3.5286 | 3.7012 |
| 22 | Stochastic Rosenbrock | 50 | 0 | 5000 | 5.8599 | 1.9122 | 48.944 | 46.783 |
| 23 | Trigonometric | 50 | 0 | 5000 | 53.560 | 45.360 | 194.35 | 403.37 |
| 24 | Zakharov | 50 | 0 | 5000 | 65.769 | 13.288 | 270.31 | 52.748 |
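Statistics like those in Table 2 come from repeating each stochastic run independently and averaging the best values found. The sketch below shows a minimal version of that protocol in Python; the run count of 30 and the random-search stand-in optimizer are illustrative assumptions, since the excerpt does not state how many independent runs were averaged.

```python
import numpy as np

def evaluate(optimizer, n_runs=30, seed=0):
    """Run a stochastic optimizer n_runs times with independent seeds and
    report the mean and standard deviation of the best values found.
    n_runs=30 is an assumed count, not taken from the paper."""
    rng = np.random.default_rng(seed)
    best = np.array([optimizer(np.random.default_rng(rng.integers(2**32)))
                     for _ in range(n_runs)])
    return best.mean(), best.std()

# stand-in optimizer: pure random search on the two-variable sphere function
def random_search(rng, iters=1000):
    pts = rng.uniform(-5.0, 5.0, size=(iters, 2))
    return np.min(np.sum(pts**2, axis=1))

mean, std = evaluate(random_search)
```

Any callable taking a seeded generator and returning a best objective value (CS, GBCS, or otherwise) can be dropped in for `random_search`.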

GBCS performed significantly better than CS, as shown in Figure 2(d). On the other hand, both algorithms were able to identify the global minimum of the Himmelblau function. Figure 2(e) shows the evolution of the mean best values; GBCS performed more effectively than CS. Both algorithms were also able to identify the minimum of the Levy 13 function (Figure 2(f)), but again GBCS was significantly more effective than CS. This pattern was repeated with the Matyas function, as shown in Figure 2(g).

The Schaffer function is multimodal. Both GBCS and CS failed to converge to the global minimum within the 10^-10 tolerance at 1000 iterations. Running both algorithms to 3000 iterations resulted in GBCS reaching the global optimum while CS did not, as shown in Figure 2(h). The Schaffer function concludes the two-variable functions; GBCS performed better than CS in all of them.

The Powell function has four variables. The evolution of the mean best values of CS and GBCS, plotted in Figure 3(a), shows that the performance of CS was better than that of GBCS, although both came close to the global minimum. Powell is one of the few test functions for which the use of the gradient did not improve the performance of the CS algorithm.

Minor improvements were observed with the power sum and the Shekel 5 functions. The power sum's global optimum was not obtained by either algorithm, as shown in Figure 3(b); both appear to be trapped in a local minimum. For the Shekel 5 function (Figure 3(c)), the global optimum was easily obtained, with a minor improvement in performance for the GBCS algorithm. The global optimum of the Wood function, which also has four variables, was not achieved by either algorithm within 1000 iterations, as shown in Figure 3(d). When they were run for 5000 iterations, GBCS seemed to be trapped in a local minimum and was not able to reach the global optimum, which was obtained by CS. The Wood function is the only function for which GBCS performed much worse than the original CS algorithm.

The performance of both algorithms on the Cube and the stochastic Cube functions was peculiar. In both cases, GBCS outperformed CS in the early iterations, while CS did better at later iterations. For these two functions, as depicted in Figures 3(e) and 3(f), the global minimum was not obtained within 1000 iterations, so both problems were run for 5000. CS was able to come close to the global minimum of the Cube function, while GBCS seemed to be trapped in a local minimum.

Figure 2: Evolution of mean best values for GBCS and the original CS algorithm for the (a) Ackley, (b) Beale, (c) Booth, (d) cross-leg table, (e) Himmelblau, (f) Levy 13, (g) Matyas, and (h) Schaffer functions.

For the stochastic Cube function, CS did slightly better at later iterations, but both algorithms failed to reach the global minimum after 5000 iterations.

The sphere function has five variables, like the Cube function, but it is easier to solve. Both algorithms were able to identify the global optimum, as depicted in Figure 3(g), but GBCS considerably outperformed CS in terms of efficiency. The Hartmann function, which has six variables, was relatively easy for the GBCS algorithm, as shown in Figure 3(h). GBCS arrived at the global minimum, but CS could not do so within the tolerance. GBCS also outperformed CS in efficiency in reaching the global optimum of the Hartmann function.

Figure 4 shows the performance on the test functions with 50 variables. These are the most challenging problems due to the large domain space.

Figure 3: Evolution of mean best values for GBCS and the original CS algorithm for the (a) Powell, (b) power sum, (c) Shekel 5, (d) Wood, (e) Cube, (f) stochastic Cube, (g) sphere, and (h) Hartmann functions.

Figure 4(a) depicts the performance of both algorithms for the Dixon-Price function. Clearly, GBCS performed better than CS in the early iterations. Even though neither algorithm was able to attain the global minimum, the minimum arrived at by GBCS in the early iterations is orders of magnitude lower than that arrived at by CS.

GBCS outperformed CS on both the Griewank (Figure 4(b)) and the stochastic Griewank (Figure 4(c)) functions. For the Griewank function, both algorithms were able to identify the global minimum; however, GBCS arrived at it in less than half the number of iterations required by CS. For the stochastic Griewank, CS was not able to identify the global minimum even after 5000 iterations; the result it obtained was more than 10 orders of magnitude higher than that obtained by GBCS.

Figure 4(d) shows the results for the Michalewicz function. To the best of the authors' knowledge, the global minimum of the 50-variable Michalewicz function is not known, since

Figure 4: Evolution of mean best values for GBCS and the original CS algorithm for the (a) Dixon-Price, (b) Griewank, (c) stochastic Griewank, (d) Michalewicz, (e) Rosenbrock, (f) stochastic Rosenbrock, (g) trigonometric, and (h) Zakharov functions.

there is no published literature identifying it. GBCS identified a minimum that is lower than that identified by CS, as depicted in Figure 4(d).

Although neither algorithm was able to identify the global optima of the Rosenbrock and the stochastic Rosenbrock functions, as depicted in Figures 4(e) and 4(f), respectively, the results obtained by GBCS were orders of magnitude lower than those obtained by CS. The superiority of GBCS is clearly demonstrated by these two functions.

Figure 4(g) shows the results for the trigonometric function. For this difficult problem, neither algorithm was able to reach the global minimum within 5000 iterations. Nonetheless, as in most of the cases studied, GBCS achieved a lower result than CS.

The last of the 50-variable functions tested was the Zakharov function. The results of Figure 4(h) again show the superiority of GBCS, which reached results orders of magnitude lower than those obtained by the original CS algorithm.


Table 2 summarizes the evaluation results for the twenty-four benchmark problems. GBCS was able to provide better solutions to all the challenging problems: it achieved better performance in eighteen problems, equivalent performance in two problems, and worse performance in four of the twenty-four problems. In one case, the 50-variable stochastic Griewank function, the global optimum was successfully obtained by GBCS, while CS failed to find it even after 5000 iterations. On the other hand, the four functions for which CS performed better were the Powell, Wood, Cube, and stochastic Cube functions.

It is interesting to note that GBCS outperformed CS in handling two of the three stochastic functions tested in this study. This result confirms that, despite the use of the gradient as a guide upon which some moves are based, the stochastic nature of the algorithm remained unaffected. In fact, the additional information was used quite subtly: cuckoos still search for nests randomly, but with the help of some intelligence.
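The mechanism described above, random step magnitudes with gradient-informed directions, can be sketched as follows. This is a minimal Python illustration of the idea rather than the authors' exact update rule; the step construction and the scale parameter are assumptions made for the example.

```python
import numpy as np

def gradient_guided_walk(x, grad, rng, scale=0.1):
    """One gradient-informed local move, sketching the GBCS idea.
    The step magnitude of each component stays random, as in the original
    cuckoo-search local walk, but its sign follows -grad so the move points
    downhill from the cuckoo's current position."""
    step = scale * rng.random(x.shape)   # random magnitudes (stochastic part)
    return x - step * np.sign(grad)      # signs taken from the gradient guide

# example on the sphere function f(x) = sum(x^2), whose gradient is 2x
rng = np.random.default_rng(1)
x = np.array([3.0, -2.0])
for _ in range(200):
    x = gradient_guided_walk(x, 2.0 * x, rng)
# x now oscillates near the minimum at the origin
```

Because only the sign of each move is deterministic, the search remains stochastic, which is consistent with the observation that the algorithm's random character is preserved.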

6. Conclusions

In this study, we used the gradient of the objective function, which could be available or easily obtainable for several objective functions in engineering calculations, to improve the performance of one of the most promising stochastic algorithms, the cuckoo search. The proposed modification was subtly implemented in the algorithm by changing the direction of the local random walk of cuckoos towards the direction of the minimum value from the point of view of the cuckoo's location. We evaluated this modification by attempting to find the global optimum of twenty-four benchmark functions. The newly developed GBCS algorithm improved the reliability and effectiveness of the algorithm in all but four of the tested benchmark problems, which included three stochastic functions. In some cases, the global minimum could not be obtained via the original CS algorithm but was obtained via GBCS. Although the improvement in performance was not achieved over the entire set of benchmark problems, GBCS proved to be more reliable and efficient in the majority of the tested problems.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

References

[1] C. A. Floudas and C. E. Gounaris, "A review of recent advances in global optimization," Journal of Global Optimization, vol. 45, no. 1, pp. 3-38, 2009.

[2] L. M. Rios and N. V. Sahinidis, "Derivative-free optimization: a review of algorithms and comparison of software implementations," Journal of Global Optimization, vol. 56, pp. 1247-1293, 2012.

[3] X. S. Yang and S. Deb, "Cuckoo search via Lévy flights," in Proceedings of the World Congress on Nature & Biologically Inspired Computing (NaBIC '09), pp. 210-214, IEEE, Coimbatore, India, December 2009.

[4] X.-S. Yang and S. Deb, "Cuckoo search: recent advances and applications," Neural Computing and Applications, vol. 24, no. 1, pp. 169-174, 2014.

[5] J. Piechocki, D. Ambroziak, A. Palkowski, and G. Redlarski, "Use of modified cuckoo search algorithm in the design process of integrated power systems for modern and energy self-sufficient farms," Applied Energy, vol. 114, pp. 901-908, 2014.

[6] G. Kanagaraj, S. Ponnambalam, and N. Jawahar, "A hybrid cuckoo search and genetic algorithm for reliability-redundancy allocation problems," Computers & Industrial Engineering, vol. 66, no. 4, pp. 1115-1124, 2013.

[7] V. Bhargava, S. Fateen, and A. Bonilla-Petriciolet, "Cuckoo search: a new nature-inspired optimization method for phase equilibrium calculations," Fluid Phase Equilibria, vol. 337, pp. 191-200, 2013.

[8] P. K. Mohanty and D. R. Parhi, "Cuckoo search algorithm for the mobile robot navigation," in Swarm, Evolutionary, and Memetic Computing, pp. 527-536, Springer, New York, NY, USA, 2013.

[9] X. S. Yang and S. Deb, "Engineering optimisation by cuckoo search," International Journal of Mathematical Modelling and Numerical Optimisation, vol. 1, no. 4, pp. 330-343, 2010.


Mathematical Problems in Engineering 7

020

400

20400

5

10

15

20

xy minus20 minus20minus40 minus40

f(xy)

(a)

05

0

50

05

1

15

2

xy

f(xy)

minus5 minus5

times105

(b)

05

100

510

0500

10001500200025003000

xy

f(xy)

minus5 minus5minus10 minus10

(c)

x

0 05

10

yminus5 minus5

minus1

minus08

minus06

minus04

minus02

0

minus10 minus10

f(xy)

510

(d)

0

200

400

600

800

1000

xy

f(xy)

0 0

5

minus5 minus5

5

(e)

xy

0

100

200

300

400

f(xy)

550 0

1010

minus5 minus5minus10 minus10

(f)

xy

0

40

20

60

80

100

f(xy)

0 05

510

10

minus5 minus5minus10 minus10

(g)

y x

550 0

1010

minus5 minus5minus10 minus10

0

0402

0608

1

f(xy)

(h)

Figure 1 Surface plots of the two-variable benchmark functions used in this study (a) Ackley (b) Beale (c) Booth (d) cross-leg table (e)Himmelblau (f) Levy 13 (g) Matyas and (h) Schaffer functions

8 Mathematical Problems in Engineering

Table 2 Values of the meanminima and standard deviations obtained by the CS and GBCS algorithms compared with the value of the globalminima of the twenty-four benchmark problems

Number Benchmarkfunction

Number ofvariables

Globalmin

Number ofiterations

GBCS CSMean Std Dev Mean Std Dev

1 Ackley 2 0 1000 0 0 22204E-16 67752E-162 Beale 2 0 1000 0 0 57891E-30 12165E-293 Booth 2 0 1000 0 0 0 04 Cross-leg table 2 minus1 1000 minus11463E-2 7672E-3 minus62704E-3 36529E-35 Himmelblau 2 0 1000 17058E-28 2836E-28 25958E-19 53451E-196 Levy 13 2 0 1000 13498E-31 66809E-47 13498E-31 66809E-477 Matyas 2 0 1000 27691E-54 4728E-54 20407E-38 50616E-388 Schaffer 2 0 3000 0 0 74015E-18 19193E-179 Powell 4 0 1000 18694E-8 35848E-8 16296E-13 34802E-1310 Power sum 4 0 1000 18328E-4 16761E-4 25432E-4 18167E-411 Shekel 5 4 minus10536 200 minus10536 16289E-5 minus10536 18421E-212 Wood 4 0 1000 23726 22208 040838 033713 Cube 5 0 5000 12567 086542 5782E-8 25596E-714 Stochastic cube 5 0 5000 77438 69815 64369 5029215 Sphere 5 0 1000 25147E-38 51577E-38 11371E-21 12967E-2116 Hartmann 6 minus33224 200 minus33224 43959E-10 minus33215 60711E-417 Dixon-price 50 0 5000 47094E-2 16904E-1 66667E-1 26103E-618 Griewank 50 0 5000 0 0 33651E-10 94382E-10

19 StochasticGriewank 50 0 5000 72758E-13 28579E-12 69263 20451

20 Michaelwicz 50 5000 minus32263 13729 minus27383 1355121 Rosenbrock 50 0 5000 097368 05885 35286 37012

22 StochasticRosenbrock 50 0 5000 58599 19122 48944 46783

23 Trigonometric 50 0 5000 53560 45360 19435 4033724 Zacharov 50 0 5000 65769 13288 27031 52748

GBCS performed significantly better than CS as shown inFigure 2(d) On the other hand both algorithms were ableto identify the global minimum of the Himmelblau functionFigure 2(e) shows the evolution of the mean best valuesGBCS performed more effectively than CS Both algorithmswere also able to identify the minimum of the Levy 13 func-tion (Figure 2(f)) However GBCS was significantly moreeffective than CS This pattern was repeated with the Matyasfunction as shown in Figure 2(g)

The Schaffer function is multimodal Both GBCS and CSfailed to converge to the global minimum within the 10minus10tolerance at 1000 iterations Running both algorithms to 3000iterations resulted in GBCS reaching the global optimumwhile CS not as shown in Figure 2(f) Schaffer functionconcludes the 2-variable functions GBCS performed betterthan CS in all of them

The Powell function has four variables The evolution ofthe mean best values of CS and GBCS plotted in Figure 3(a)showed that performance of CS was better than GBCSalthough they were close to the global minimum Powell isone of the few test functions for which the use of the gradientdid not improve the performance of the CS algorithm

Minor improvements were observed with the power sumand the Shekel 5 functions Power sumrsquos global optimum wasnot obtained by the two algorithms as shown in Figure 3(b)as both algorithms seem to be trapped in a local minimumFor the Shekel 5 function Figure 3(c) the global optimumwas easily optimized with minor improvement in perfor-mance for the GBCS algorithm The global optimum for theWood function which has four variables as well was notachieved by both algorithms within 1000 iterations as shownin Figure 3(d) When they were run for 5000 GBCS seemedto be trapped in a local minimum and was not able to reachthe global optimum which was obtained by CS The Woodfunction is the only function for which GBCS performancewas much worse than that of the original CS algorithm

The performance of both algorithms for the Cube and theStochastic Cube functions was peculiar In both cases GBCSoutperformed CS at the early iterations At larger iterationsCS did better For the two functions as depicted in Figures3(e) and 3(f) the globalminimumwas not obtainedwith 1000iterations so both problems were run for 5000 CS was ableto come close to the global minimum for the Cube functionwhile GBCS seemed to be trapped in a local minimum For

Mathematical Problems in Engineering 9

0 500 1000

Func

tion

valu

e

Iterations

1010

100

10minus10

10minus20

(a)

0 500 1000

Func

tion

valu

eIterations

100

10minus20

10minus30

10minus10

10minus40

(b)

0 500 1000

Func

tion

valu

e

Iterations

1020

10minus20

10minus40

100

(c)

0 500 1000

Iterations

minus10minus4

minus10minus3

minus10minus2

minus10minus1

Func

tion

valu

e

(d)

0 500 1000

Iterations

1010

100

10minus10

10minus20

10minus30

Func

tion

valu

e

(e)

0 500 1000

Iterations

1020

100

10minus20

10minus40

Func

tion

valu

e(f)

0 500 1000

Iterations

GBCSCS

100

10minus20

10minus40

10minus60

Func

tion

valu

e

(g)

0 1000 2000 3000

Iterations

GBCSCS

100

10minus5

10minus10

10minus20

10minus15

Func

tion

valu

e

(h)

Figure 2 Evolution of mean best values for GBCS and the original CS algorithm for (a) Ackley (b) Beale (c) Booth (d) Cross-leg table (e)Himmelblau (f) Levy 13 (g) Matyas and (h) Schaffer functions

the Stochastic Cube function CS did slightly better at largeriterations but both algorithms failed to reach the globalminimum after 5000 iterations

The sphere function has five variables as the Cubefunction but it is easier to solve Both algorithms were ableto identify the global optimum as depicted in Figure 3(g)but GBCS considerably outperformed CS in terms of per-formance efficiency The Hartmann function which has 6

variables was relatively easy to solve for GBCS algorithms asshown in Figure 3(h) GBCS arrived at the global minimumbut CS could not do so within the tolerance GBCS alsooutperformed CS in terms of performance efficiency inreaching the global optimum of the Hartmann function

Figure 4 shows the performance of test functions with 50variablesThese are themost challenging problems due to thelarge domain space Figure 4(a) depicts the performance of

10 Mathematical Problems in Engineering

0 500 1000

Iterations

100

1010

10minus10

10minus20

Func

tion

valu

e

(a)

0 500 1000Iterations

102

100

10minus2

10minus4

Func

tion

valu

e(b)

0 50 100 150 200Iterations

minus100

minus101

minus102

Func

tion

valu

e

(c)

1020

100

10minus20

10minus40

Func

tion

valu

e

0 2000 4000 6000

Iterations

(d)

0 2000 4000 6000

Iterations

1010

105

100

10minus5

10minus10

Func

tion

valu

e

(e)

0 2000 4000 6000Iterations

1010

105

100

10minus5

Func

tion

valu

e(f)

0 500 1000

Iterations

1020

100

10minus20

10minus40

Func

tion

valu

e

GBCSCS

(g)

0 50 100 150 200

Iterations

Func

tion

valu

e

minus1004

minus1006

GBCSCS

(h)

Figure 3 Evolution of mean best values for GBCS and the original CS algorithm for (a) Powell (b) Power Sum (c) Shekel 5 (d) Wood (e)Cube (f) Stochastic Cube (g) Sphere and (h) Hartmann functions

both algorithms for the Dixon-Price function Clearly GBCSperformed better than CS in the early iterations Even thoughboth algorithms were not able to attain the global minimumtheminimum arrived at by GBCS at early iterations is orders-of-magnitude lower than that arrived at by CS

GBCS outperformed CS in both the Griewank(Figure 4(b)) and the Stochastic Griewank (Figure 4(c))For the Griewank function both algorithms were able toidentify the global minimum However GBCS arrived at the

global minimum at less than the half number of iterationscompared to CS For the Stochastic Griewank CS wasnot able to identify the global minimum even after 5000iterations The result predicted by CS was more than 10orders-of-magnitude higher than the result predicted byGBCS

Figure 4(d) shows the results for the Michaelwics func-tion To the best of the authorsrsquo knowledge the globalminimum for the 50-variableMichaelwics is not known since

Mathematical Problems in Engineering 11

0 2000 4000 6000

Iterations

1010

105

100

10minus5

(a)

0 2000 60004000

IterationsFu

nctio

n va

lue

1010

100

10minus10

10minus20

(b)

0 2000 60004000

Iterations

Func

tion

valu

e

105

100

10minus5

10minus10

10minus15

(c)

0 2000 60004000

Iterations

Func

tion

valu

e

minus1011

minus1013

minus1015

(d)

0 2000 60004000

Iterations

Func

tion

valu

e

105

1010

100

10minus5

(e)Fu

nctio

n va

lue

105

1010

1000 2000 60004000

Iterations

(f)

0 2000 60004000

Iterations

Func

tion

valu

e

105

106

104

103

GBCSCS

(g)

0 2000 60004000

Iterations

Func

tion

valu

e

105

1010

100

GBCSCS

(h)

Figure 4 Evolution of mean best values for GBCS and the original CS algorithm for (a) Dixon-Price (b) Griewank (c) Stochastic Griewank(d) Michaelwics (e) Rosenbrock (f) Stochastic Rosenbrock (g) Trigonometric and (h) Zacherov functions

there is no published literature identifying it GBCS identifiedaminimum that is lower than that identified byCS as depictedin Figure 4(d)

Although neither algorithm identified the global optima of the Rosenbrock and Stochastic Rosenbrock functions, as depicted in Figures 4(e) and 4(f), respectively, the results obtained by GBCS were orders of magnitude lower than those obtained by CS. The superiority of GBCS is clearly demonstrated by these two functions.
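The generalized d-dimensional Rosenbrock function behind Figures 4(e) and 4(f) is a standard benchmark; a minimal sketch with its analytical gradient, as a gradient-assisted search would use it (the stochastic variant is not reproduced here):

```python
import numpy as np

def rosenbrock(x):
    """Generalized Rosenbrock function; global minimum f(1, ..., 1) = 0."""
    x = np.asarray(x, dtype=float)
    return np.sum(100.0 * (x[1:] - x[:-1]**2)**2 + (1.0 - x[:-1])**2)

def rosenbrock_grad(x):
    """Analytical gradient of the generalized Rosenbrock function."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    # contribution of each term to its own variable x_i ...
    g[:-1] = -400.0 * x[:-1] * (x[1:] - x[:-1]**2) - 2.0 * (1.0 - x[:-1])
    # ... and to the following variable x_{i+1}
    g[1:] += 200.0 * (x[1:] - x[:-1]**2)
    return g
```

The narrow curved valley of this function is what makes it hard for purely random walks; the gradient points along the valley floor.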

Figure 4(g) shows the results for the Trigonometric function. For this difficult problem, neither algorithm reached the global minimum within 5000 iterations. Nonetheless, as in most cases studied, GBCS achieved a lower result than CS.

The last of the 50-variable functions tested was the Zakharov function. The results in Figure 4(h) again show the superiority of GBCS, which reached results that are orders of magnitude lower than those obtained by the original CS algorithm.


Table 2 summarizes the evaluation results for the twenty-four benchmark problems. GBCS provided better solutions to the most challenging problems: overall, it achieved better performance in eighteen problems, equivalent performance in two, and worse performance in four of the twenty-four. In one case, the 50-variable Stochastic Griewank function, GBCS successfully obtained the global optimum while CS failed to find it even after 5000 iterations. On the other hand, the four functions for which CS performed better were the Powell, Wood, Cube, and Stochastic Cube functions.

It is interesting to note that GBCS outperformed CS in handling two of the three stochastic functions tested in this study. This result confirms that, despite the use of the gradient as a guide for some moves, the stochastic nature of the algorithm remained unaffected. In fact, the additional information was used quite subtly: cuckoos still search for nests randomly, but with the help of some intelligence.
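The idea described above can be sketched as follows: the magnitude of a local step stays random, but each component moves in the downhill direction indicated by the gradient sign. This is an illustrative interpretation, not the authors' exact update rule, and `step_scale` is an assumed parameter:

```python
import numpy as np

rng = np.random.default_rng(0)

def guided_local_step(x, grad, step_scale=0.01):
    """Take a random-magnitude step whose direction opposes the gradient.

    The step length per coordinate remains stochastic, preserving the random
    character of the walk; only the sign of each move is taken from the gradient.
    """
    step = step_scale * np.abs(rng.standard_normal(x.shape))
    return x - step * np.sign(grad)
```

Applied repeatedly to a convex function such as the sphere (gradient 2x), these steps drive the objective value down while the trajectory itself remains random.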

6. Conclusions

In this study, we used the gradient of the objective function, which is available or easily obtainable for many objective functions in engineering calculations, to improve the performance of one of the most promising stochastic algorithms, the cuckoo search. The proposed modification was implemented subtly: the direction of the local random walk of cuckoos is biased toward the direction of decreasing objective value from the point of view of the cuckoo's location. We evaluated this modification by attempting to find the global optimum of twenty-four benchmark functions. The newly developed GBCS algorithm improved the reliability and effectiveness of the algorithm in all but four of the tested benchmark problems, which included three stochastic functions. In some cases, the global minimum could not be obtained with the original CS algorithm but was obtained with GBCS. Although improved performance was not achieved on the entire set of benchmark problems, GBCS proved to be more reliable and efficient on the majority of the tested problems.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

References

[1] C. A. Floudas and C. E. Gounaris, "A review of recent advances in global optimization," Journal of Global Optimization, vol. 45, no. 1, pp. 3-38, 2009.

[2] L. M. Rios and N. V. Sahinidis, "Derivative-free optimization: a review of algorithms and comparison of software implementations," Journal of Global Optimization, vol. 56, pp. 1247-1293, 2012.

[3] X. S. Yang and S. Deb, "Cuckoo search via Levy flights," in Proceedings of the World Congress on Nature & Biologically Inspired Computing (NaBIC '09), pp. 210-214, IEEE, Coimbatore, India, December 2009.

[4] X.-S. Yang and S. Deb, "Cuckoo search: recent advances and applications," Neural Computing and Applications, vol. 24, no. 1, pp. 169-174, 2014.

[5] J. Piechocki, D. Ambroziak, A. Palkowski, and G. Redlarski, "Use of modified cuckoo search algorithm in the design process of integrated power systems for modern and energy self-sufficient farms," Applied Energy, vol. 114, pp. 901-908, 2014.

[6] G. Kanagaraj, S. Ponnambalam, and N. Jawahar, "A hybrid cuckoo search and genetic algorithm for reliability-redundancy allocation problems," Computers & Industrial Engineering, vol. 66, no. 4, pp. 1115-1124, 2013.

[7] V. Bhargava, S. Fateen, and A. Bonilla-Petriciolet, "Cuckoo search: a new nature-inspired optimization method for phase equilibrium calculations," Fluid Phase Equilibria, vol. 337, pp. 191-200, 2013.

[8] P. K. Mohanty and D. R. Parhi, "Cuckoo search algorithm for the mobile robot navigation," in Swarm, Evolutionary, and Memetic Computing, pp. 527-536, Springer, New York, NY, USA, 2013.

[9] X. S. Yang and S. Deb, "Engineering optimisation by cuckoo search," International Journal of Mathematical Modelling and Numerical Optimisation, vol. 1, no. 4, pp. 330-343, 2010.



Table 2: Values of the mean minima and standard deviations obtained by the CS and GBCS algorithms, compared with the value of the global minima of the twenty-four benchmark problems.

| No. | Benchmark function | Variables | Global min. | Iterations | GBCS mean | GBCS std. dev. | CS mean | CS std. dev. |
|-----|--------------------|-----------|-------------|------------|-----------|----------------|---------|--------------|
| 1 | Ackley | 2 | 0 | 1000 | 0 | 0 | 2.2204E-16 | 6.7752E-16 |
| 2 | Beale | 2 | 0 | 1000 | 0 | 0 | 5.7891E-30 | 1.2165E-29 |
| 3 | Booth | 2 | 0 | 1000 | 0 | 0 | 0 | 0 |
| 4 | Cross-leg table | 2 | -1 | 1000 | -1.1463E-2 | 7.672E-3 | -6.2704E-3 | 3.6529E-3 |
| 5 | Himmelblau | 2 | 0 | 1000 | 1.7058E-28 | 2.836E-28 | 2.5958E-19 | 5.3451E-19 |
| 6 | Levy 13 | 2 | 0 | 1000 | 1.3498E-31 | 6.6809E-47 | 1.3498E-31 | 6.6809E-47 |
| 7 | Matyas | 2 | 0 | 1000 | 2.7691E-54 | 4.728E-54 | 2.0407E-38 | 5.0616E-38 |
| 8 | Schaffer | 2 | 0 | 3000 | 0 | 0 | 7.4015E-18 | 1.9193E-17 |
| 9 | Powell | 4 | 0 | 1000 | 1.8694E-8 | 3.5848E-8 | 1.6296E-13 | 3.4802E-13 |
| 10 | Power Sum | 4 | 0 | 1000 | 1.8328E-4 | 1.6761E-4 | 2.5432E-4 | 1.8167E-4 |
| 11 | Shekel 5 | 4 | -10.536 | 200 | -10.536 | 1.6289E-5 | -10.536 | 1.8421E-2 |
| 12 | Wood | 4 | 0 | 1000 | 2.3726 | 2.2208 | 0.40838 | 0.337 |
| 13 | Cube | 5 | 0 | 5000 | 1.2567 | 0.86542 | 5.782E-8 | 2.5596E-7 |
| 14 | Stochastic Cube | 5 | 0 | 5000 | 7.7438 | 6.9815 | 6.4369 | 5.0292 |
| 15 | Sphere | 5 | 0 | 1000 | 2.5147E-38 | 5.1577E-38 | 1.1371E-21 | 1.2967E-21 |
| 16 | Hartmann | 6 | -3.3224 | 200 | -3.3224 | 4.3959E-10 | -3.3215 | 6.0711E-4 |
| 17 | Dixon-Price | 50 | 0 | 5000 | 4.7094E-2 | 1.6904E-1 | 6.6667E-1 | 2.6103E-6 |
| 18 | Griewank | 50 | 0 | 5000 | 0 | 0 | 3.3651E-10 | 9.4382E-10 |
| 19 | Stochastic Griewank | 50 | 0 | 5000 | 7.2758E-13 | 2.8579E-12 | 6.9263 | 2.0451 |
| 20 | Michalewicz | 50 | unknown | 5000 | -32.263 | 1.3729 | -27.383 | 1.3551 |
| 21 | Rosenbrock | 50 | 0 | 5000 | 0.97368 | 0.5885 | 35.286 | 37.012 |
| 22 | Stochastic Rosenbrock | 50 | 0 | 5000 | 58.599 | 19.122 | 48944 | 46783 |
| 23 | Trigonometric | 50 | 0 | 5000 | 5356.0 | 4536.0 | 19435 | 40337 |
| 24 | Zakharov | 50 | 0 | 5000 | 65.769 | 13.288 | 27031 | 52748 |
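The mean and standard-deviation columns in Table 2 summarize repeated independent runs of each stochastic algorithm; a sketch of how such statistics can be tabulated (`solver` and the run count are placeholders, not the paper's experimental setup):

```python
import numpy as np

def summarize_runs(solver, n_runs=30, seed=0):
    """Run a stochastic solver n_runs times; report mean and std of best values."""
    rng = np.random.default_rng(seed)
    best_values = [solver(rng) for _ in range(n_runs)]
    return float(np.mean(best_values)), float(np.std(best_values))
```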

GBCS performed significantly better than CS, as shown in Figure 2(d). On the other hand, both algorithms were able to identify the global minimum of the Himmelblau function. Figure 2(e) shows the evolution of the mean best values; GBCS performed more effectively than CS. Both algorithms were also able to identify the minimum of the Levy 13 function (Figure 2(f)), although GBCS was significantly more effective than CS. This pattern was repeated with the Matyas function, as shown in Figure 2(g).

The Schaffer function is multimodal. Both GBCS and CS failed to converge to the global minimum within the 10^-10 tolerance at 1000 iterations. Running both algorithms to 3000 iterations resulted in GBCS reaching the global optimum while CS did not, as shown in Figure 2(h). The Schaffer function concludes the two-variable functions; GBCS performed better than CS in all of them.
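Convergence in these tests means matching the known global minimum within a tolerance (10^-10 in the text); a minimal sketch of such a check:

```python
def converged(best_value, global_min, tol=1e-10):
    """True when the best objective value is within tol of the known optimum."""
    return abs(best_value - global_min) <= tol
```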

The Powell function has four variables. The evolution of the mean best values of CS and GBCS, plotted in Figure 3(a), shows that the performance of CS was better than that of GBCS, although both came close to the global minimum. Powell is one of the few test functions for which the use of the gradient did not improve the performance of the CS algorithm.

Minor improvements were observed with the Power Sum and Shekel 5 functions. The global optimum of the Power Sum function was not obtained by either algorithm, as shown in Figure 3(b); both appear to be trapped in a local minimum. For the Shekel 5 function (Figure 3(c)), the global optimum was easily obtained, with a minor improvement in performance for the GBCS algorithm. The global optimum of the Wood function, which also has four variables, was not achieved by either algorithm within 1000 iterations, as shown in Figure 3(d). When they were run for 5000 iterations, GBCS appeared to be trapped in a local minimum and could not reach the global optimum, which was obtained by CS. The Wood function is the only function for which GBCS performance was much worse than that of the original CS algorithm.

The performance of both algorithms on the Cube and Stochastic Cube functions was peculiar. In both cases, GBCS outperformed CS in the early iterations, while CS did better at larger iteration counts. For both functions, as depicted in Figures 3(e) and 3(f), the global minimum was not obtained within 1000 iterations, so both problems were run for 5000. CS was able to come close to the global minimum of the Cube function, while GBCS appeared to be trapped in a local minimum. For



Figure 2: Evolution of mean best values for GBCS and the original CS algorithm for the (a) Ackley, (b) Beale, (c) Booth, (d) Cross-leg table, (e) Himmelblau, (f) Levy 13, (g) Matyas, and (h) Schaffer functions.

the Stochastic Cube function, CS did slightly better at larger iteration counts, but both algorithms failed to reach the global minimum after 5000 iterations.

The Sphere function has five variables, like the Cube function, but is easier to solve. Both algorithms were able to identify the global optimum, as depicted in Figure 3(g), but GBCS considerably outperformed CS in efficiency. The Hartmann function, which has six

variables, was relatively easy to solve for the GBCS algorithm, as shown in Figure 3(h). GBCS arrived at the global minimum, but CS could not do so within the tolerance. GBCS also outperformed CS in efficiency in reaching the global optimum of the Hartmann function.

Figure 4 shows the performance on the test functions with 50 variables. These are the most challenging problems due to the large domain space. Figure 4(a) depicts the performance of



Figure 3 Evolution of mean best values for GBCS and the original CS algorithm for (a) Powell (b) Power Sum (c) Shekel 5 (d) Wood (e)Cube (f) Stochastic Cube (g) Sphere and (h) Hartmann functions

both algorithms for the Dixon-Price function Clearly GBCSperformed better than CS in the early iterations Even thoughboth algorithms were not able to attain the global minimumtheminimum arrived at by GBCS at early iterations is orders-of-magnitude lower than that arrived at by CS

GBCS outperformed CS in both the Griewank(Figure 4(b)) and the Stochastic Griewank (Figure 4(c))For the Griewank function both algorithms were able toidentify the global minimum However GBCS arrived at the

global minimum at less than the half number of iterationscompared to CS For the Stochastic Griewank CS wasnot able to identify the global minimum even after 5000iterations The result predicted by CS was more than 10orders-of-magnitude higher than the result predicted byGBCS

Figure 4(d) shows the results for the Michaelwics func-tion To the best of the authorsrsquo knowledge the globalminimum for the 50-variableMichaelwics is not known since

Mathematical Problems in Engineering 11

0 2000 4000 6000

Iterations

1010

105

100

10minus5

(a)

0 2000 60004000

IterationsFu

nctio

n va

lue

1010

100

10minus10

10minus20

(b)

0 2000 60004000

Iterations

Func

tion

valu

e

105

100

10minus5

10minus10

10minus15

(c)

0 2000 60004000

Iterations

Func

tion

valu

e

minus1011

minus1013

minus1015

(d)

0 2000 60004000

Iterations

Func

tion

valu

e

105

1010

100

10minus5

(e)Fu

nctio

n va

lue

105

1010

1000 2000 60004000

Iterations

(f)

0 2000 60004000

Iterations

Func

tion

valu

e

105

106

104

103

GBCSCS

(g)

0 2000 60004000

Iterations

Func

tion

valu

e

105

1010

100

GBCSCS

(h)

Figure 4 Evolution of mean best values for GBCS and the original CS algorithm for (a) Dixon-Price (b) Griewank (c) Stochastic Griewank(d) Michaelwics (e) Rosenbrock (f) Stochastic Rosenbrock (g) Trigonometric and (h) Zacherov functions

there is no published literature identifying it GBCS identifiedaminimum that is lower than that identified byCS as depictedin Figure 4(d)

Although both algorithms were not able to identifythe global optima of the Rosenbrock and the StochasticRosenbrock functions as depicted in Figures 4(e) and 4(f)respectively the results obtained by GBCS were orders-of-magnitude lower than those obtained by CS The superiorityof the GBCS is clearly demonstrated with those two func-tions

Figure 4(g) shows the results for the Trigonometric func-tion For this difficult problem both algorithms were notable to reach the global minimum within 5000 iterations Inthis function nonetheless as in most cases studied GBCSachieved a lower result than CS

The last of the 50-variable functions tested was theZacherov function The results of Figure 4(h) also show thesuperiority of GBCS in reaching results that are orders-of-magnitude lower than those obtained by the original CSalgorithm

12 Mathematical Problems in Engineering

Table 2 shows a summary of the evaluation results for thetwenty-four benchmark problems GBCS was able to providebetter solutions to all challenging problemsGBCSwas able toachieve better performance in eighteen problems equivalentperformance in twoproblems andworse performance in fourout of the twenty-four problems In one case the StochasticGriewank function the global optimum was successfullyobtained by GBCS for this 50-variable problem while CSfailed to find the global optimum even after 5000 iterationsOn the other hand the four functions for which CS per-formed better were the Powell Wood Cube and StochasticCube functions

It is interesting to note that GBCS outperformed CSin handling two of the three stochastic functions tested inthis study This result confirms that despite the use of thegradient as a guide upon which some moves are basedthe stochastic nature of the algorithm remained unaffectedIn fact the additional information was used quite subtlyCuckoos still search for nests randomly but with the help ofsome intelligence

6 Conclusions

In this study we use the gradient of the objective functionwhich could be available or easily obtainable for severalobjective functions in engineering calculations to improvethe performance of one of the most promising stochasticalgorithms the cuckoo search The proposed modificationwas subtly implemented in the algorithm by changing thedirection of the local random walk of cuckoos towards thedirection of the minimum value from the point of viewof the cuckoorsquos location We evaluated this modification byattempting to find the global optimum of twenty-four bench-mark functionsThe newly developed GBCS algorithm led toimproved reliability and effectiveness of the algorithm in allbut four of tested benchmark problems which included threestochastic functions In some cases the global minimumcould not have been obtained via the original CS algorithmbut was obtained via GBCS Although the improvementin performance was not achieved with the entire set ofbenchmark problems GBCS proved to be more reliable andefficient in the majority of the tested problems

Conflict of InterestsThe authors declare that there is no conflict of interestsregarding the publication of this paper

References

[1] C A Floudas and C E Gounaris ldquoA review of recent advancesin global optimizationrdquo Journal of Global Optimization vol 45no 1 pp 3ndash38 2009

[2] L M Rios and N V Sahinidis ldquoDerivative-free optimizationa review of algorithms and comparison of software implemen-tationsrdquo Journal of Global Optimization vol 56 pp 1247ndash12932012

[3] X S Yang and S Deb ldquoCuckoo search via Levy flightsrdquo in Pro-ceedings of the World Congress on Nature amp Biologically InspiredComputing (NABIC rsquo09) pp 210ndash214 IEEE Coimbatore IndiaDecember 2009

[4] X-S Yang and S Deb ldquoCuckoo search recent advances andapplicationsrdquo Neural Computing and Applications vol 24 no1 pp 169ndash174 2014

[5] J Piechocki D Ambroziak A Palkowski and G RedlarskildquoUse of modified cuckoo search algorithm in the design processof integrated power systems for modern and energy self-sufficient farmsrdquo Applied Energy vol 114 pp 901ndash908 2014

[6] G Kanagaraj S Ponnambalam and N Jawahar ldquoA hybridcuckoo search and genetic algorithm for reliability-redundancyallocation problemsrdquo Computers amp Industrial Engineering vol66 no 4 pp 1115ndash1124 2013

[7] V Bhargava S Fateen and A Bonilla-Petriciolet ldquoCuckoosearch a new nature-inspired optimization method for phaseequilibrium calculationsrdquo Fluid Phase Equilibria vol 337 pp191ndash200 2013

[8] P KMohanty andD R Parhi ldquoCuckoo search algorithm for themobile robot navigationrdquo in Swarm Evolutionary and MemeticComputing pp 527ndash536 Springer New York NY USA 2013

[9] X S Yang and S Deb ldquoEngineering optimisation by cuckoosearchrdquo International Journal of Mathematical Modelling andNumerical Optimisation vol 1 no 4 pp 330ndash343 2010

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Mathematical Problems in Engineering 9

0 500 1000

Func

tion

valu

e

Iterations

1010

100

10minus10

10minus20

(a)

0 500 1000

Func

tion

valu

eIterations

100

10minus20

10minus30

10minus10

10minus40

(b)

0 500 1000

Func

tion

valu

e

Iterations

1020

10minus20

10minus40

100

(c)

0 500 1000

Iterations

minus10minus4

minus10minus3

minus10minus2

minus10minus1

Func

tion

valu

e

(d)

0 500 1000

Iterations

1010

100

10minus10

10minus20

10minus30

Func

tion

valu

e

(e)

0 500 1000

Iterations

1020

100

10minus20

10minus40

Func

tion

valu

e(f)

0 500 1000

Iterations

GBCSCS

100

10minus20

10minus40

10minus60

Func

tion

valu

e

(g)

0 1000 2000 3000

Iterations

GBCSCS

100

10minus5

10minus10

10minus20

10minus15

Func

tion

valu

e

(h)

Figure 2 Evolution of mean best values for GBCS and the original CS algorithm for (a) Ackley (b) Beale (c) Booth (d) Cross-leg table (e)Himmelblau (f) Levy 13 (g) Matyas and (h) Schaffer functions

the Stochastic Cube function CS did slightly better at largeriterations but both algorithms failed to reach the globalminimum after 5000 iterations

The sphere function has five variables as the Cubefunction but it is easier to solve Both algorithms were ableto identify the global optimum as depicted in Figure 3(g)but GBCS considerably outperformed CS in terms of per-formance efficiency The Hartmann function which has 6

variables was relatively easy to solve for GBCS algorithms asshown in Figure 3(h) GBCS arrived at the global minimumbut CS could not do so within the tolerance GBCS alsooutperformed CS in terms of performance efficiency inreaching the global optimum of the Hartmann function

Figure 4 shows the performance of test functions with 50variablesThese are themost challenging problems due to thelarge domain space Figure 4(a) depicts the performance of

10 Mathematical Problems in Engineering

0 500 1000

Iterations

100

1010

10minus10

10minus20

Func

tion

valu

e

(a)

0 500 1000Iterations

102

100

10minus2

10minus4

Func

tion

valu

e(b)

0 50 100 150 200Iterations

minus100

minus101

minus102

Func

tion

valu

e

(c)

1020

100

10minus20

10minus40

Func

tion

valu

e

0 2000 4000 6000

Iterations

(d)

0 2000 4000 6000

Iterations

1010

105

100

10minus5

10minus10

Func

tion

valu

e

(e)

0 2000 4000 6000Iterations

1010

105

100

10minus5

Func

tion

valu

e(f)

0 500 1000

Iterations

1020

100

10minus20

10minus40

Func

tion

valu

e

GBCSCS

(g)

0 50 100 150 200

Iterations

Func

tion

valu

e

minus1004

minus1006

GBCSCS

(h)

Figure 3 Evolution of mean best values for GBCS and the original CS algorithm for (a) Powell (b) Power Sum (c) Shekel 5 (d) Wood (e)Cube (f) Stochastic Cube (g) Sphere and (h) Hartmann functions

both algorithms for the Dixon-Price function Clearly GBCSperformed better than CS in the early iterations Even thoughboth algorithms were not able to attain the global minimumtheminimum arrived at by GBCS at early iterations is orders-of-magnitude lower than that arrived at by CS

GBCS outperformed CS in both the Griewank(Figure 4(b)) and the Stochastic Griewank (Figure 4(c))For the Griewank function both algorithms were able toidentify the global minimum However GBCS arrived at the

global minimum at less than the half number of iterationscompared to CS For the Stochastic Griewank CS wasnot able to identify the global minimum even after 5000iterations The result predicted by CS was more than 10orders-of-magnitude higher than the result predicted byGBCS

Figure 4(d) shows the results for the Michaelwics func-tion To the best of the authorsrsquo knowledge the globalminimum for the 50-variableMichaelwics is not known since

Mathematical Problems in Engineering 11

0 2000 4000 6000

Iterations

1010

105

100

10minus5

(a)

0 2000 60004000

IterationsFu

nctio

n va

lue

1010

100

10minus10

10minus20

(b)

0 2000 60004000

Iterations

Func

tion

valu

e

105

100

10minus5

10minus10

10minus15

(c)

0 2000 60004000

Iterations

Func

tion

valu

e

minus1011

minus1013

minus1015

(d)

0 2000 60004000

Iterations

Func

tion

valu

e

105

1010

100

10minus5

(e)Fu

nctio

n va

lue

105

1010

1000 2000 60004000

Iterations

(f)

0 2000 60004000

Iterations

Func

tion

valu

e

105

106

104

103

GBCSCS

(g)

0 2000 60004000

Iterations

Func

tion

valu

e

105

1010

100

GBCSCS

(h)

Figure 4 Evolution of mean best values for GBCS and the original CS algorithm for (a) Dixon-Price (b) Griewank (c) Stochastic Griewank(d) Michaelwics (e) Rosenbrock (f) Stochastic Rosenbrock (g) Trigonometric and (h) Zacherov functions

there is no published literature identifying it GBCS identifiedaminimum that is lower than that identified byCS as depictedin Figure 4(d)

Although both algorithms were not able to identifythe global optima of the Rosenbrock and the StochasticRosenbrock functions as depicted in Figures 4(e) and 4(f)respectively the results obtained by GBCS were orders-of-magnitude lower than those obtained by CS The superiorityof the GBCS is clearly demonstrated with those two func-tions

Figure 4(g) shows the results for the Trigonometric func-tion For this difficult problem both algorithms were notable to reach the global minimum within 5000 iterations Inthis function nonetheless as in most cases studied GBCSachieved a lower result than CS

The last of the 50-variable functions tested was theZacherov function The results of Figure 4(h) also show thesuperiority of GBCS in reaching results that are orders-of-magnitude lower than those obtained by the original CSalgorithm

12 Mathematical Problems in Engineering

Table 2 shows a summary of the evaluation results for thetwenty-four benchmark problems GBCS was able to providebetter solutions to all challenging problemsGBCSwas able toachieve better performance in eighteen problems equivalentperformance in twoproblems andworse performance in fourout of the twenty-four problems In one case the StochasticGriewank function the global optimum was successfullyobtained by GBCS for this 50-variable problem while CSfailed to find the global optimum even after 5000 iterationsOn the other hand the four functions for which CS per-formed better were the Powell Wood Cube and StochasticCube functions

It is interesting to note that GBCS outperformed CSin handling two of the three stochastic functions tested inthis study This result confirms that despite the use of thegradient as a guide upon which some moves are basedthe stochastic nature of the algorithm remained unaffectedIn fact the additional information was used quite subtlyCuckoos still search for nests randomly but with the help ofsome intelligence

6 Conclusions

In this study we use the gradient of the objective functionwhich could be available or easily obtainable for severalobjective functions in engineering calculations to improvethe performance of one of the most promising stochasticalgorithms the cuckoo search The proposed modificationwas subtly implemented in the algorithm by changing thedirection of the local random walk of cuckoos towards thedirection of the minimum value from the point of viewof the cuckoorsquos location We evaluated this modification byattempting to find the global optimum of twenty-four bench-mark functionsThe newly developed GBCS algorithm led toimproved reliability and effectiveness of the algorithm in allbut four of tested benchmark problems which included threestochastic functions In some cases the global minimumcould not have been obtained via the original CS algorithmbut was obtained via GBCS Although the improvementin performance was not achieved with the entire set ofbenchmark problems GBCS proved to be more reliable andefficient in the majority of the tested problems

Conflict of InterestsThe authors declare that there is no conflict of interestsregarding the publication of this paper

References

[1] C A Floudas and C E Gounaris ldquoA review of recent advancesin global optimizationrdquo Journal of Global Optimization vol 45no 1 pp 3ndash38 2009

[2] L M Rios and N V Sahinidis ldquoDerivative-free optimizationa review of algorithms and comparison of software implemen-tationsrdquo Journal of Global Optimization vol 56 pp 1247ndash12932012

[3] X S Yang and S Deb ldquoCuckoo search via Levy flightsrdquo in Pro-ceedings of the World Congress on Nature amp Biologically InspiredComputing (NABIC rsquo09) pp 210ndash214 IEEE Coimbatore IndiaDecember 2009

[4] X-S Yang and S Deb ldquoCuckoo search recent advances andapplicationsrdquo Neural Computing and Applications vol 24 no1 pp 169ndash174 2014

[5] J Piechocki D Ambroziak A Palkowski and G RedlarskildquoUse of modified cuckoo search algorithm in the design processof integrated power systems for modern and energy self-sufficient farmsrdquo Applied Energy vol 114 pp 901ndash908 2014

[6] G Kanagaraj S Ponnambalam and N Jawahar ldquoA hybridcuckoo search and genetic algorithm for reliability-redundancyallocation problemsrdquo Computers amp Industrial Engineering vol66 no 4 pp 1115ndash1124 2013

[7] V Bhargava S Fateen and A Bonilla-Petriciolet ldquoCuckoosearch a new nature-inspired optimization method for phaseequilibrium calculationsrdquo Fluid Phase Equilibria vol 337 pp191ndash200 2013

[8] P KMohanty andD R Parhi ldquoCuckoo search algorithm for themobile robot navigationrdquo in Swarm Evolutionary and MemeticComputing pp 527ndash536 Springer New York NY USA 2013

[9] X S Yang and S Deb ldquoEngineering optimisation by cuckoosearchrdquo International Journal of Mathematical Modelling andNumerical Optimisation vol 1 no 4 pp 330ndash343 2010


10 Mathematical Problems in Engineering

Figure 3: Evolution of mean best values for GBCS and the original CS algorithm for (a) Powell, (b) Power Sum, (c) Shekel 5, (d) Wood, (e) Cube, (f) Stochastic Cube, (g) Sphere, and (h) Hartmann functions.

both algorithms for the Dixon-Price function. Clearly, GBCS performed better than CS in the early iterations. Even though neither algorithm was able to attain the global minimum, the minimum arrived at by GBCS in the early iterations is orders of magnitude lower than that arrived at by CS.

GBCS outperformed CS on both the Griewank (Figure 4(b)) and the Stochastic Griewank (Figure 4(c)) functions. For the Griewank function, both algorithms were able to identify the global minimum; however, GBCS arrived at the global minimum in less than half the number of iterations required by CS. For the Stochastic Griewank, CS was not able to identify the global minimum even after 5000 iterations: the result obtained by CS was more than 10 orders of magnitude higher than that obtained by GBCS.
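The premise of GBCS is that the gradient is often readily available in closed form. As an illustration (a generic sketch using the standard d-dimensional Griewank definition, which may differ in scaling from the exact variant benchmarked in the paper), here is the Griewank function together with its analytic gradient:

```python
import math

def griewank(x):
    """Standard d-dimensional Griewank function (global minimum 0 at x = 0)."""
    s = sum(xi * xi for xi in x) / 4000.0
    p = math.prod(math.cos(xi / math.sqrt(i + 1)) for i, xi in enumerate(x))
    return 1.0 + s - p

def griewank_grad(x):
    """Analytic gradient of the Griewank function, component by component."""
    cosines = [math.cos(xi / math.sqrt(i + 1)) for i, xi in enumerate(x)]
    grad = []
    for i, xi in enumerate(x):
        # Product of all cosine factors except the i-th one
        prod_others = 1.0
        for j, c in enumerate(cosines):
            if j != i:
                prod_others *= c
        si = math.sqrt(i + 1)
        # d/dx_i [x_i^2/4000] + d/dx_i [-cos(x_i/sqrt(i+1))] * prod_others
        grad.append(xi / 2000.0 + math.sin(xi / si) / si * prod_others)
    return grad
```

A gradient like this costs little more than one extra function evaluation per point, which is the kind of cheap, exact side information GBCS exploits.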

Figure 4(d) shows the results for the Michalewicz function. To the best of the authors' knowledge, the global minimum of the 50-variable Michalewicz function is not known, since there is no published literature identifying it. GBCS identified a minimum that is lower than that identified by CS, as depicted in Figure 4(d).

Figure 4: Evolution of mean best values for GBCS and the original CS algorithm for (a) Dixon-Price, (b) Griewank, (c) Stochastic Griewank, (d) Michalewicz, (e) Rosenbrock, (f) Stochastic Rosenbrock, (g) Trigonometric, and (h) Zakharov functions.

Although neither algorithm was able to identify the global optima of the Rosenbrock and the Stochastic Rosenbrock functions, as depicted in Figures 4(e) and 4(f), respectively, the results obtained by GBCS were orders of magnitude lower than those obtained by CS. The superiority of GBCS is clearly demonstrated by these two functions.

Figure 4(g) shows the results for the Trigonometric function. For this difficult problem, neither algorithm was able to reach the global minimum within 5000 iterations. For this function, nonetheless, as in most cases studied, GBCS achieved a lower result than CS.

The last of the 50-variable functions tested was the Zakharov function. The results in Figure 4(h) also show the superiority of GBCS, which reached results that are orders of magnitude lower than those obtained by the original CS algorithm.


Table 2 shows a summary of the evaluation results for the twenty-four benchmark problems. GBCS was able to provide better solutions to all challenging problems: it achieved better performance in eighteen problems, equivalent performance in two problems, and worse performance in four out of the twenty-four problems. In one case, the Stochastic Griewank function, the global optimum of this 50-variable problem was successfully obtained by GBCS, while CS failed to find the global optimum even after 5000 iterations. On the other hand, the four functions for which CS performed better were the Powell, Wood, Cube, and Stochastic Cube functions.

It is interesting to note that GBCS outperformed CS in handling two of the three stochastic functions tested in this study. This result confirms that, despite the use of the gradient as a guide upon which some moves are based, the stochastic nature of the algorithm remained unaffected. In fact, the additional information was used quite subtly: cuckoos still search for nests randomly, but with the help of some intelligence.
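The gradient guidance described above can be sketched as follows. This is a minimal, hypothetical reading of the modification: the magnitudes of the random local-walk step are kept, but each component's sign is flipped so the move points downhill at the cuckoo's current location. It is not the authors' full GBCS, which also retains the Lévy flights and nest-abandonment steps of CS.

```python
def gradient_guided_step(x, grad, step):
    """Gradient-guided local walk (sketch): keep the random step sizes,
    but orient each component against the gradient sign so the cuckoo
    moves towards lower function values from its own point of view."""
    new_x = []
    for xi, gi, si in zip(x, grad, step):
        # Descend: positive gradient component -> step in the negative direction
        move = abs(si) if gi < 0 else -abs(si)
        new_x.append(xi + move)
    return new_x
```

On a smooth region this turns a blind random perturbation into a descent move of random length, while the randomness of the step magnitudes (and of the rest of the CS machinery) is untouched.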

6. Conclusions

In this study, we used the gradient of the objective function, which is available or easily obtainable for many objective functions in engineering calculations, to improve the performance of one of the most promising stochastic algorithms, the cuckoo search. The proposed modification was subtly implemented in the algorithm by changing the direction of the local random walk of cuckoos towards the direction of the minimum value from the point of view of the cuckoo's location. We evaluated this modification by attempting to find the global optimum of twenty-four benchmark functions. The newly developed GBCS algorithm improved the reliability and effectiveness of the algorithm in all but four of the tested benchmark problems, which included three stochastic functions. In some cases, the global minimum could not be obtained via the original CS algorithm but was obtained via GBCS. Although the improvement in performance was not achieved over the entire set of benchmark problems, GBCS proved to be more reliable and efficient in the majority of the tested problems.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

References

[1] C. A. Floudas and C. E. Gounaris, “A review of recent advances in global optimization,” Journal of Global Optimization, vol. 45, no. 1, pp. 3–38, 2009.

[2] L. M. Rios and N. V. Sahinidis, “Derivative-free optimization: a review of algorithms and comparison of software implementations,” Journal of Global Optimization, vol. 56, pp. 1247–1293, 2012.

[3] X. S. Yang and S. Deb, “Cuckoo search via Lévy flights,” in Proceedings of the World Congress on Nature & Biologically Inspired Computing (NaBIC ’09), pp. 210–214, IEEE, Coimbatore, India, December 2009.

[4] X.-S. Yang and S. Deb, “Cuckoo search: recent advances and applications,” Neural Computing and Applications, vol. 24, no. 1, pp. 169–174, 2014.

[5] J. Piechocki, D. Ambroziak, A. Palkowski, and G. Redlarski, “Use of modified cuckoo search algorithm in the design process of integrated power systems for modern and energy self-sufficient farms,” Applied Energy, vol. 114, pp. 901–908, 2014.

[6] G. Kanagaraj, S. Ponnambalam, and N. Jawahar, “A hybrid cuckoo search and genetic algorithm for reliability-redundancy allocation problems,” Computers & Industrial Engineering, vol. 66, no. 4, pp. 1115–1124, 2013.

[7] V. Bhargava, S. Fateen, and A. Bonilla-Petriciolet, “Cuckoo search: a new nature-inspired optimization method for phase equilibrium calculations,” Fluid Phase Equilibria, vol. 337, pp. 191–200, 2013.

[8] P. K. Mohanty and D. R. Parhi, “Cuckoo search algorithm for the mobile robot navigation,” in Swarm, Evolutionary, and Memetic Computing, pp. 527–536, Springer, New York, NY, USA, 2013.

[9] X. S. Yang and S. Deb, “Engineering optimisation by cuckoo search,” International Journal of Mathematical Modelling and Numerical Optimisation, vol. 1, no. 4, pp. 330–343, 2010.





