
Research Article
Improved Backtracking Search Algorithm Based on Population Control Factor and Optimal Learning Strategy

Lei Zhao,1 Zhicheng Jia,1 Lei Chen,2,3 and Yanju Guo1

1School of Electronic Information Engineering, Hebei University of Technology, Tianjin 300401, China
2School of Information Engineering, Tianjin University of Commerce, Tianjin 300134, China
3School of Precision Instrument and Opto-Electronics Engineering, Tianjin University, Tianjin 300072, China

Correspondence should be addressed to Lei Chen; chenlei@tjcu.edu.cn

Received 3 January 2017; Revised 10 May 2017; Accepted 12 June 2017; Published 24 July 2017

Academic Editor: Erik Cuevas

Copyright © 2017 Lei Zhao et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Backtracking search algorithm (BSA) is a relatively new evolutionary algorithm with good optimization performance, just like other population-based algorithms. However, BSA suffers from limited convergence speed and convergence precision. To address this problem, this article proposes an improved BSA named COBSA. Inspired by the particle swarm optimization (PSO) algorithm, a population control factor is added to the variation equation to improve the convergence speed of BSA and to give the algorithm a better ability to escape local optima. In addition, inspired by the differential evolution (DE) algorithm, this article proposes a novel evolutionary equation in which the disadvantaged group searches only around the best individual chosen from the previous iteration, so as to enhance the ability of local search. Simulation experiments on a set of 18 benchmark functions show that, in general, COBSA displays obvious superiority in convergence speed and convergence precision when compared with BSA and the comparison algorithms.

1. Introduction

With the rapid development of science and technology, people place high demands on optimization algorithms, and evolutionary algorithms are popular owing to their simple structure and small number of parameters, for example, the particle swarm optimization (PSO) algorithm [1], the ant colony optimization (ACO) algorithm [2], and the genetic algorithm (GA) [3]. However, a higher requirement is placed on optimization algorithms when multidimensional, multimodal, and nonlinear problems arise in academic research. Therefore, in recent decades many scholars have presented modified evolutionary algorithms aiming to find the best values for a system's parameters in different circumstances. By taking advantage of the optimum solution, literature [4] amends the search equation used while the PSO algorithm operates, aiming to accelerate convergence speed. Inspired by biological evolution, literature [5] developed the differential evolution (DE) algorithm, and an adaptive evolutionary strategy based on DE is proposed in literature [6] to raise the efficiency of the DE algorithm. The differential search algorithm (DSA), equipped with new crossover and mutation strategies, is presented in literature [7]. Learning from natural systems, literatures [8-10] put forward the artificial bee colony (ABC) algorithm inspired by the natural behavior of bees, the cuckoo search (CS) algorithm inspired by the flight behavior of cuckoos, and the bat algorithm (BA) inspired by the foraging behavior of bats.

Besides, these evolutionary algorithms have found increasingly wide and successful use in many fields, such as blind source separation [11], hyperspectral unmixing [12], and image processing [13].

BSA is a novel computation method proposed by Civicioglu in 2013 [14]. BSA employs fewer parameters and has a simpler process that is efficient, fast, and able to handle various conditions, which makes it easier to apply to different complex problems. Literature [14] demonstrates that BSA has superiority similar to that of other evolutionary algorithms together with strong global search ability. Owing to its simplicity and high efficiency, BSA has caught many scholars' attention [15, 16].

However, some challenging complications also arise in BSA. For example, the convergence speed of BSA is typically slower than that of the typical evolutionary algorithms (DE and PSO) in the early stage as a result of ignoring the importance of the optimal individual, which makes it difficult for BSA to achieve satisfactory outcomes when handling some complex problems. Therefore, accelerating convergence speed and improving convergence precision have become two relatively valuable and significant targets in the study of BSA. A number of modified BSAs aiming to reach the above two targets have hence been put forward since its appearance. For example, literature [17] focuses on improving convergence precision by proposing an optimal evolution equation and an optimal search equation but brings extra evaluations. Literatures [18, 19] focus on convergence precision and global convergence by employing a variation coefficient and a selection mechanism but sacrifice iteration numbers as a result. An improved BSA is proposed in this article with the purpose of accelerating convergence speed and improving convergence precision under lower iteration numbers. Firstly, to accelerate the convergence speed of the population in the early stage, a population control factor is added to the evolution equation. Furthermore, an evolutionary equation in which the disadvantaged group searches only around the best individual chosen from the previous iteration is proposed to enhance the ability of local search. Simulation experiments based on 18 benchmark functions show that the improved algorithm can improve the performance and productivity of BSA effectively and is a feasible evolutionary algorithm.

The rest of this article is arranged as follows. The second section describes BSA briefly; the third section introduces the improved algorithm; the fourth section presents the simulation results; the fifth section discusses the variance $q$ appearing in the first strategy; and the last section draws a conclusion.

2. Backtracking Search Algorithm

Similar to other evolutionary algorithms, BSA is a population-based evolutionary algorithm designed to find a global optimum. Its solving process can be described by dividing its operation into five steps, as is done in other evolutionary algorithms: population initialization, historical population setting, population evolution, population crossover, and greedy selection. The specific steps of the basic BSA can be described as follows.

2.1. Population Initialization. $N$ randomly generated $D$-dimensional vectors make up the initial solution of BSA. The random initialization method can be summarized as follows:

$\mathrm{Pop}_{i,j} \sim U(\mathrm{low}_j, \mathrm{up}_j)$ (1)

where $i \in \{1, 2, 3, \ldots, N\}$ and $j \in \{1, 2, 3, \ldots, D\}$; low and up stand for the lower and upper bounds of the search space, respectively; and $U$ is the uniform distribution function.
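As a minimal illustration, (1) can be sketched in a few lines of Python/NumPy (the function name and the use of NumPy's default generator are illustrative choices, not from the article):

```python
import numpy as np

def init_population(N, D, low, up, rng=None):
    """Eq. (1): Pop[i, j] ~ U(low[j], up[j]) for an N-by-D population."""
    rng = np.random.default_rng() if rng is None else rng
    low = np.asarray(low, dtype=float)
    up = np.asarray(up, dtype=float)
    return low + (up - low) * rng.random((N, D))
```

For example, `init_population(30, 30, [-100] * 30, [100] * 30)` produces a population of the size and search space used for Sphere $f_1$ in Table 1.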

2.2. Historical Population Setting. BSA randomly chooses the previous population by appointing Oldpop as the historical population, employed with the intent of controlling the search space, and it remembers this position until Oldpop gets changed. What gives BSA its memory function is two numbers $a$ and $b$ generated randomly when each cycle starts: if $a < b$, then Oldpop = Pop, and meanwhile the order of the individuals in Oldpop is permuted randomly.

2.3. Population Evolution. At this stage, BSA generates the new population, making full use of its previous experience based on Pop and Oldpop. The evolutionary equation is as follows:

$M = \mathrm{Pop} + F \ast (\mathrm{Oldpop} - \mathrm{Pop})$ (2)

where $F_i = 3 \ast \mathrm{rand}$, $i \in \{1, 2, 3, \ldots, N\}$, and rand is a random number in the range [0, 1]. $F$ is a parameter controlling the amplitude of the search-direction matrix $(\mathrm{Oldpop} - \mathrm{Pop})$.

2.4. Population Crossover. The population crossover strategy of BSA, different from that of DE, is that a mixed proportion parameter is employed to control the number of crossed components. The equation is given as follows:

$T_{i,j} = \begin{cases} M_{i,j}, & \mathrm{map}_{i,j} = 1 \\ \mathrm{Pop}_{i,j}, & \mathrm{map}_{i,j} = 0 \end{cases}$ (3)

where map is an $N \times D$ matrix which only contains 0 and 1 and whose initial value is 1. Its specific calculation formula is given as follows:

$\begin{cases} \mathrm{map}_{i,\, u(1:\lceil mr \ast \mathrm{rand} \ast D\rceil)} = 0, & a < b \\ \mathrm{map}_{i,\, \mathrm{randi}(D)} = 0, & a \ge b \end{cases}$ (4)

where randi is an integer selected randomly from 0 to $D$, $mr$ is the mixed proportion parameter with $mr = 1$, rand is chosen randomly in the range [0, 1], and so are $a$ and $b$; $u$ is a randomly sorted integer vector with $u \in \{1, 2, 3, \ldots, D\}$. The map in the crossover stage controls the number of components that will mutate by using $\lceil mr \ast \mathrm{rand} \ast D\rceil$ and $\mathrm{randi}(D)$. When $a < b$, $\mathrm{map}_i$ is a binary vector containing many randomly placed zeros; otherwise $\mathrm{map}_i$ contains just a single element "0". Some individuals generated in the crossover stage can overflow the limited search space as a result of the crossover strategy. For this reason, boundary control is applied to the individuals beyond the search space by using (1) again.
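To make the interplay of (2)-(4) concrete, the following Python/NumPy sketch generates one trial population under the article's convention that map is initialized to 1 (take the mutant $M$) and the positions selected by (4) are reset to 0 (keep Pop). The per-individual form of $F$ and the helper name are assumptions made for illustration:

```python
import numpy as np

def bsa_trial_population(pop, oldpop, low, up, mixrate=1.0, rng=None):
    """Mutation (2), crossover (3)-(4), and boundary control via (1)."""
    rng = np.random.default_rng() if rng is None else rng
    N, D = pop.shape

    F = 3.0 * rng.random((N, 1))            # F_i = 3 * rand, one amplitude per individual
    M = pop + F * (oldpop - pop)            # Eq. (2)

    # Eq. (4): choose which components keep the parent value (map = 0).
    mapm = np.ones((N, D), dtype=bool)      # initial value 1: take the mutant
    if rng.random() < rng.random():         # a < b: several random positions per row
        for i in range(N):
            k = max(1, int(np.ceil(mixrate * rng.random() * D)))
            mapm[i, rng.permutation(D)[:k]] = False
    else:                                   # a >= b: exactly one random position per row
        mapm[np.arange(N), rng.integers(0, D, size=N)] = False

    T = np.where(mapm, M, pop)              # Eq. (3)

    # Components leaving [low, up] are re-initialized with Eq. (1).
    out = (T < low) | (T > up)
    T[out] = (low + (up - low) * rng.random((N, D)))[out]
    return T
```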

2.5. Greedy Selection. When BSA finishes the above steps, the individuals of the new population $T_i$ that have better positions than those of the original population $\mathrm{Pop}_i$ are used to update $\mathrm{Pop}_i$ through greedy selection and become the new candidate solutions. Meanwhile, if the best individual of the new population has a lower fitness than the optimum obtained from previous iterations, it replaces the old one to become the new optimum; $\mathrm{Pop}_i$ and the optimum remain unchanged when the conditions mentioned above are not satisfied.


3. The Improved Backtracking Search Algorithm

In the above part, BSA is introduced briefly, and it can easily be seen that BSA, equipped with a single control parameter, a trial population (Oldpop), and a simple structure, has a good ability to explore the search space in the early stage. However, it is a common phenomenon for evolutionary algorithms to fall into local optima easily when facing multimodal and nonlinear problems, and so does BSA. In this regard, an improved BSA based on a population control factor and an optimal learning strategy, inspired by PSO and DE respectively, is presented in this article. On the one hand, a population control factor that regulates the search direction of the population is employed while BSA operates; on the other hand, the disadvantaged group is guided to perform optimal learning with the help of the optimal individual of the previous iteration. The specific improvements are as follows.

3.1. Improved BSA Based on Population Control Factor (CBSA). A shortage of BSA is that Pop in the variation equation (2) has a stochastic character, which makes BSA lack memory about the position of the population while it follows the trend of expanding the search space and exploring new search areas; namely, Pop has a tendency toward global optimization. Therefore, we can take global search first and local fine search later into account. Inspired by literature [20], an inertia weight is introduced into the evolutionary equation (2), and the new evolutionary equation is as follows:

$M = w \ast \mathrm{Pop} + F \ast (\mathrm{Oldpop} - \mathrm{Pop})$ (5)

where $w$ is a variable parameter changing linearly with the iteration number. The target we just referred to above can be reached because the algorithm has a stronger ability of global search when $w$ is larger and tends toward local search when $w$ is smaller. Thus, we can employ the global coefficient adaptively to improve search accuracy and search efficiency when facing some complex optimization problems. But every coin has two sides: this strategy cannot reflect the actual optimization search process when the algorithm faces some nonlinear problems, as a result of the linearity of $w$, so this article further presents the following search equation:

$M = m \ast w \ast \mathrm{Pop} + F \ast (\mathrm{Oldpop} - \mathrm{Pop})$ (6)

where $m$ is an $N \times D$ matrix whose elements follow a normal distribution with mean $p = 1$ and variance $q = 0.3$. Thus, a disturbance is applied to $(w \ast \mathrm{Pop})$ to help the algorithm escape local optima when handling some nonlinear problems. Finally, the overall evolution equation is summarized as

$M = c \ast \mathrm{Pop} + F \ast (\mathrm{Oldpop} - \mathrm{Pop})$ (7)

where $c = m \ast w$ is named the population control factor. The above evolutionary equation can not only improve the convergence speed with the help of $w$ but also enhance the ability of escaping local optima with the help of $m$.
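A sketch of how $c$ could be formed is given below (Python/NumPy). The article states only that $w$ changes linearly with the iteration number; the particular end points $w_{\max} = 0.9$ and $w_{\min} = 0.4$ below are an assumption borrowed from the PSO inertia-weight setting of [20], while the mean 1 and variance 0.3 of $m$ follow Section 3.1:

```python
import numpy as np

def control_factor(t, t_max, N, D, w_max=0.9, w_min=0.4, q=0.3, rng=None):
    """Population control factor c = m * w used in Eq. (7)."""
    rng = np.random.default_rng() if rng is None else rng
    w = w_max - (w_max - w_min) * t / t_max                  # assumed linearly decreasing schedule
    m = rng.normal(loc=1.0, scale=np.sqrt(q), size=(N, D))   # mean p = 1, variance q = 0.3
    return m * w

def cbsa_mutation(pop, oldpop, c, rng=None):
    """Eq. (7): M = c * Pop + F * (Oldpop - Pop)."""
    rng = np.random.default_rng() if rng is None else rng
    F = 3.0 * rng.random((pop.shape[0], 1))
    return c * pop + F * (oldpop - pop)
```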

Figure 1: Standard normal distribution (a new individual appears near the optimal solution with high probability and far away from the optimal solution with small probability).

Simulation experiments strongly indicate that the improved algorithm achieves a much better convergence speed.

3.2. Improved BSA Based on Optimal Learning Strategy (OBSA). Another shortage of BSA is that it does not, as DE does, acquire enough experience by taking full advantage of the best current individual. Besides, it is important for evolutionary algorithms to find a balance between global search and local search. It can be seen that BSA, focused on global optimization with (2), displays a poor ability of local search because it ignores the importance of the optimum individual. The optimum individual in the current population is a very meaningful source that can be used to improve local search. In this regard, inspired by literature [5] and literature [21], this article proposes the following search equation:

$V_{i,j} = x_{\mathrm{best},j} + m \ast \mathrm{Pop}_{i,j}$ (8)

where $m = 3 \ast \mathrm{randn}$, $x_{\mathrm{best},j}$ is the optimum solution in the current population, and randn follows the standard normal distribution. An obvious observation can be made from Figure 1: the new individual generated by (8) is random enough for local search, with a high probability that the new individual will appear around the best individual. Namely, the evolutionary equation described by (8) is good at local search but poor at global search.
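A possible reading of (8) in code is the following (Python/NumPy); whether $m$ is drawn once per individual or once per component is not specified in the article, so the per-component choice here is an assumption, as is the helper name:

```python
import numpy as np

def optimal_learning(pop, worse_idx, x_best, rng=None):
    """Eq. (8): V[i, j] = x_best[j] + m * Pop[i, j], with m = 3 * randn,
    applied only to the disadvantaged individuals listed in worse_idx."""
    rng = np.random.default_rng() if rng is None else rng
    V = pop.copy()
    for i in worse_idx:
        m = 3.0 * rng.standard_normal(pop.shape[1])   # assumed per-component draw
        V[i] = x_best + m * pop[i]
    return V
```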

3.3. Improved BSA Based on Population Control Factor and Optimal Learning Strategy (COBSA). With the purpose of solving the shortages of slow convergence speed in the early stage and low convergence precision in the late stage shown by BSA, an improved BSA based on the population control factor and the optimal learning strategy, combining Sections 3.1 and 3.2, is presented in this article. The improved BSA, named COBSA, produces the new population by employing (7), and the optimal learning strategy based on (8) is applied to the disadvantaged group that behaves worse than the previous population. The specific steps of COBSA are given as follows (a code sketch is given after the step list).

Step 1. Initialize the population Pop and the historical population Oldpop.

Step 2. Generate two numbers $a$ and $b$ randomly; if $a < b$, set Oldpop = Pop, and permute the order of Oldpop randomly afterwards.

Step 3. Produce a new population on the basis of the current population by employing (7).

Step 4. Generate the crossover matrix map by using (4).

Step 5. Apply the crossover strategy through (3).

Step 6. Reinitialize the individuals that are out of the boundary.

Step 7. Evaluate the population, and guide the disadvantaged group to perform optimal learning by employing (8).

Step 8. Select the optimal population according to the greedy selection mechanism.

Step 9. Judge whether the cycle meets the target; if not, return to Step 2.

Step 10. Output the optimal solution.
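Under the assumptions noted in the earlier snippets (the linear $w$ schedule with end points 0.9 and 0.4, the per-component draws of $m$, and the Sphere function used purely as an illustrative objective), Steps 1-10 might be organized as in the following Python/NumPy sketch; it is not the authors' reference implementation:

```python
import numpy as np

def sphere(x):
    # Illustrative objective (f1 in Table 1); any f evaluated row-wise works.
    return np.sum(x ** 2, axis=-1)

def cobsa(fn, N=30, D=30, low=-100.0, up=100.0, t_max=1000,
          w_max=0.9, w_min=0.4, q=0.3, mixrate=1.0, seed=None):
    rng = np.random.default_rng(seed)
    low = np.full(D, low, dtype=float)
    up = np.full(D, up, dtype=float)

    # Step 1: initialize Pop and the historical population Oldpop with Eq. (1).
    pop = low + (up - low) * rng.random((N, D))
    oldpop = low + (up - low) * rng.random((N, D))
    fit = fn(pop)

    for t in range(1, t_max + 1):
        # Step 2: redefine the historical population and shuffle its order.
        if rng.random() < rng.random():
            oldpop = pop.copy()
        oldpop = oldpop[rng.permutation(N)]

        # Step 3: mutation with the population control factor c = m * w, Eq. (7).
        w = w_max - (w_max - w_min) * t / t_max
        m = rng.normal(1.0, np.sqrt(q), size=(N, D))
        F = 3.0 * rng.random((N, 1))
        M = m * w * pop + F * (oldpop - pop)

        # Steps 4-5: crossover map of Eq. (4) and crossover of Eq. (3).
        mapm = np.ones((N, D), dtype=bool)
        if rng.random() < rng.random():
            for i in range(N):
                k = max(1, int(np.ceil(mixrate * rng.random() * D)))
                mapm[i, rng.permutation(D)[:k]] = False
        else:
            mapm[np.arange(N), rng.integers(0, D, size=N)] = False
        T = np.where(mapm, M, pop)

        # Step 6: re-initialize components that left the search space.
        out = (T < low) | (T > up)
        T[out] = (low + (up - low) * rng.random((N, D)))[out]

        # Step 7: evaluate; disadvantaged individuals learn from the best, Eq. (8).
        new_fit = fn(T)
        x_best = pop[np.argmin(fit)]
        worse = new_fit > fit
        if np.any(worse):
            T[worse] = x_best + 3.0 * rng.standard_normal((int(worse.sum()), D)) * pop[worse]
            new_fit[worse] = fn(T[worse])

        # Step 8: greedy selection between trial and current individuals.
        better = new_fit < fit
        pop[better] = T[better]
        fit[better] = new_fit[better]
        # Step 9 is the loop condition; Step 10 returns the optimum below.

    best = int(np.argmin(fit))
    return pop[best], fit[best]
```

Calling `cobsa(sphere)` mirrors the experimental setting of Section 4 (population size 30, 1000 cycles).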

4. Experiment and Result Analysis

In order to verify the effectiveness of the improved algorithm proposed in this article, COBSA is applied to minimize a set of 18 benchmark functions commonly used for testing optimization algorithms. Table 1 presents in detail the function form, search space, and function dimension used for the evolutionary algorithms in the tests. The 18 benchmark functions have different characters: $f_1$-$f_5$, $f_{11}$-$f_{13}$, and $f_{17}$ are unimodal and continuous functions; $f_6$-$f_8$, $f_{10}$, $f_{14}$, $f_{16}$, and $f_{18}$ are multimodal, and the number of local minima increases obviously as the dimension $D$ gets larger; $f_9$ is the Rosenbrock function, which is unimodal when $D = 2$ and 3, but numerous local minima arise when the problem dimension is high [22]; $f_{15}$ is the three-hump camel function with $D = 2$, which has three local minima. This article adjusts the form of some functions for convenience of comparison, since different versions of these functions appear in different literatures.

Inspired by literatures [23-25], the following methods are designed to evaluate the performance of COBSA: (1) convergence performance in fixed iteration; (2) iteration performance in fixed precision; (3) comparison algorithms employed to compare with COBSA; (4) a statistical test based on the Wilcoxon Signed-Rank Test used to analyse the properties of COBSA and the comparison algorithms.

The parameters of the algorithms used in evaluation methods (1), (2), (3), and (4) are listed as follows: the population size is 30, the maximum number of cycles for each benchmark function is 1000, and the final result is the average value over 20 independent runs.

4.1. Convergence Performance in Fixed Iteration. In this section, in order to analyse how much the population control factor and the optimal learning strategy contribute to improving the performance of BSA, this article compares the convergence speed and convergence precision of BSA, CBSA, OBSA, and COBSA on the 18 objective functions when the maximum cycle is 1000. The fitness of each function is plotted on a logarithm-of-10 scale to display the differences among the different strategies more clearly.

Important observations on the convergence speed and convergence precision of the different algorithms can be obtained from the results displayed in Figure 2. CBSA shows better convergence speed than BSA under the same conditions, which powerfully implies the efficiency of evolution equation (7). It can also be seen from Figure 2 that OBSA has better convergence precision than BSA in the late stage, which shows the correctness of search equation (8) proposed in Section 3.2. Meanwhile, we can see that COBSA, combining CBSA and OBSA and displaying faster speed and higher precision in Figure 2, has a more obvious influence on the performance of BSA when compared with BSA, CBSA, and OBSA.

4.2. Iteration Performance in Fixed Precision. In this section, the performance of each algorithm is compared based on the iteration numbers needed to reach a limited evolution precision. All the algorithms have been run 20 times independently on the 18 objective functions. For each function, the current cycle stops either when the precision is less than the limitation given in Table 2 or when the maximum cycle is reached. The results for BSA, CBSA, OBSA, and COBSA are displayed in Table 2 in terms of min, max, mean, SR, and EI. The specific expression of each parameter is as follows: the success rate (SR) = number of runs achieving the target precision ÷ total number of experiments; the expected iterations (EI) = number of particles × average iteration number ÷ success rate. "∞" appearing in Table 2 indicates that the expected iterations are infinite.
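For instance, for CBSA on $f_1$ in Table 2 the mean iteration number is 877 with SR = 1, so EI = 30 × 877 / 1 = 26310.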

An interesting result is that BSA does not meet the target precision for any function except Sumpower $f_3$, and the SR of BSA is just 0.15 on Sumpower $f_3$, whereas that of CBSA, OBSA, and COBSA is 1. Besides, for the other functions, a higher SR and fewer EI are displayed when CBSA and OBSA are compared with BSA, while COBSA has a 100% success rate on all functions with the fewest expected iterations. The iteration performance in fixed precision in this section also proves the superiority of COBSA.

4.3. Experiment Based on Comparison Algorithms. In this section, some relatively new evolutionary algorithms developed in recent decades are employed to compare with COBSA: ABC (limit = 100), MABC (limit = 100), BA ($A = 0.5$, $r = 0.5$, $f_{\max} = 2$, $f_{\min} = 0$, $\alpha = 0.9$, and $\gamma = 0.9$), DSA, and BSA are included. The results are judged in terms of the mean and standard deviation (Std) of the solutions obtained from 20 independent tests.

As can be seen from Table 3, COBSA offers a higher convergence precision on almost all functions, while the standard deviation of COBSA is also much smaller than that of the comparison algorithms.


Table 1: Test functions.

Function | Function expression | Dim | Search space
Sphere $f_1$ | $f(x) = \sum_{i=1}^{n} x_i^2$ | 30 | [-100, 100]
Sumsquare $f_2$ | $f(x) = \sum_{i=1}^{n} i x_i^2$ | 60 | [-10, 10]
Sumpower $f_3$ | $f(x) = \sum_{i=1}^{n} |x_i|^{i+1}$ | 60 | [-1, 1]
Schwefel 2.22 $f_4$ | $f(x) = \sum_{i=1}^{n} |x_i| + \prod_{i=1}^{n} |x_i|$ | 100 | [-10, 10]
Exponential $f_5$ | $f(x) = \exp(0.5 \sum_{i=1}^{n} x_i) - 1$ | 60 | [-1.28, 1.28]
Alpine $f_6$ | $f(x) = \sum_{i=1}^{n} |x_i \sin(x_i) + 0.1 x_i|$ | 60 | [-10, 10]
Rastrigin $f_7$ | $f(x) = \sum_{i=1}^{n} (x_i^2 - 10\cos 2\pi x_i + 10)$ | 100 | [-100, 100]
Griewank $f_8$ | $f(x) = \frac{1}{4000}\sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n} \cos(x_i/\sqrt{i}) + 1$ | 60 | [-600, 600]
Rosenbrock $f_9$ | $f(x) = \sum_{i=1}^{n-1} [100(x_i^2 - x_{i+1})^2 + (x_i - 1)^2]$ | 30 | [-30, 30]
Ackley $f_{10}$ | $f(x) = -20\exp(-0.2\sqrt{\frac{1}{n}\sum_{i=1}^{n} x_i^2}) - \exp(\frac{1}{n}\sum_{i=1}^{n}\cos(2\pi x_i)) + 20 + e$ | 60 | [-32, 32]
R-H-Ellipsoid $f_{11}$ | $f(x) = \sum_{i=1}^{n}\sum_{j=1}^{i} x_j^2$ | 80 | [-100, 100]
Schwefel 1.2 $f_{12}$ | $f(x) = \sum_{i=1}^{n} (\sum_{j=1}^{i} x_j)^2$ | 10 | [-30, 30]
Elliptic $f_{13}$ | $f(x) = \sum_{i=1}^{n} (10^6)^{(i-1)/(n-1)} x_i^2$ | 30 | [-100, 100]
NCRastrigin $f_{14}$ | $f(x) = \sum_{i=1}^{n} (y_i^2 - 10\cos 2\pi y_i + 10)$, with $y_i = x_i$ if $|x_i| < 1/2$ and $y_i = \mathrm{round}(2x_i)/2$ if $|x_i| \ge 1/2$ | 50 | [-5.12, 5.12]
T-H camel $f_{15}$ | $f(x) = 2x_1^2 - 1.05x_1^4 + x_1^6/6 + x_1 x_2 + x_2^2$ | 2 | [-5, 5]
Schaffer $f_{16}$ | $f(x) = (\sum_{i=1}^{n} x_i^2)^{0.25} (\sin^2(50(\sum_{i=1}^{n} x_i^2)^{0.1}) + 1)$ | 10 | [-100, 100]
Powell $f_{17}$ | $f(x) = \sum_{i=1}^{n/4} [(x_{4i-3} + 10x_{4i-2})^2 + 5(x_{4i-1} - x_{4i})^2 + (x_{4i-2} - 2x_{4i-1})^4 + 10(x_{4i-3} + x_{4i})^4]$ | 70 | [-4, 5]
Weierstrass $f_{18}$ | $f(x) = \sum_{i=1}^{n} (\sum_{k=0}^{k_{\max}} [a^k \cos(2\pi b^k (x_i + 0.5))]) - n\sum_{k=0}^{k_{\max}} [a^k \cos(2\pi b^k \cdot 0.5)]$, $a = 0.5$, $b = 3$, $k_{\max} = 20$ | 60 | [-0.5, 0.5]

In particular, COBSA can find the optimal solution when facing Rastrigin $f_7$, Griewank $f_8$, NCRastrigin $f_{14}$, and Weierstrass $f_{18}$. However, in the case of Rosenbrock $f_9$, the convergence precision of COBSA is worse than that of ABC and MABC; since ABC is just one order of magnitude better than COBSA on Rosenbrock $f_9$, the superiority of ABC and MABC is not very obvious. Besides, COBSA displays a more stable search process than ABC and MABC owing to a lower standard deviation on Rosenbrock $f_9$.

4.4. Experiment Based on the Wilcoxon Signed-Rank Test. A comparative method based on statistical analysis using the Wilcoxon Signed-Rank Test is employed to evaluate the performance of COBSA and the comparison algorithms.

Table 4 displays the statistical results using the average values over 20 independent runs of COBSA and the comparison algorithms on the objective functions in Table 1. The results show that COBSA, displaying a better statistical property, successfully outperforms all of the comparison algorithms under the Wilcoxon Signed-Rank Test with a statistical significance level $\alpha = 0.05$.


Figure 2: Convergence performance in fixed iteration. The subplots show fitness (log10 scale) versus generation (0-1000) for BSA, CBSA, OBSA, and COBSA on Sphere $f_1$, Sumsquare $f_2$, Sumpower $f_3$, Schwefel 2.22 $f_4$, Exponential $f_5$, Alpine $f_6$, Rastrigin $f_7$, Griewank $f_8$, Rosenbrock $f_9$, Ackley $f_{10}$, R-H-Ellipsoid $f_{11}$, Schwefel 1.2 $f_{12}$, Elliptic $f_{13}$, NCRastrigin $f_{14}$, T-H camel $f_{15}$, Schaffer $f_{16}$, Powell $f_{17}$, and Weierstrass $f_{18}$.


Table 2: Iteration performance in fixed precision (iterations min/max/mean, success rate SR, and expected iterations EI).

$f_1$ (limit 1e-24): BSA 1000/1000/1000, SR 0, EI ∞; CBSA 845/925/877, SR 1, EI 26310; OBSA 870/1000/974, SR 0.3, EI 97400; COBSA 635/734/692, SR 1, EI 20760
$f_2$ (limit 1e-25): BSA 1000/1000/1000, SR 0, EI ∞; CBSA 938/1000/971, SR 0.95, EI 30663; OBSA 720/1000/957, SR 0.35, EI 82028; COBSA 664/802/742, SR 1, EI 22260
$f_3$ (limit 1e-20): BSA 927/1000/993, SR 0.15, EI 198600; CBSA 334/442/381, SR 1, EI 11430; OBSA 211/353/288, SR 1, EI 8640; COBSA 159/259/223, SR 1, EI 6690
$f_4$ (limit 1e-17): BSA 1000/1000/1000, SR 0, EI ∞; CBSA 844/981/877, SR 1, EI 26310; OBSA 791/1000/961, SR 0.35, EI 82371; COBSA 688/851/747, SR 1, EI 22410
$f_5$ (limit 1e-16): BSA 1000/1000/1000, SR 0, EI ∞; CBSA 757/837/791, SR 1, EI 23730; OBSA 725/1000/952, SR 0.5, EI 57120; COBSA 613/755/675, SR 1, EI 20250
$f_6$ (limit 1e-18): BSA 1000/1000/1000, SR 0, EI ∞; CBSA 913/1000/954, SR 0.9, EI 31800; OBSA 951/1000/993, SR 0.2, EI 148950; COBSA 711/830/784, SR 1, EI 23520
$f_7$ (limit 1e-5): BSA 1000/1000/1000, SR 0, EI ∞; CBSA 561/646/599, SR 1, EI 17970; OBSA 250/541/453, SR 1, EI 13590; COBSA 354/488/422, SR 1, EI 12660
$f_8$ (limit 1e-7): BSA 1000/1000/1000, SR 0, EI ∞; CBSA 477/558/529, SR 1, EI 15870; OBSA 329/615/465, SR 1, EI 13680; COBSA 247/469/411, SR 1, EI 12330
$f_9$ (limit 30): BSA 1000/1000/1000, SR 0, EI ∞; CBSA 215/298/267, SR 1, EI 8010; OBSA 163/349/244, SR 1, EI 7320; COBSA 141/241/189, SR 1, EI 5670
$f_{10}$ (limit 1e-14): BSA 1000/1000/1000, SR 0, EI ∞; CBSA 958/1000/979, SR 0.45, EI 65266; OBSA 830/1000/989, SR 0.3, EI 98900; COBSA 687/860/784, SR 1, EI 23520
$f_{11}$ (limit 1e-21): BSA 1000/1000/1000, SR 0, EI ∞; CBSA 873/947/914, SR 1, EI 27420; OBSA 693/1000/927, SR 0.9, EI 30900; COBSA 629/816/710, SR 1, EI 21300
$f_{12}$ (limit 1e-14): BSA 1000/1000/1000, SR 0, EI ∞; CBSA 892/1000/986, SR 0.35, EI 84514; OBSA 766/1000/883, SR 0.9, EI 29433; COBSA 532/710/624, SR 1, EI 18720
$f_{13}$ (limit 1e-07): BSA 1000/1000/1000, SR 0, EI ∞; CBSA 630/700/667, SR 1, EI 20010; OBSA 505/1000/821, SR 0.85, EI 28976; COBSA 524/611/566, SR 1, EI 16980
$f_{14}$ (limit 1e-05): BSA 1000/1000/1000, SR 0, EI ∞; CBSA 590/790/679, SR 1, EI 20370; OBSA 286/960/608, SR 1, EI 18240; COBSA 424/556/492, SR 1, EI 14760
$f_{15}$ (limit 1e-80): BSA 1000/1000/1000, SR 0, EI ∞; CBSA 869/1000/973, SR 0.6, EI 48650; OBSA 803/1000/991, SR 0.45, EI 66066; COBSA 763/893/786, SR 1, EI 10560
$f_{16}$ (limit 0.01): BSA 1000/1000/1000, SR 0, EI ∞; CBSA 800/1000/948, SR 0.7, EI 40628; OBSA 596/1000/831, SR 0.8, EI 31162; COBSA 475/667/577, SR 1, EI 17310
$f_{17}$ (limit 1e-20): BSA 1000/1000/1000, SR 0, EI ∞; CBSA 956/1000/990, SR 0.5, EI 59400; OBSA 658/1000/854, SR 0.75, EI 34160; COBSA 545/761/689, SR 1, EI 20670
$f_{18}$ (limit 0.01): BSA 1000/1000/1000, SR 0, EI ∞; CBSA 256/323/294, SR 1, EI 8820; OBSA 172/1000/496, SR 0.85, EI 17505; COBSA 218/327/260, SR 1, EI 7800


According to the above analyses in Sections 4.1, 4.2, 4.3, and 4.4, the conclusion can safely be summarized as follows: COBSA can improve the performance of BSA effectively, and it is a feasible computation method.

5. The Choice of Variance q

In Section 3.1, the variance $q$ of matrix $m$ plays an important role in the performance of CBSA. When $q$ takes 0, the matrix $m$ loses its role. When $q$ increases from 0 to 1, the fluctuation applied to the population Pop also increases correspondingly. However, a lower convergence speed results from an overly strong fluctuation of the population Pop when $q$ is large. Therefore, four objective functions, Sumsquare $f_2$, Schwefel 2.22 $f_4$, Alpine $f_6$, and Ackley $f_{10}$, are employed to analyse the effect of the variance $q$. The fitness of each function takes the logarithm of 10 so that the curves can be observed clearly.

The result is better when the fitness of a function is smaller, owing to the minimization process of these functions. We can see from Figure 3 that the variance $q$ affects the result obviously. CBSA performs better on Sumsquare $f_2$ and Alpine $f_6$ when $q = 0.3$; when $q$ takes 0.4, the fitness of Schwefel 2.22 $f_4$ is better; for Ackley $f_{10}$, a lower fitness is obtained when $q$ is around 0.3. Therefore, the variance $q$ is set to 0.3 in this article.

Besides, we can also observe from Figure 3 that the influence of matrix $m$ cannot be ignored. When the variance $q$ is 0, every element in matrix $m$ is 1, and as a result the matrix $m$ loses its role. As can be seen from Figure 3, a better performance is displayed when $q$ is around 0.3 rather than 0, which proves the significance of matrix $m$.

6. Conclusion

In order to solve the problems of slow convergence speed and low convergence precision shown by BSA, this article proposes an improved algorithm named COBSA, obtained by employing a population control factor in the variation equation and by helping the disadvantaged group perform optimal learning. The experiments based on 18 benchmark functions prove that COBSA is an effective evolutionary algorithm and a feasible solution to improve the convergence speed and convergence precision of BSA.

However, COBSA displays a lower convergence precision on the function Rosenbrock $f_9$ when compared with ABC and MABC. How to make the algorithm display better performance on functions such as Rosenbrock $f_9$ is the next step of our work. Besides, it is worth applying the algorithm proposed in this article to blind source separation and hyperspectral image processing.


Table 3: Experiment based on comparison algorithms. For each of the benchmark functions $f_1$-$f_{18}$, the table reports the mean and standard deviation (Std) of the solutions obtained over 20 independent runs by ABC, MABC, DSA, BSA, BA, and COBSA.


Table 4: Statistical test based on the Wilcoxon Signed-Rank Test ($\alpha = 0.05$).

Algorithm versus COBSA | $p$ value | $R^+$ | $R^-$ | Winner
ABC versus COBSA | 2.50e-03 | 16 | 155 | COBSA
MABC versus COBSA | 2.10e-03 | 15 | 156 | COBSA
DSA versus COBSA | 1.96e-04 | 0 | 171 | COBSA
BSA versus COBSA | 1.96e-04 | 0 | 171 | COBSA
BA versus COBSA | 1.96e-04 | 0 | 171 | COBSA

Figure 3: The influence of $q$. The panels show fitness (log10 scale) versus $q$ (0.1-0.9) for Sumsquare $f_2$, Schwefel 2.22 $f_4$, Alpine $f_6$, and Ackley $f_{10}$.


Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

The work is supported by the National Nature Science Foundation of China (no. 61401307), the China Postdoctoral Science Foundation (no. 2014M561184), the Tianjin Research Program of Application Foundation and Advanced Technology of China (no. 15JCYBJC17100), and the Tianjin Research Program of Science and Technology Commissioner of China (16JCTPJC48400).

References

[1] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942-1948, Perth, Australia, December 1995.

[2] M. Dorigo and T. Stutzle, Ant Colony Optimization, MIT Press, Cambridge, MA, USA, 2004.

[3] J. H. Holland, Adaptation in Natural and Artificial Systems, MIT Press, Cambridge, MA, USA, 1992.

[4] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281-295, 2006.

[5] K. V. Price, R. M. Storn, and J. A. Lampinen, Differential Evolution: A Practical Approach to Global Optimization, Springer, Berlin, Germany, 2005.

[6] J. Zhang and A. C. Sanderson, "JADE: adaptive differential evolution with optional external archive," IEEE Transactions on Evolutionary Computation, vol. 13, no. 5, pp. 945-958, 2009.

[7] P. Civicioglu, "Transforming geocentric cartesian coordinates to geodetic coordinates by using differential search algorithm," Computers and Geosciences, vol. 46, pp. 229-247, 2012.

[8] D. Karaboga and B. Akay, "A comparative study of artificial bee colony algorithm," Applied Mathematics and Computation, vol. 214, no. 1, pp. 108-132, 2009.

[9] X. S. Yang and S. Deb, "Cuckoo search via Levy flights," in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NABIC '09), pp. 210-214, Coimbatore, India, December 2009.

[10] X. S. Yang, "A new metaheuristic bat-inspired algorithm," in Nature Inspired Cooperative Strategies for Optimization (NICSO '10), J. R. Gonzalez, D. A. Pelta, C. Cruz, G. Terrazas, and N. Krasnogor, Eds., vol. 284 of Studies in Computational Intelligence, pp. 65-74, Springer, Berlin, Germany, 2010.

[11] L. Chen and L. Y. Zhang, "Blind signal separation algorithm based on temporal predictability and differential search algorithm," Journal on Communications, vol. 35, no. 6, pp. 117-125, 2014.

[12] Z. C. Jia, Y. Y. Xue, L. Chen, Y. J. Guo, and H. D. Xu, "Blind separation algorithm for hyperspectral image based on the denoising reduction and the bat optimization," Acta Photonica Sinica, vol. 45, no. 5, Article ID 0511001, 2016.

[13] J. H. Liang and M. Ma, "Artificial bee colony algorithm based research on image segmentation," Computer Engineering and Application, vol. 48, no. 8, pp. 194-197, 2012.

[14] P. Civicioglu, "Backtracking search optimization algorithm for numerical optimization problems," Applied Mathematics and Computation, vol. 219, no. 15, pp. 8121-8144, 2013.

[15] Z. L. Zhang and S. Y. Liu, "Decision-theoretic rough set attribute reduction based on backtracking search algorithm," Computer Engineering and Applications, vol. 52, no. 10, pp. 71-74, 2016.

[16] X. Chen, S. Y. Liu, and Y. Wang, "Emergency resources scheduling based on improved backtracking search optimization algorithm," Computer Applications and Software, vol. 32, no. 12, pp. 235-238, 2015.

[17] M. D. Li and H. Zhao, "Backtracking search optimization algorithm with comprehensive learning strategy," System Engineering and Electronics, vol. 37, no. 4, pp. 958-963, 2015.

[18] X. J. Wang, S. Y. Liu, and W. K. Tian, "Improved backtracking search optimization algorithm with new effective mutation scale factor and greedy crossover strategy," Journal of Computer Applications, vol. 34, no. 9, pp. 2543-2546, 2014.

[19] W. K. Tian, S. Y. Liu, and X. J. Wang, "Study and improvement of backtracking search optimization algorithm based on differential evolution," Application Research of Computers, vol. 32, no. 6, pp. 1653-1662, 2015.

[20] Y. Shi and R. Eberhart, "A modified particle swarm optimizer," in Proceedings of the IEEE International Conference on Evolutionary Computation and IEEE World Congress on Computational Intelligence, pp. 69-73, Anchorage, Alaska, USA, May 1998.

[21] W. F. Gao and S. Y. Liu, "A modified artificial bee colony algorithm," Computers & Operations Research, vol. 39, no. 3, pp. 687-697, 2012.

[22] Y. W. Shang and Y. H. Qiu, "A note on the extended Rosenbrock function," Evolutionary Computation, vol. 14, no. 1, pp. 119-126, 2006.

[23] W. Hu and Z. S. Li, "A simpler and more effective particle swarm optimization algorithm," Journal of Software, vol. 18, no. 4, pp. 861-868, 2007.

[24] J. Derrac, S. Garcia, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3-18, 2011.

[25] S. Garcia, D. Molina, M. Lozano, and F. Herrera, "A study on the use of non-parametric tests for analyzing the evolutionary algorithms' behaviour: a case study on the CEC'2005 Special Session on Real Parameter Optimization," Journal of Heuristics, vol. 15, no. 6, pp. 617-644, 2009.


Page 2: Improved Backtracking Search Algorithm Based on ...downloads.hindawi.com/journals/mpe/2017/3017608.pdfMathematicalProblemsinEngineering 3 3. The Improved Backtracking Search Algorithm

2 Mathematical Problems in Engineering

is typically slower than these typical evolutionary algorithms(DE and PSO) in the early stage as a result of ignoringthe importance of optimal individual that makes BSA dif-ficult to achieve the satisfactory outcome when handlingsome complex problemsTherefore accelerating convergencespeed and improving convergence precision have become tworelatively valuable and significative targets in BSArsquos study Anumber of modified BSA aiming to reach above two targetshave hence been put forward since its appearance Forexample literature [17] focuses on improving convergenceprecision by proposing the optimal evolution equation andoptimal search equation but brings extra evaluations Lit-eratures [18 19] focus on convergence precision and globeconvergence by employing variation coefficient and selectionmechanism but sacrifice iteration numbers as a result Animproved BSA is proposed in this article following thepurpose of accelerating convergence speed and improvingconvergence precision when iteration numbers are lowerFirstly to accelerate the convergence speed of populationin the early stage population control factor is added to theevolution equation What is more an evolutionary equationbased on the fact that the disadvantaged groupwill search justaround the best individual chosen from previous iterationis proposed to enhance the ability of local search Simula-tion experiments based on 18 benchmark functions showthat the improved algorithm can improve performance andproductivity of BSA effectively and is a feasible evolutionaryalgorithm

The rest of this article is arranged as follows This articlewill describe BSA briefly in the second section the improvedalgorithmwill be introduced in the third section in the fourthsection the simulation results will be presented the fifthsection is the discussion about the variance 119902 appearing in thefirst strategy in the last this article will draw a conclusion

2 Backtracking Search Algorithm

Similar to other evolutionary algorithms BSA is also apopulation-based evolutionary algorithm designed to finda global optimum the solving process can be described bydividing its operation into five steps as is done in otherevolutionary algorithms population initialization histor-ical population setting population evolution populationcrossover and greedy selection Specific steps of the basicBSA can be described as follows

21 Population Initialization 119873 numbers of randomlyemerged 119863-dimensional vectors make up the initial solutionof BSA And the random initialization method can besummarized as follows

Pop119894119895 sim 119880 (low119895 up119895) (1)

where 119894 isin [1 2 3 119873] and 119895 isin [1 2 3 119863] low andup stand for the lower and upper bound of search spacerespectively 119880 is the uniform distribution function

22 Historical Population Setting BSA chooses the previouspopulation randomly by appointing the Oldpop as history

population employed with the intent of controlling the searchspace and remembers the position until the Oldpop getschanged What makes BSA have memory function is twonumbers 119886 and 119887 generated randomly when the cycle startsIf 119886 lt 119887 then Oldpop = Pop meanwhile the order of theindividual in Oldpop will be arrayed randomly

23 Population Evolution At this stage BSAwill generate thenew population making full use of its previous experimentbased on Pop and Oldpop evolutionary equation is asfollows

119872 = Pop + 119865 lowast (Oldpop minus Pop) (2)

where 119865119894 = 3 lowast rand 119894 isin [1 2 3 119873] and rand is a randomnumber in the rang [0 1] 119865 is a parameter controlling rangeof the search direction matrix (Oldpop minus Pop)24 Population Crossover The new population crossoverstrategy different from DE shown in BSA is that the mixedproportion parameter is employed to control the numbers ofcross particle Equation is given as follows

119879119894119895 = 119872119894119895 map119894119895 = 1Pop119894119895 map119894119895 = 0 (3)

where map is119873times119863matrix which only contains 0 and 1 andits initial value is 1 Its specific calculation formula is given asfollows

map119894119906(1[119898119903lowastrandlowast119863]) = 0 119886 lt 119887map119894randi(119863) = 0 119886 ge 119887 (4)

where randi is an integer selected from 0 to 119863 randomly119898119903is a mixed proportion parameter and119898119903 = 1 rand is chosenrandomly in the range [0 1] and so do 119886 and 119887 119906 is an integervector sorted randomly and 119906 isin [1 2 3 119863] The map inthe crossover stage controls the numbers of individual thatwill mutate by using [119898119903lowastrandlowast119863] and randi(119863) When 119886 lt119887 map119894 is a two-element vector which containsmany randompositions otherwise a vector just containing an element ldquo0rdquowill be shown in the map119894 Some individuals generated inthe crossover stage can overflow the limited search space as aresult of crossover strategy For this a boundary control to theindividuals beyond the search space will be conducted using(1) again

25 Greedy Selection When BSA finishes above steps thenew population 119879119894 that have better position than originalpopulation Pop119894 are used to update the Pop119894 based on greedyselection and become new candidate solution Meanwhileif the best individual of new population has a lower fitnessthan optimumobtained from previous iterations the originalone will be replaced to be the new optimum Pop119894 andoptimum will remain when conditions mentioned above arenot satisfied

Mathematical Problems in Engineering 3

3 The Improved BacktrackingSearch Algorithm

In the above part BSA is introduced briefly and it can beknown easily that BSA equipping a single control parametera trial population (Oldpop) and a simple structure hasa good ability to explore search space in the early stageHowever it is a common phenomenon for evolutionaryalgorithms falling into local optimum easily when facing themultimodal and nonlinear problems and so does BSA Inthis regard an improved BSA based on population controlfactor and optimal learning strategy enlightened by PSO andDE respectively is presented in this article On the one handpopulation control factor that regulates search direction ofpopulation is employed when BSA operates on the otherhand disadvantaged group is guided to have optimal learningwith the help of the optimal individual of the previousiteration Specific improvements are as follows

31 Improved BSA Based on Population Control Factor(CBSA) Shortage shown in the BSA is that the Pop in thevariation equation (2) has stochastic character which makesBSA lack memory about position of population when thetrend of expanding search space and exploring the newsearch area follows Namely Pop has the trend of globaloptimizationTherefore we can take global search firstly andlocal fine search later into account Enlightened by literature[20] inertia weight is employed to the evolutionary equation(2) and the new evolutionary equation is as follows

119872 = 119908 lowast Pop + 119865 lowast (Oldpop minus Pop) (5)

where 119908 is a variable parameter changing with the increaseof iteration numbers linearly So the target we just referredabove can be reached on condition that the algorithm hasa stronger ability of global search when 119908 is larger and thealgorithm tends to local search when 119908 is smaller Thuswe can employ a global coefficient adaptively to improvethe search accuracy and search efficiency when facing somecomplex optimization problems But every coin has twosides This strategy cannot reflect the actual optimizationsearch process when the algorithm we just proposed facessome nonlinear problems as a result of linearity shown in 119908so this article presents the following search equation again

119872 = 119898 lowast 119908 lowast Pop + 119865 lowast (Oldpop minus Pop) (6)

where 119898 is a 119873 times 119863 matrix and the elements in matrix 119898fit normal distribution following the condition that mean is119901 = 1 while variance is 119902 = 03 Thus a disturbance willbe produced to (119908 lowast Pop) to help algorithm escape the localoptimumwhen algorithm handles some nonlinear problemsIn the last the final evolution equation is summarized as

119872 = 119888 lowast Pop + 119865 lowast (Oldpop minus Pop) (7)

where 119888 = 119898 lowast 119908 named population control factorThe above evolutionary equation can not only improve theconvergence speed with the help of 119908 but also enhance theability of escaping the local optimum with the help of 119898

Figure 1: Standard normal distribution. A new individual will appear near the optimal solution with high probability and will be far away from the optimal solution with small probability.

Simulation experiments strongly confirm that the improved algorithm performs well in terms of convergence speed.

3.2. Improved BSA Based on Optimal Learning Strategy (OBSA). Another shortage of BSA is that, unlike DE, it does not take full advantage of the experience of the best current individual. Besides, it is important for evolutionary algorithms to find a balance between global search and local search. It can be seen that BSA, focused on global optimization with (2), displays a poor ability of local search because it ignores the importance of the optimum individual. The optimum individual in the current population is a very meaningful source that can be used to improve local search. In this regard, enlightened by literature [5] and literature [21], this article proposes the following search equation:

v_{i,j} = x_{best,j} + m · Pop_{i,j},  (8)

where m = 3 · randn, x_{best,j} is the optimum solution in the current population, and randn is a standard normally distributed number. It can be observed from Figure 1 that the new individual generated by (8) is random enough for local search, owing to the high probability that the new individual appears around the best individual. Namely, the evolutionary equation described by (8) is good at local search but poor at global search.
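A minimal sketch of this optimal learning step, applied to the disadvantaged individuals, might look like the following; treating "disadvantaged" as the indices whose trial solutions failed to improve on the previous population is an assumption drawn from the description in Section 3.3:

```python
import numpy as np

def optimal_learning(pop, fitness_vals, disadvantaged_idx):
    """Equation (8): v_ij = x_best_j + m * Pop_ij, with m = 3 * randn per element."""
    best = pop[np.argmin(fitness_vals)]             # best individual of the current population
    for i in disadvantaged_idx:
        m = 3.0 * np.random.randn(pop.shape[1])     # per-dimension random coefficients
        pop[i] = best + m * pop[i]                  # search around the current optimum
    return pop
```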

3.3. Improved BSA Based on Population Control Factor and Optimal Learning Strategy (COBSA). With the purpose of solving the shortcomings of slow convergence speed in the early stage and low convergence precision in the late stage shown in BSA, an improved BSA based on the population control factor and the optimal learning strategy, combining Sections 3.1 and 3.2, is presented in this article. The improved BSA, named COBSA, produces the new population by employing (7), and the optimal learning strategy based on (8) is applied to the disadvantaged group that behaves worse than the previous population. The specific steps of COBSA are given as follows.

Step 1. Initialize the population Pop and Oldpop.


Step 2. Generate two numbers a and b randomly; if a < b, set Oldpop = Pop, and then permute Oldpop randomly.

Step 3. Produce a new population on the basis of the initial population by employing (7).

Step 4. Generate the cross matrix map by using (4).

Step 5. Apply the crossover strategy through (3).

Step 6. Reinitialize any individual that is out of the boundary.

Step 7. Evaluate the population, and guide the disadvantaged group to perform optimal learning by employing (8).

Step 8. Select the optimal population according to the greedy selection mechanism.

Step 9. Judge whether the cycle meets the target; if not, return to Step 2.

Step 10. Output the optimal solution.
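Putting the pieces together, a compact illustrative sketch of the whole COBSA loop might look as follows. It reuses the helper functions sketched above (cobsa_mutation, optimal_learning, greedy_selection are names of our own choosing), and the map-based crossover of equations (3) and (4) is replaced by a simplified Bernoulli mask with an assumed mix-rate, so this should be read as a sketch of the control flow rather than the authors' exact implementation:

```python
import numpy as np

def cobsa(fitness, low, high, n=30, t_max=1000, mixrate=1.0):
    """Illustrative COBSA loop following Steps 1-10 (crossover simplified)."""
    d = low.size
    pop = low + (high - low) * np.random.rand(n, d)               # Step 1
    oldpop = low + (high - low) * np.random.rand(n, d)
    fit = np.apply_along_axis(fitness, 1, pop)
    best = pop[np.argmin(fit)].copy()
    best_fit = fit.min()
    for t in range(t_max):
        if np.random.rand() < np.random.rand():                   # Step 2: if a < b
            oldpop = pop.copy()
        oldpop = oldpop[np.random.permutation(n)]
        mutant = cobsa_mutation(pop, oldpop, t, t_max)            # Step 3: equation (7)
        cross = np.random.rand(n, d) < mixrate * np.random.rand() # Steps 4-5: simplified crossover map
        trial = np.where(cross, mutant, pop)
        out = (trial < low) | (trial > high)                      # Step 6: reset out-of-bounds entries
        rand_pop = low + (high - low) * np.random.rand(n, d)
        trial[out] = rand_pop[out]
        trial_fit = np.apply_along_axis(fitness, 1, trial)        # Step 7: evaluation
        worse = np.where(trial_fit >= fit)[0]                     # disadvantaged group
        if worse.size:
            trial = optimal_learning(trial, trial_fit, worse)     # Step 7: equation (8)
        pop, fit, best, best_fit = greedy_selection(pop, trial, fitness, best, best_fit)  # Steps 8-9
    return best, best_fit                                         # Step 10
```

A call such as cobsa(sphere, np.full(30, -100.0), np.full(30, 100.0)) would then mirror the experimental setting of Section 4 (population size 30, 1000 cycles) for the Sphere function, assuming sphere(x) returns the sum of squares.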

4. Experiment and Result Analysis

In order to verify the effectiveness of the improved algorithm proposed in this article, COBSA is applied to minimize a set of 18 benchmark functions commonly used for optimization algorithms. Table 1 presents in detail the function form, search space, and function dimension used for the evolutionary algorithms in the tests. Some problems will arise when COBSA operates owing to the different characters of the 18 benchmark functions: f1–f5, f11–f13, and f17 are unimodal and continuous functions; f6–f8, f10, f14, f16, and f18 are multimodal, and the number of local minima increases obviously as the dimension D gets larger; f9 is the Rosenbrock function, which is unimodal when D = 2 and 3, but numerous local minima arise when the problem dimension is high [22]; f15 is the three-hump camel function in D = 2, having three local minima. This article adjusts the form of some functions for convenience of comparison, since different versions of these functions appear in different literatures.

Enlightened by literatures [23–25], the following methods are designed to evaluate the performance of COBSA: (1) convergence performance in fixed iteration; (2) iteration performance in fixed precision; (3) comparison algorithms employed to compare with COBSA; (4) a statistical test based on the Wilcoxon Signed-Rank Test used to analyse the properties of COBSA and the comparison algorithms.

The parameters of the algorithms used in the evaluation methods (1), (2), (3), and (4) are listed as follows: the population size is 30, the maximum number of cycles evaluated for each benchmark function is 1000, and the final result is the average value over 20 independent runs.

4.1. Convergence Performance in Fixed Iteration. In this section, in order to analyse how much the population control factor and the optimal learning strategy contribute to improving the performance of BSA, this article compares the convergence speed and convergence precision of BSA, CBSA, OBSA, and COBSA on the 18 objective functions when the maximum cycle is 1000. The fitness of each function is plotted on a log10 scale to display the differences between the strategies more clearly.

Important observations on the convergence speed and convergence precision of the different algorithms can be obtained from the results displayed in Figure 2. CBSA shows a better convergence speed than BSA under the same conditions, which strongly implies the efficiency of evolution equation (7). It can also be seen from Figure 2 that OBSA has better convergence precision than BSA in the late stage, which shows the correctness of search equation (8) proposed in Section 3.2. Meanwhile, COBSA, combining CBSA and OBSA, displays faster speed and higher precision in Figure 2 and thus has a more obvious influence on the performance of BSA when compared with BSA, CBSA, and OBSA.

4.2. Iteration Performance in Fixed Precision. In this section, the performance of each algorithm is measured by the number of iterations required when a limited evolution precision is employed for comparison. All the algorithms have been run 20 times independently on the 18 objective functions. For each function, the current cycle stops either when the precision is less than the limitation given in Table 2 or when the maximum cycle is reached. The results for BSA, CBSA, OBSA, and COBSA are displayed in Table 2 in terms of min, max, mean, SR, and EI. These parameters are defined as follows: success rate (SR) = number of runs achieving the target precision ÷ total number of runs; expected iterations (EI) = (number of particles × average iteration number) ÷ success rate. The symbol "∞" in Table 2 indicates that the expected iterations are infinite.
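As a quick check of the EI formula, take CBSA on Sphere f1 in Table 2: with 30 particles, a mean of 877 iterations, and SR = 1, the expected iterations are EI = (30 × 877) / 1 = 26310, which matches the corresponding table entry.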

An interesting result is that BSA does not meet the target precision on any function except Sumpower f3, and the SR of BSA is just 0.15 on Sumpower f3 while that of CBSA, OBSA, and COBSA is 1. Besides, for the other functions, a higher SR and a lower EI are displayed when CBSA and OBSA are compared with BSA, while COBSA has a 100% success rate on all functions with the least expected iterations. The iteration performance in fixed precision in this section also proves the superiority of COBSA.

4.3. Experiment Based on Comparison Algorithms. In this section, some relatively new evolutionary algorithms developed in recent decades are employed for comparison with COBSA: ABC (limit = 100), MABC (limit = 100), BA (A = 0.5, r = 0.5, f_max = 2, f_min = 0, α = 0.9, and γ = 0.9), DSA, and BSA are included. The results are judged in terms of the mean and standard deviation (std) of the solutions obtained from 20 independent tests.

As can be seen from Table 3, COBSA offers a higher convergence precision on almost all functions, while the standard deviation of COBSA is also much smaller than that of the comparison algorithms. In particular, COBSA can


Table 1: Test functions.

Function | Function expression | Dim | Search space
Sphere f1 | f(x) = Σ_{i=1}^{n} x_i^2 | 30 | [−100, 100]
Sumsquare f2 | f(x) = Σ_{i=1}^{n} i·x_i^2 | 60 | [−10, 10]
Sumpower f3 | f(x) = Σ_{i=1}^{n} |x_i|^{i+1} | 60 | [−1, 1]
Schwefel 2.22 f4 | f(x) = Σ_{i=1}^{n} |x_i| + Π_{i=1}^{n} |x_i| | 100 | [−10, 10]
Exponential f5 | f(x) = exp(0.5·Σ_{i=1}^{n} x_i) − 1 | 60 | [−1.28, 1.28]
Alpine f6 | f(x) = Σ_{i=1}^{n} |x_i·sin(x_i) + 0.1·x_i| | 60 | [−10, 10]
Rastrigin f7 | f(x) = Σ_{i=1}^{n} (x_i^2 − 10·cos(2πx_i) + 10) | 100 | [−100, 100]
Griewank f8 | f(x) = (1/4000)·Σ_{i=1}^{n} x_i^2 − Π_{i=1}^{n} cos(x_i/√i) + 1 | 60 | [−600, 600]
Rosenbrock f9 | f(x) = Σ_{i=1}^{n−1} [100·(x_i^2 − x_{i+1})^2 + (x_i − 1)^2] | 30 | [−30, 30]
Ackley f10 | f(x) = −20·exp(−0.2·√((1/n)·Σ_{i=1}^{n} x_i^2)) − exp((1/n)·Σ_{i=1}^{n} cos(2πx_i)) + 20 + e | 60 | [−32, 32]
R-H-Ellipsoid f11 | f(x) = Σ_{i=1}^{n} Σ_{j=1}^{i} x_j^2 | 80 | [−100, 100]
Schwefel 1.2 f12 | f(x) = Σ_{i=1}^{n} (Σ_{j=1}^{i} x_j)^2 | 10 | [−30, 30]
Elliptic f13 | f(x) = Σ_{i=1}^{n} (10^6)^{(i−1)/(n−1)}·x_i^2 | 30 | [−100, 100]
NCRastrigin f14 | f(x) = Σ_{i=1}^{n} (y_i^2 − 10·cos(2πy_i) + 10), where y_i = x_i if |x_i| < 1/2 and y_i = round(2x_i)/2 if |x_i| ≥ 1/2 | 50 | [−5.12, 5.12]
T-H camel f15 | f(x) = 2x_1^2 − 1.05x_1^4 + x_1^6/6 + x_1·x_2 + x_2^2 | 2 | [−5, 5]
Schaffer f16 | f(x) = (Σ_{i=1}^{n} x_i^2)^{0.25}·(sin^2(50·(Σ_{i=1}^{n} x_i^2)^{0.1}) + 1) | 10 | [−100, 100]
Powell f17 | f(x) = Σ_{i=1}^{n/4} [(x_{4i−3} + 10x_{4i−2})^2 + 5(x_{4i−1} − x_{4i})^2 + (x_{4i−2} − 2x_{4i−1})^4 + 10(x_{4i−3} + x_{4i})^4] | 70 | [−4, 5]
Weierstrass f18 | f(x) = Σ_{i=1}^{n} (Σ_{k=0}^{kmax} a^k·cos(2πb^k(x_i + 0.5))) − n·Σ_{k=0}^{kmax} a^k·cos(2πb^k·0.5), with a = 0.5, b = 3, kmax = 20 | 60 | [−0.5, 0.5]

find the optimal solution when facing Rastrigin f7, Griewank f8, NCRastrigin f14, and Weierstrass f18. However, in the case of Rosenbrock f9, the convergence precision of COBSA is worse than that of ABC and MABC; since ABC is just one order of magnitude better than COBSA on Rosenbrock f9, the superiority of ABC and MABC is not very obvious. Besides, COBSA displays a more stable search process than ABC and MABC, owing to a lower standard deviation on Rosenbrock f9.

4.4. Experiment Based on Wilcoxon Signed-Rank Test. A comparative method based on statistical analysis using the Wilcoxon Signed-Rank Test is employed to evaluate the performance of COBSA and the comparison algorithms.

Table 4 displays the statistical results using the average values over 20 independent runs of COBSA and the comparison algorithms on the objective functions in Table 1. The results show that COBSA, displaying a better statistical property, successfully outperforms all of the


Figure 2: Convergence performance in fixed iteration. Each panel plots fitness (log10 scale) against generation (0–1000) for BSA, CBSA, OBSA, and COBSA on one benchmark function: Sphere f1, Sumsquare f2, Sumpower f3, Schwefel 2.22 f4, Exponential f5, Alpine f6, Rastrigin f7, Griewank f8, Rosenbrock f9, Ackley f10, R-H-Ellipsoid f11, Schwefel 1.2 f12, Elliptic f13, NCRastrigin f14, T-H camel f15, Schaffer f16, Powell f17, and Weierstrass f18.


Table 2: Iteration performance in fixed precision. For each function and precision limit, the entries give the minimum/maximum/mean number of iterations, the success rate (SR), and the expected iterations (EI).

f1 (1e−24): BSA 1000/1000/1000, SR 0, EI ∞ | CBSA 845/925/877, SR 1, EI 26310 | OBSA 870/1000/974, SR 0.3, EI 97400 | COBSA 635/734/692, SR 1, EI 20760
f2 (1e−25): BSA 1000/1000/1000, SR 0, EI ∞ | CBSA 938/1000/971, SR 0.95, EI 30663 | OBSA 720/1000/957, SR 0.35, EI 82028 | COBSA 664/802/742, SR 1, EI 22260
f3 (1e−20): BSA 927/1000/993, SR 0.15, EI 198600 | CBSA 334/442/381, SR 1, EI 11430 | OBSA 211/353/288, SR 1, EI 8640 | COBSA 159/259/223, SR 1, EI 6690
f4 (1e−17): BSA 1000/1000/1000, SR 0, EI ∞ | CBSA 844/981/877, SR 1, EI 26310 | OBSA 791/1000/961, SR 0.35, EI 82371 | COBSA 688/851/747, SR 1, EI 22410
f5 (1e−16): BSA 1000/1000/1000, SR 0, EI ∞ | CBSA 757/837/791, SR 1, EI 23730 | OBSA 725/1000/952, SR 0.5, EI 57120 | COBSA 613/755/675, SR 1, EI 20250
f6 (1e−18): BSA 1000/1000/1000, SR 0, EI ∞ | CBSA 913/1000/954, SR 0.9, EI 31800 | OBSA 951/1000/993, SR 0.2, EI 148950 | COBSA 711/830/784, SR 1, EI 23520
f7 (1e−5): BSA 1000/1000/1000, SR 0, EI ∞ | CBSA 561/646/599, SR 1, EI 17970 | OBSA 250/541/453, SR 1, EI 13590 | COBSA 354/488/422, SR 1, EI 12660
f8 (1e−7): BSA 1000/1000/1000, SR 0, EI ∞ | CBSA 477/558/529, SR 1, EI 15870 | OBSA 329/615/465, SR 1, EI 13680 | COBSA 247/469/411, SR 1, EI 12330
f9 (30): BSA 1000/1000/1000, SR 0, EI ∞ | CBSA 215/298/267, SR 1, EI 8010 | OBSA 163/349/244, SR 1, EI 7320 | COBSA 141/241/189, SR 1, EI 5670
f10 (1e−14): BSA 1000/1000/1000, SR 0, EI ∞ | CBSA 958/1000/979, SR 0.45, EI 65266 | OBSA 830/1000/989, SR 0.3, EI 98900 | COBSA 687/860/784, SR 1, EI 23520
f11 (1e−21): BSA 1000/1000/1000, SR 0, EI ∞ | CBSA 873/947/914, SR 1, EI 27420 | OBSA 693/1000/927, SR 0.9, EI 30900 | COBSA 629/816/710, SR 1, EI 21300
f12 (1e−14): BSA 1000/1000/1000, SR 0, EI ∞ | CBSA 892/1000/986, SR 0.35, EI 84514 | OBSA 766/1000/883, SR 0.9, EI 29433 | COBSA 532/710/624, SR 1, EI 18720
f13 (1e−07): BSA 1000/1000/1000, SR 0, EI ∞ | CBSA 630/700/667, SR 1, EI 20010 | OBSA 505/1000/821, SR 0.85, EI 28976 | COBSA 524/611/566, SR 1, EI 16980
f14 (1e−05): BSA 1000/1000/1000, SR 0, EI ∞ | CBSA 590/790/679, SR 1, EI 20370 | OBSA 286/960/608, SR 1, EI 18240 | COBSA 424/556/492, SR 1, EI 14760
f15 (1e−80): BSA 1000/1000/1000, SR 0, EI ∞ | CBSA 869/1000/973, SR 0.6, EI 48650 | OBSA 803/1000/991, SR 0.45, EI 66066 | COBSA 763/893/786, SR 1, EI 10560
f16 (0.01): BSA 1000/1000/1000, SR 0, EI ∞ | CBSA 800/1000/948, SR 0.7, EI 40628 | OBSA 596/1000/831, SR 0.8, EI 31162 | COBSA 475/667/577, SR 1, EI 17310
f17 (1e−20): BSA 1000/1000/1000, SR 0, EI ∞ | CBSA 956/1000/990, SR 0.5, EI 59400 | OBSA 658/1000/854, SR 0.75, EI 34160 | COBSA 545/761/689, SR 1, EI 20670
f18 (0.01): BSA 1000/1000/1000, SR 0, EI ∞ | CBSA 256/323/294, SR 1, EI 8820 | OBSA 172/1000/496, SR 0.85, EI 17505 | COBSA 218/327/260, SR 1, EI 7800

comparison algorithms based on the Wilcoxon Signed-Rank Test with a statistical significance level α = 0.05.

According to the above analyses in Sections 4.1, 4.2, 4.3, and 4.4, the conclusion can safely be summarized as follows: COBSA can improve the performance of BSA effectively, and it is a feasible computation method.

5. The Choice of Variance q

In Section 3.1, the variance q of the matrix m plays an important role in determining the performance of CBSA. When q is 0, the matrix m loses its role. When q increases from 0 to 1, the fluctuation applied to the population Pop also increases correspondingly. However, a lower convergence speed is displayed when q is large, owing to the strong fluctuation applied to the population Pop. Therefore, four objective functions, Sumsquare f2, Schwefel 2.22 f4, Alpine f6, and Ackley f10, are employed to analyse the effect of the variance q. The fitness of each function is plotted on a log10 scale so that the curves can be observed clearly.
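As a rough illustration (not from the paper) of how q controls the disturbance, the following snippet samples the matrix m for several values of q and prints the spread of the resulting control factor c = m·w around the fixed example value w = 0.7:

```python
import numpy as np

w = 0.7                                           # example inertia weight value (assumption)
for q in (0.0, 0.1, 0.3, 0.6, 0.9):
    m = np.random.normal(1.0, np.sqrt(q), size=(30, 60))  # mean 1, variance q
    c = m * w                                     # population control factor
    print(f"q = {q:.1f}: std of c = {c.std():.3f}")  # spread around w grows with q
```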

The result is better when the fitness of the function is smaller, owing to the minimization of these functions. We can see from Figure 3 that the variance q affects the result obviously. CBSA performs better on Sumsquare f2 and Alpine f6 when q = 0.3. When q is 0.4, the fitness of Schwefel 2.22 f4 is better. For Ackley f10, a lower fitness value is obtained when q is around 0.3. Therefore, the variance q is set to 0.3 in this article.

Besides, we can also observe from Figure 3 that the influence of the matrix m cannot be ignored. When the variance q is 0, every element in the matrix m is 1, and as a result the matrix m loses its role. As can be seen from Figure 3, a better performance is displayed when q is around 0.3 rather than 0, which proves the significance of the matrix m.

6. Conclusion

In order to solve the problems of slow convergence speed and low convergence precision shown in BSA, this article proposes an improved algorithm named COBSA, which employs a population control factor in the variation equation and helps the disadvantaged group perform optimal learning. The experiments based on 18 benchmark functions prove that COBSA is an effective evolutionary algorithm and a feasible way to improve the convergence speed and convergence precision of BSA.

However, COBSA displays a lower convergence precision on the function Rosenbrock f9 when compared with ABC and MABC. How to make the algorithm perform better on functions such as Rosenbrock f9 is the next step in our work. Besides, it is worth applying the algorithm proposed in


Table 3: Experiment based on comparison algorithms (mean and standard deviation, over 20 independent runs, of ABC, MABC, DSA, BSA, BA, and COBSA on the benchmark functions f1–f18).


Table 4: Statistical test based on the Wilcoxon Signed-Rank Test (α = 0.05).

Algorithm versus COBSA | p value | R+ | R− | Winner
ABC versus COBSA | 2.50e−03 | 16 | 155 | COBSA
MABC versus COBSA | 2.10e−03 | 15 | 156 | COBSA
DSA versus COBSA | 1.96e−04 | 0 | 171 | COBSA
BSA versus COBSA | 1.96e−04 | 0 | 171 | COBSA
BA versus COBSA | 1.96e−04 | 0 | 171 | COBSA

Figure 3: The influence of q. Each panel plots the fitness (log10 scale) obtained by CBSA against q ∈ [0.1, 0.9] for Sumsquare f2, Schwefel 2.22 f4, Alpine f6, and Ackley f10.

this article to blind source separation and hyperspectral image processing.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

The work is supported by the National Natural Science Foundation of China (no. 61401307), the China Postdoctoral Science Foundation (no. 2014M561184), the Tianjin Research Program of Application Foundation and Advanced Technology of China (no. 15JCYBJC17100), and the Tianjin Research Program of Science and Technology Commissioner of China (16JCTPJC48400).

References

[1] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.

[2] M. Dorigo and T. Stutzle, Ant Colony Optimization, MIT Press, Cambridge, MA, USA, 2004.


[3] J. H. Holland, Adaptation in Natural and Artificial Systems, MIT Press, Cambridge, MA, USA, 1992.

[4] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281–295, 2006.

[5] K. V. Price, R. M. Storn, and J. A. Lampinen, Differential Evolution: A Practical Approach to Global Optimization, Springer, Berlin, Germany, 2005.

[6] J. Zhang and A. C. Sanderson, "JADE: adaptive differential evolution with optional external archive," IEEE Transactions on Evolutionary Computation, vol. 13, no. 5, pp. 945–958, 2009.

[7] P. Civicioglu, "Transforming geocentric cartesian coordinates to geodetic coordinates by using differential search algorithm," Computers and Geosciences, vol. 46, pp. 229–247, 2012.

[8] D. Karaboga and B. Akay, "A comparative study of artificial bee colony algorithm," Applied Mathematics and Computation, vol. 214, no. 1, pp. 108–132, 2009.

[9] X. S. Yang and S. Deb, "Cuckoo search via Levy flights," in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NABIC '09), pp. 210–214, Coimbatore, India, December 2009.

[10] X. S. Yang, "A new metaheuristic bat-inspired algorithm," in Nature Inspired Cooperative Strategies for Optimization (NICSO '10), J. R. Gonzalez, D. A. Pelta, C. Cruz, G. Terrazas, and N. Krasnogor, Eds., vol. 284 of Studies in Computational Intelligence, pp. 65–74, Springer, Berlin, Germany, 2010.

[11] L. Chen and L. Y. Zhang, "Blind signal separation algorithm based on temporal predictability and differential search algorithm," Journal on Communications, vol. 35, no. 6, pp. 117–125, 2014.

[12] Z. C. Jia, Y. Y. Xue, L. Chen, Y. J. Guo, and H. D. Xu, "Blind separation algorithm for hyperspectral image based on the denoising reduction and the bat optimization," Acta Photonica Sinica, vol. 45, no. 5, Article ID 0511001, 2016.

[13] J. H. Liang and M. Ma, "Artificial bee colony algorithm based research on image segmentation," Computer Engineering and Application, vol. 48, no. 8, pp. 194–197, 2012.

[14] P. Civicioglu, "Backtracking search optimization algorithm for numerical optimization problems," Applied Mathematics and Computation, vol. 219, no. 15, pp. 8121–8144, 2013.

[15] Z. L. Zhang and S. Y. Liu, "Decision-theoretic rough set attribute reduction based on backtracking search algorithm," Computer Engineering and Applications, vol. 52, no. 10, pp. 71–74, 2016.

[16] X. Chen, S. Y. Liu, and Y. Wang, "Emergency resources scheduling based on improved backtracking search optimization algorithm," Computer Applications and Software, vol. 32, no. 12, pp. 235–238, 2015.

[17] M. D. Li and H. Zhao, "Backtracking search optimization algorithm with comprehensive learning strategy," System Engineering and Electronics, vol. 37, no. 4, pp. 958–963, 2015.

[18] X. J. Wang, S. Y. Liu, and W. K. Tian, "Improved backtracking search optimization algorithm with new effective mutation scale factor and greedy crossover strategy," Journal of Computer Applications, vol. 34, no. 9, pp. 2543–2546, 2014.

[19] W. K. Tian, S. Y. Liu, and X. J. Wang, "Study and improvement of backtracking search optimization algorithm based on differential evolution," Application Research of Computers, vol. 32, no. 6, pp. 1653–1662, 2015.

[20] Y. Shi and R. Eberhart, "A modified particle swarm optimizer," in Proceedings of the IEEE International Conference on Evolutionary Computation and IEEE World Congress on Computational Intelligence (Cat. No. 98TH8360), pp. 69–73, Anchorage, Alaska, USA, May 1998.

[21] W. F. Gao and S. Y. Liu, "A modified artificial bee colony algorithm," Computers & Operations Research, vol. 39, no. 3, pp. 687–697, 2012.

[22] Y. W. Shang and Y. H. Qiu, "A note on the extended Rosenbrock function," Evolutionary Computation, vol. 14, no. 1, pp. 119–126, 2006.

[23] W. Hu and Z. S. Li, "A simpler and more effective particle swarm optimization algorithm," Journal of Software, vol. 18, no. 4, pp. 861–868, 2007.

[24] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.

[25] S. García, D. Molina, M. Lozano, and F. Herrera, "A study on the use of non-parametric tests for analyzing the evolutionary algorithms' behaviour: a case study on the CEC'2005 Special Session on Real Parameter Optimization," Journal of Heuristics, vol. 15, no. 6, pp. 617–644, 2009.



[19] W K Tian S Y Liu and X J Wang ldquoStudy and improvementof backtracking search optimization algorithm based on differ-ential evolutionrdquoApplication Research of Computers vol 32 no6 pp 1653ndash1662 2015

[20] Y Shi and R Eberhart ldquoA modified particle swarm opti-mizerrdquo in Proceedings of the IEEE International Conference

on Evolutionary Computation and IEEE World Congress onComputational Intelligence (Cat No98TH8360) pp 69ndash73Anchorage Alaska USA May 1998

[21] W F Gao and S Y Liu ldquoA modified artificial bee colonyalgorithmrdquo Computers amp Operations Research vol 39 no 3 pp687ndash697 2012

[22] YW Shang and Y H Qiu ldquoA note on the extended Rosenbrockfunctionrdquo Evolutionary Computation vol 14 no 1 pp 119ndash1262006

[23] WHu and Z S Li ldquoA simpler andmore effective particle swarmoptimization algorithmrdquo Journal of Software vol 18 no 4 pp861ndash868 2007

[24] J Derrac S Garcıa D Molina and F Herrera ldquoA practicaltutorial on the use of nonparametric statistical tests as amethodology for comparing evolutionary and swarm intelli-gence algorithmsrdquo Swarm and Evolutionary Computation vol1 no 1 pp 3ndash18 2011

[25] S Garcıa D Molina M Lozano and F Herrera ldquoA study onthe use of non-parametric tests for analyzing the evolutionaryalgorithmsrsquo behaviour a case study on the CECrsquo2005 SpecialSession on Real Parameter Optimizationrdquo Journal of Heuristicsvol 15 no 6 pp 617ndash644 2009

Submit your manuscripts athttpswwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 201

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 4: Improved Backtracking Search Algorithm Based on ...downloads.hindawi.com/journals/mpe/2017/3017608.pdfMathematicalProblemsinEngineering 3 3. The Improved Backtracking Search Algorithm

4 Mathematical Problems in Engineering

Step 2. Generate two random numbers a and b; if a < b, set Oldpop = Pop. Oldpop will be permuted randomly later.

Step 3. Produce a new population on the basis of the initial population by employing (7).

Step 4. Generate the crossover matrix map by using (4).

Step 5. Apply the crossover strategy through (3).

Step 6. Reinitialize any individual that is out of the boundary.

Step 7. Evaluate the population, and guide the disadvantaged group to learn around the optimal individual by employing (8).

Step 8. Select the optimal population according to the greedy selection mechanism.

Step 9. Judge whether the cycle meets the target; if not, return to Step 2.

Step 10. Output the optimal solution.
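Equations (3), (4), (7), and (8) cited in these steps are defined earlier in the article and are not repeated here. The Python sketch below therefore mirrors only the control flow of Steps 1-10; the mutation, crossover, and optimal-learning rules inside it are simplified stand-ins for (7), (3)-(4), and (8) rather than the article's exact equations, and lower/upper are assumed to be scalar bounds.

import numpy as np

def cobsa(objective, dim, lower, upper, pop_size=30, max_cycles=1000, rng=None):
    # Sketch of the Step 1-10 flow; the update rules are placeholders, not equations (3)-(8).
    rng = np.random.default_rng() if rng is None else rng
    pop = rng.uniform(lower, upper, (pop_size, dim))             # Step 1: initial population
    oldpop = rng.uniform(lower, upper, (pop_size, dim))          # historical population
    fitness = np.apply_along_axis(objective, 1, pop)
    for _ in range(max_cycles):                                  # Step 9: repeat until the budget is met
        a, b = rng.random(), rng.random()
        if a < b:                                                 # Step 2: refresh and permute Oldpop
            oldpop = pop.copy()
        oldpop = oldpop[rng.permutation(pop_size)]
        F = 3.0 * rng.standard_normal()                           # Step 3: variation (placeholder for (7))
        mutant = pop + F * (oldpop - pop)
        map_matrix = rng.random((pop_size, dim)) < rng.random()   # Step 4: crossover map (placeholder for (4))
        trial = np.where(map_matrix, mutant, pop)                 # Step 5: crossover (placeholder for (3))
        out = (trial < lower) | (trial > upper)                   # Step 6: reinitialize out-of-bound entries
        trial[out] = rng.uniform(lower, upper, size=int(out.sum()))
        trial_fit = np.apply_along_axis(objective, 1, trial)      # Step 7: evaluation
        best = trial[np.argmin(trial_fit)].copy()
        weak = trial_fit > np.median(trial_fit)                   # "disadvantaged group": the worse half
        if weak.any():                                            # optimal learning (placeholder for (8))
            step = 0.1 * (upper - lower) * rng.standard_normal((int(weak.sum()), dim))
            trial[weak] = np.clip(best + step, lower, upper)
            trial_fit[weak] = np.apply_along_axis(objective, 1, trial[weak])
        better = trial_fit < fitness                              # Step 8: greedy selection
        pop[better], fitness[better] = trial[better], trial_fit[better]
    i = int(np.argmin(fitness))
    return pop[i], fitness[i]                                     # Step 10: best solution found

A call such as cobsa(lambda x: float(np.sum(x**2)), dim=30, lower=-100.0, upper=100.0) would follow the Sphere f1 setup of Table 1 with the population size and cycle budget used in the experiments below.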

4. Experiment and Result Analysis

In order to verify the effectiveness of the improved algorithm proposed in this article, COBSA is applied to minimize a set of 18 benchmark functions commonly used for testing optimization algorithms. Table 1 presents in detail the function form, search space, and function dimension used in the tests. The 18 benchmark functions pose different challenges to COBSA owing to their different characteristics: f1-f5, f11-f13, and f17 are unimodal and continuous functions; f6-f8, f10, f14, f16, and f18 are multimodal, and their number of local minima increases obviously as the dimension D gets larger; f9 is the Rosenbrock function, which is unimodal for D = 2 and 3, but numerous local minima arise when the problem dimension is high [22]; f15 is the three-hump camel function with D = 2, which has three local minima. This article adjusts the form of some functions for convenience of comparison, since different versions of these functions appear in different literatures.

Enlightened by literatures [23-25], the following methods are designed to evaluate the performance of COBSA: (1) convergence performance in fixed iteration; (2) iteration performance in fixed precision; (3) comparison algorithms employed to compare with COBSA; (4) a statistical test based on the Wilcoxon Signed-Rank Test used to analyse the property of COBSA and the comparison algorithms.

The algorithm parameters used in evaluation methods (1), (2), (3), and (4) are as follows: the population size is 30, the maximum number of cycles for each benchmark function is 1000, and the final result is the average value over 20 independent runs.

4.1. Convergence Performance in Fixed Iteration. In this section, in order to analyse how much the population control factor and the optimal learning strategy contribute to improving the performance of BSA, this article compares the convergence speed and convergence precision of BSA, CBSA, OBSA, and COBSA on the 18 objective functions when the maximum cycle is 1000. The fitness of each function is plotted on a base-10 logarithmic scale to display the differences between the strategies more clearly.

Important observations on the convergence speed and convergence precision of the different algorithms can be obtained from the results displayed in Figure 2. CBSA shows a better convergence speed than BSA under the same conditions, which powerfully implies the efficiency of evolution equation (7). It can also be seen from Figure 2 that OBSA reaches a better convergence precision than BSA in the late stage, which shows the correctness of search equation (8) proposed in Section 3.2. Meanwhile, COBSA, which combines CBSA and OBSA and displays faster speed and higher precision in Figure 2, has a more obvious influence on the performance of BSA when compared with BSA, CBSA, and OBSA.
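Because the comparison in Figure 2 is read from fitness curves on a base-10 logarithmic scale, a short plotting sketch may be helpful; the best-fitness histories are assumed to have been recorded during the runs, and matplotlib is the only assumption beyond NumPy.

import numpy as np
import matplotlib.pyplot as plt

def plot_convergence(histories, title):
    # histories: dict mapping an algorithm name (e.g. BSA, CBSA, OBSA, COBSA) to the
    # best fitness value recorded at every generation of one run.
    for name, best_per_generation in histories.items():
        plt.semilogy(np.asarray(best_per_generation), label=name)  # log-scale y axis
    plt.xlabel("Generation")
    plt.ylabel("Fitness")
    plt.title(title)
    plt.legend()
    plt.show()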

4.2. Iteration Performance in Fixed Precision. In this section, the algorithms are compared by the number of iterations required to reach a limited evolution precision. All the algorithms have been run 20 times independently on the 18 objective functions. For each function, a run stops either when the precision is less than the limitation given in Table 2 or when the maximum cycle is reached. The results for BSA, CBSA, OBSA, and COBSA are displayed in Table 2 in terms of min, max, mean, SR, and EI. These parameters are defined as follows: success rate (SR) = number of runs achieving the target precision / total number of runs; expected iterations (EI) = number of particles x average iteration number / success rate. An "inf" entry in Table 2 indicates that the expected iterations are infinite.
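These two statistics are easy to restate in code; the sketch below assumes each of the 20 independent runs is summarized by its iteration count and by whether it reached the target precision.

def success_rate_and_expected_iterations(iteration_counts, reached_target, pop_size=30):
    # iteration_counts: iterations used by each independent run (capped at the maximum cycle)
    # reached_target: one boolean per run, True if the target precision was met
    runs = len(iteration_counts)
    sr = sum(reached_target) / runs                    # SR = successful runs / total runs
    mean_iterations = sum(iteration_counts) / runs
    if sr == 0:
        return sr, mean_iterations, float("inf")       # the "inf" entries of Table 2
    ei = pop_size * mean_iterations / sr               # EI = particles x mean iterations / SR
    return sr, mean_iterations, ei

# Example: 18 of 20 runs reach the target after about 700 iterations
# sr, mean_it, ei = success_rate_and_expected_iterations([700] * 18 + [1000] * 2, [True] * 18 + [False] * 2)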

An interesting result is that BSA does not meet the target precision on any function except Sumpower f3, and even there the SR of BSA is just 0.15 while that of CBSA, OBSA, and COBSA is 1. Besides, on the other functions CBSA and OBSA show a higher SR and a lower EI than BSA, while COBSA reaches a 100% success rate on all functions with the fewest expected iterations. The iteration performance in fixed precision in this section also proves the superiority of COBSA.

4.3. Experiment Based on Comparison Algorithms. In this section, some relatively new evolutionary algorithms developed in recent decades are employed to compare with COBSA: ABC (limit = 100), MABC (limit = 100), BA (A = 0.5, r = 0.5, f_max = 2, f_min = 0, alpha = 0.9, and gamma = 0.9), DSA, and BSA are included. The results are judged in terms of the mean and standard deviation (std) of the solutions obtained from 20 independent tests.

As can be seen from Table 3, COBSA offers a higher convergence precision on almost all functions, while the standard deviation of COBSA is also much smaller than that of the comparison algorithms. In particular, COBSA can find the optimal solution on Rastrigin f7, Griewank f8, NCRastrigin f14, and Weierstrass f18. However, in the case of Rosenbrock f9, the convergence precision of COBSA is worse than that of ABC and MABC; since ABC is only one order of magnitude better than COBSA on Rosenbrock f9, the superiority of ABC and MABC is not very obvious. Besides, COBSA displays a more stable search process than ABC and MABC owing to its lower standard deviation on Rosenbrock f9.

Table 1: Test functions.

Function | Function expression | Dim | Search space
Sphere f1 | f(x) = \sum_{i=1}^{n} x_i^2 | 30 | [-100, 100]
Sumsquare f2 | f(x) = \sum_{i=1}^{n} i x_i^2 | 60 | [-10, 10]
Sumpower f3 | f(x) = \sum_{i=1}^{n} |x_i|^{i+1} | 60 | [-1, 1]
Schwefel 2.22 f4 | f(x) = \sum_{i=1}^{n} |x_i| + \prod_{i=1}^{n} |x_i| | 100 | [-10, 10]
Exponential f5 | f(x) = \exp(0.5 \sum_{i=1}^{n} x_i) - 1 | 60 | [-1.28, 1.28]
Alpine f6 | f(x) = \sum_{i=1}^{n} |x_i \sin(x_i) + 0.1 x_i| | 60 | [-10, 10]
Rastrigin f7 | f(x) = \sum_{i=1}^{n} (x_i^2 - 10 \cos(2\pi x_i) + 10) | 100 | [-100, 100]
Griewank f8 | f(x) = (1/4000) \sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n} \cos(x_i / \sqrt{i}) + 1 | 60 | [-600, 600]
Rosenbrock f9 | f(x) = \sum_{i=1}^{n-1} [100 (x_i^2 - x_{i+1})^2 + (x_i - 1)^2] | 30 | [-30, 30]
Ackley f10 | f(x) = -20 \exp(-0.2 \sqrt{(1/n) \sum_{i=1}^{n} x_i^2}) - \exp((1/n) \sum_{i=1}^{n} \cos(2\pi x_i)) + 20 + e | 60 | [-32, 32]
R-H-Ellipsoid f11 | f(x) = \sum_{i=1}^{n} \sum_{j=1}^{i} x_j^2 | 80 | [-100, 100]
Schwefel 1.2 f12 | f(x) = \sum_{i=1}^{n} (\sum_{j=1}^{i} x_j)^2 | 10 | [-30, 30]
Elliptic f13 | f(x) = \sum_{i=1}^{n} (10^6)^{(i-1)/(n-1)} x_i^2 | 30 | [-100, 100]
NCRastrigin f14 | f(x) = \sum_{i=1}^{n} (y_i^2 - 10 \cos(2\pi y_i) + 10), with y_i = x_i if |x_i| < 1/2 and y_i = round(2 x_i)/2 if |x_i| >= 1/2 | 50 | [-5.12, 5.12]
T-H camel f15 | f(x) = 2 x_1^2 - 1.05 x_1^4 + x_1^6 / 6 + x_1 x_2 + x_2^2 | 2 | [-5, 5]
Schaffer f16 | f(x) = (\sum_{i=1}^{n} x_i^2)^{0.25} (\sin^2(50 (\sum_{i=1}^{n} x_i^2)^{0.1}) + 1) | 10 | [-100, 100]
Powell f17 | f(x) = \sum_{i=1}^{n/4} [(x_{4i-3} + 10 x_{4i-2})^2 + 5 (x_{4i-1} - x_{4i})^2 + (x_{4i-2} - 2 x_{4i-1})^4 + 10 (x_{4i-3} + x_{4i})^4] | 70 | [-4, 5]
Weierstrass f18 | f(x) = \sum_{i=1}^{n} (\sum_{k=0}^{k_max} a^k \cos(2\pi b^k (x_i + 0.5))) - n \sum_{k=0}^{k_max} a^k \cos(2\pi b^k \cdot 0.5), with a = 0.5, b = 3, k_max = 20 | 60 | [-0.5, 0.5]
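As a concrete reading of Table 1, a few of the benchmarks written as plain Python functions (a sketch only; the remaining functions follow the same pattern, and x is assumed to be a NumPy vector):

import numpy as np

def sphere(x):            # f1
    return float(np.sum(x ** 2))

def schwefel_2_22(x):     # f4
    return float(np.sum(np.abs(x)) + np.prod(np.abs(x)))

def rastrigin(x):         # f7
    return float(np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0))

def rosenbrock(x):        # f9, the variant listed in Table 1
    return float(np.sum(100.0 * (x[:-1] ** 2 - x[1:]) ** 2 + (x[:-1] - 1.0) ** 2))

def ackley(x):            # f10
    n = x.size
    return float(-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x ** 2) / n))
                 - np.exp(np.sum(np.cos(2.0 * np.pi * x)) / n) + 20.0 + np.e)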

4.4. Experiment Based on Wilcoxon Signed-Rank Test. A comparative method based on statistical analysis using the Wilcoxon Signed-Rank Test is employed to evaluate the performance of COBSA and the comparison algorithms.

Table 4 displays the statistical results obtained from the average values, over 20 independent runs, of COBSA and the comparison algorithms on the objective functions in Table 1. The results show that COBSA, displaying a better statistical property, successfully outperforms all of the comparison algorithms according to the Wilcoxon Signed-Rank Test with a statistical significance level alpha = 0.05.
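A sketch of this paired comparison with SciPy; the 18 per-function averages of COBSA and one comparison algorithm are assumed to be available (the arrays below are toy values, not the numbers reported in Table 3).

import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
comparison_means = rng.random(18) * 1e-3          # toy per-function means of a comparison algorithm
cobsa_means = comparison_means * rng.random(18)   # toy per-function means of COBSA (better here)

stat, p_value = wilcoxon(comparison_means, cobsa_means)  # paired, two-sided signed-rank test
# For the two-sided test the returned statistic is min(R+, R-), the smaller of the two
# signed-rank sums reported as R+ and R- in Table 4; p < 0.05 rejects equality at alpha = 0.05.
print(f"W = {stat:.1f}, p = {p_value:.3e}")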

[Figure 2: Convergence performance in fixed iteration. Fitness curves (log scale) of BSA, CBSA, OBSA, and COBSA over 1000 generations on the 18 benchmark functions: Sphere f1, Sumsquare f2, Sumpower f3, Schwefel 2.22 f4, Exponential f5, Alpine f6, Rastrigin f7, Griewank f8, Rosenbrock f9, Ackley f10, R-H-Ellipsoid f11, Schwefel 1.2 f12, Elliptic f13, NCRastrigin f14, T-H camel f15, Schaffer f16, Powell f17, and Weierstrass f18.]

Table 2: Iteration performance in fixed precision. For each function and algorithm: minimum/maximum/mean iterations over 20 runs, success rate (SR), and expected iterations (EI).

f1 (target e-24): BSA 1000/1000/1000, SR 0, EI inf; CBSA 845/925/877, SR 1, EI 26310; OBSA 870/1000/974, SR 0.3, EI 97400; COBSA 635/734/692, SR 1, EI 20760
f2 (target e-25): BSA 1000/1000/1000, SR 0, EI inf; CBSA 938/1000/971, SR 0.95, EI 30663; OBSA 720/1000/957, SR 0.35, EI 82028; COBSA 664/802/742, SR 1, EI 22260
f3 (target e-20): BSA 927/1000/993, SR 0.15, EI 198600; CBSA 334/442/381, SR 1, EI 11430; OBSA 211/353/288, SR 1, EI 8640; COBSA 159/259/223, SR 1, EI 6690
f4 (target e-17): BSA 1000/1000/1000, SR 0, EI inf; CBSA 844/981/877, SR 1, EI 26310; OBSA 791/1000/961, SR 0.35, EI 82371; COBSA 688/851/747, SR 1, EI 22410
f5 (target e-16): BSA 1000/1000/1000, SR 0, EI inf; CBSA 757/837/791, SR 1, EI 23730; OBSA 725/1000/952, SR 0.5, EI 57120; COBSA 613/755/675, SR 1, EI 20250
f6 (target e-18): BSA 1000/1000/1000, SR 0, EI inf; CBSA 913/1000/954, SR 0.9, EI 31800; OBSA 951/1000/993, SR 0.2, EI 148950; COBSA 711/830/784, SR 1, EI 23520
f7 (target e-5): BSA 1000/1000/1000, SR 0, EI inf; CBSA 561/646/599, SR 1, EI 17970; OBSA 250/541/453, SR 1, EI 13590; COBSA 354/488/422, SR 1, EI 12660
f8 (target e-7): BSA 1000/1000/1000, SR 0, EI inf; CBSA 477/558/529, SR 1, EI 15870; OBSA 329/615/465, SR 1, EI 13680; COBSA 247/469/411, SR 1, EI 12330
f9 (target 30): BSA 1000/1000/1000, SR 0, EI inf; CBSA 215/298/267, SR 1, EI 8010; OBSA 163/349/244, SR 1, EI 7320; COBSA 141/241/189, SR 1, EI 5670
f10 (target e-14): BSA 1000/1000/1000, SR 0, EI inf; CBSA 958/1000/979, SR 0.45, EI 65266; OBSA 830/1000/989, SR 0.3, EI 98900; COBSA 687/860/784, SR 1, EI 23520
f11 (target e-21): BSA 1000/1000/1000, SR 0, EI inf; CBSA 873/947/914, SR 1, EI 27420; OBSA 693/1000/927, SR 0.9, EI 30900; COBSA 629/816/710, SR 1, EI 21300
f12 (target e-14): BSA 1000/1000/1000, SR 0, EI inf; CBSA 892/1000/986, SR 0.35, EI 84514; OBSA 766/1000/883, SR 0.9, EI 29433; COBSA 532/710/624, SR 1, EI 18720
f13 (target e-07): BSA 1000/1000/1000, SR 0, EI inf; CBSA 630/700/667, SR 1, EI 20010; OBSA 505/1000/821, SR 0.85, EI 28976; COBSA 524/611/566, SR 1, EI 16980
f14 (target e-05): BSA 1000/1000/1000, SR 0, EI inf; CBSA 590/790/679, SR 1, EI 20370; OBSA 286/960/608, SR 1, EI 18240; COBSA 424/556/492, SR 1, EI 14760
f15 (target e-80): BSA 1000/1000/1000, SR 0, EI inf; CBSA 869/1000/973, SR 0.6, EI 48650; OBSA 803/1000/991, SR 0.45, EI 66066; COBSA 763/893/786, SR 1, EI 10560
f16 (target 0.01): BSA 1000/1000/1000, SR 0, EI inf; CBSA 800/1000/948, SR 0.7, EI 40628; OBSA 596/1000/831, SR 0.8, EI 31162; COBSA 475/667/577, SR 1, EI 17310
f17 (target e-20): BSA 1000/1000/1000, SR 0, EI inf; CBSA 956/1000/990, SR 0.5, EI 59400; OBSA 658/1000/854, SR 0.75, EI 34160; COBSA 545/761/689, SR 1, EI 20670
f18 (target 0.01): BSA 1000/1000/1000, SR 0, EI inf; CBSA 256/323/294, SR 1, EI 8820; OBSA 172/1000/496, SR 0.85, EI 17505; COBSA 218/327/260, SR 1, EI 7800

According to the above analyses of Sections 4.1, 4.2, 4.3, and 4.4, the conclusion can safely be summarized as follows: COBSA can improve the performance of BSA effectively, and it is a feasible computation method.

5. The Choice of Variance q

In Section 3.1, the variance q of matrix m plays an important role in affecting the performance of CBSA. When q is 0, the matrix m loses its role. As q increases from 0 to 1, the fluctuation imposed on population Pop also increases correspondingly. However, a lower convergence speed is displayed when q is large, owing to the obvious fluctuation of population Pop. Therefore, four objective functions, Sumsquare f2, Schwefel 2.22 f4, Alpine f6, and Ackley f10, are employed to analyse the effect of the variance q. The fitness of each function is plotted on a base-10 logarithmic scale so that the curves can be observed clearly.

Since these functions are minimization problems, a smaller fitness means a better result. We can see from Figure 3 that the variance q affects the result obviously. CBSA performs better on Sumsquare f2 and Alpine f6 when q = 0.3. When q is 0.4, the fitness on Schwefel 2.22 f4 is better. For Ackley f10, a better precision is obtained when q is around 0.3. Therefore, the variance q is set at 0.3 in this article.
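The choice of q described above is a one-dimensional parameter sweep. A hedged sketch of the protocol follows; run_cbsa is a hypothetical callable standing in for a CBSA implementation that accepts q and returns the best fitness of one run.

import numpy as np

def sweep_q(run_cbsa, objective, dim, lower, upper, runs=20):
    # Try q = 0.1, 0.2, ..., 0.9 and record the mean best fitness (log10, as in Figure 3) for each.
    results = {}
    for q in [round(0.1 * k, 1) for k in range(1, 10)]:
        best_values = [run_cbsa(objective, dim, lower, upper, q=q) for _ in range(runs)]
        results[q] = float(np.log10(np.mean(best_values)))
    best_q = min(results, key=results.get)   # the q with the smallest mean log10 fitness
    return best_q, results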

Besides, we can also observe from Figure 3 that the influence of matrix m cannot be ignored. When the variance q is 0, every element of matrix m is 1, and as a result matrix m loses its role. As can be seen from Figure 3, a better performance is displayed when q is around 0.3 rather than 0, which proves the significance of matrix m.

6. Conclusion

In order to solve the problems of slow convergence speed and low convergence precision shown in BSA, this article proposes an improved algorithm named COBSA, which employs a population control factor in the variation equation and helps the disadvantaged group learn around the optimal individual. The experiments based on 18 benchmark functions prove that COBSA is an effective evolutionary algorithm and a feasible way to improve the convergence speed and convergence precision of BSA.

However, COBSA displays a lower convergence precision on the function Rosenbrock f9 when compared with ABC and MABC. Making the algorithm perform better on Rosenbrock f9 is the next step in our work. Besides, it is worth applying the algorithm proposed in this article to blind source separation and hyperspectral image processing.

[Table 3: Experiment based on comparison algorithms: mean and standard deviation (Std) of the solutions obtained by ABC, MABC, DSA, BSA, BA, and COBSA on f1-f18 over 20 independent runs.]

Table 4: Statistical test based on the Wilcoxon Signed-Rank Test (alpha = 0.05).

Algorithm versus COBSA | p value | R+ | R- | Winner
ABC versus COBSA | 2.50e-03 | 16 | 155 | COBSA
MABC versus COBSA | 2.10e-03 | 15 | 156 | COBSA
DSA versus COBSA | 1.96e-04 | 0 | 171 | COBSA
BSA versus COBSA | 1.96e-04 | 0 | 171 | COBSA
BA versus COBSA | 1.96e-04 | 0 | 171 | COBSA

[Figure 3: The influence of q. Fitness (log10) of CBSA for q from 0.1 to 0.9 on Sumsquare f2, Schwefel 2.22 f4, Alpine f6, and Ackley f10.]

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

The work is supported by the National Natural Science Foundation of China (no. 61401307), the China Postdoctoral Science Foundation (no. 2014M561184), the Tianjin Research Program of Application Foundation and Advanced Technology of China (no. 15JCYBJC17100), and the Tianjin Research Program of Science and Technology Commissioner of China (16JCTPJC48400).

References

[1] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942-1948, Perth, Australia, December 1995.
[2] M. Dorigo and T. Stutzle, Ant Colony Optimization, MIT Press, Cambridge, MA, USA, 2004.
[3] J. H. Holland, Adaptation in Natural and Artificial Systems, MIT Press, Cambridge, MA, USA, 1992.
[4] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281-295, 2006.
[5] K. V. Price, R. M. Storn, and J. A. Lampinen, Differential Evolution: A Practical Approach to Global Optimization, Springer, Berlin, Germany, 2005.
[6] J. Zhang and A. C. Sanderson, "JADE: adaptive differential evolution with optional external archive," IEEE Transactions on Evolutionary Computation, vol. 13, no. 5, pp. 945-958, 2009.
[7] P. Civicioglu, "Transforming geocentric cartesian coordinates to geodetic coordinates by using differential search algorithm," Computers and Geosciences, vol. 46, pp. 229-247, 2012.
[8] D. Karaboga and B. Akay, "A comparative study of artificial bee colony algorithm," Applied Mathematics and Computation, vol. 214, no. 1, pp. 108-132, 2009.
[9] X. S. Yang and S. Deb, "Cuckoo search via Levy flights," in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NaBIC '09), pp. 210-214, Coimbatore, India, December 2009.
[10] X. S. Yang, "A new metaheuristic bat-inspired algorithm," in Nature Inspired Cooperative Strategies for Optimization (NICSO '10), J. R. Gonzalez, D. A. Pelta, C. Cruz, G. Terrazas, and N. Krasnogor, Eds., vol. 284 of Studies in Computational Intelligence, pp. 65-74, Springer, Berlin, Germany, 2010.
[11] L. Chen and L. Y. Zhang, "Blind signal separation algorithm based on temporal predictability and differential search algorithm," Journal on Communications, vol. 35, no. 6, pp. 117-125, 2014.
[12] Z. C. Jia, Y. Y. Xue, L. Chen, Y. J. Guo, and H. D. Xu, "Blind separation algorithm for hyperspectral image based on the denoising reduction and the bat optimization," Acta Photonica Sinica, vol. 45, no. 5, Article ID 0511001, 2016.
[13] J. H. Liang and M. Ma, "Artificial bee colony algorithm based research on image segmentation," Computer Engineering and Application, vol. 48, no. 8, pp. 194-197, 2012.
[14] P. Civicioglu, "Backtracking search optimization algorithm for numerical optimization problems," Applied Mathematics and Computation, vol. 219, no. 15, pp. 8121-8144, 2013.
[15] Z. L. Zhang and S. Y. Liu, "Decision-theoretic rough set attribute reduction based on backtracking search algorithm," Computer Engineering and Applications, vol. 52, no. 10, pp. 71-74, 2016.
[16] X. Chen, S. Y. Liu, and Y. Wang, "Emergency resources scheduling based on improved backtracking search optimization algorithm," Computer Applications and Software, vol. 32, no. 12, pp. 235-238, 2015.
[17] M. D. Li and H. Zhao, "Backtracking search optimization algorithm with comprehensive learning strategy," System Engineering and Electronics, vol. 37, no. 4, pp. 958-963, 2015.
[18] X. J. Wang, S. Y. Liu, and W. K. Tian, "Improved backtracking search optimization algorithm with new effective mutation scale factor and greedy crossover strategy," Journal of Computer Applications, vol. 34, no. 9, pp. 2543-2546, 2014.
[19] W. K. Tian, S. Y. Liu, and X. J. Wang, "Study and improvement of backtracking search optimization algorithm based on differential evolution," Application Research of Computers, vol. 32, no. 6, pp. 1653-1662, 2015.
[20] Y. Shi and R. Eberhart, "A modified particle swarm optimizer," in Proceedings of the IEEE International Conference on Evolutionary Computation and IEEE World Congress on Computational Intelligence (Cat. No. 98TH8360), pp. 69-73, Anchorage, Alaska, USA, May 1998.
[21] W. F. Gao and S. Y. Liu, "A modified artificial bee colony algorithm," Computers & Operations Research, vol. 39, no. 3, pp. 687-697, 2012.
[22] Y. W. Shang and Y. H. Qiu, "A note on the extended Rosenbrock function," Evolutionary Computation, vol. 14, no. 1, pp. 119-126, 2006.
[23] W. Hu and Z. S. Li, "A simpler and more effective particle swarm optimization algorithm," Journal of Software, vol. 18, no. 4, pp. 861-868, 2007.
[24] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3-18, 2011.
[25] S. García, D. Molina, M. Lozano, and F. Herrera, "A study on the use of non-parametric tests for analyzing the evolutionary algorithms' behaviour: a case study on the CEC'2005 Special Session on Real Parameter Optimization," Journal of Heuristics, vol. 15, no. 6, pp. 617-644, 2009.


Page 5: Improved Backtracking Search Algorithm Based on ...downloads.hindawi.com/journals/mpe/2017/3017608.pdfMathematicalProblemsinEngineering 3 3. The Improved Backtracking Search Algorithm

Mathematical Problems in Engineering 5

Table 1 Test function

Function Function expression Dim Search space

Sphere 1198911 119891(119909) = 119899sum119894=1

1199092119894 30 [minus100 100]Sumsquare 1198912 119891(119909) = 119899sum

119894=1

1198941199092119894 60 [minus10 10]Sumpower 1198913 119891(119909) = 119899sum

119894=1

10038161003816100381610038161199091198941003816100381610038161003816119894+1 60 [minus1 1]Schwefel222 1198914 119891(119909) = 119899sum

119894=1

10038161003816100381610038161199091198941003816100381610038161003816 +119899prod119894=1

10038161003816100381610038161199091198941003816100381610038161003816 100 [minus10 10]Exponential 1198915 119891(119909) = exp(05 lowast 119899sum

119894=1

119909119894) minus 1 60 [minus128 128]Alpine 1198916 119891(119909) = 119899sum

119894=1

1003816100381610038161003816119909119894 sin (119909119894) + 011199091198941003816100381610038161003816 60 [minus10 10]Rastrigin 1198917 119891(119909) = 119899sum

119894=1

(1199092119894 minus 10 cos 2120587119909119894) + 10 100 [minus100 100]Griewank 1198918 119891(119909) = 14000

119899sum119894=1

(119909119894)2 minus 119899prod119894=1

cos( 119909119894radic119909119894) + 1 60 [minus600 600]Rosenbrock 1198919 119891(119909) = 119899minus1sum

119894=1

[100 (1199092119894 minus 119909119894+1)2 + (119909119894 minus 1)2] 30 [minus30 30]Ackley 11989110 119891 (119909) = minus20 exp(minus02radic 1119899

119899sum119894=1

1199092119894) minus exp(1119899119899sum119894=1

cos (2120587119909119894)) + 20 + 119890 60 [minus32 32]R-H-Ellipsoid11989111 119891(119909) = 119899sum

119894=1

119894sum119895=1

1199092119895 80 [minus100 100]

Schwefel12 11989112 119891(119909) = 119899sum119894=1

( 119894sum119895=1

119909119895)2

10 [minus30 30]Elliptic 11989113 119891(119909) = 119899sum

119894=1

(106)(119894minus1)(119899minus1) 1199092119894 30 [minus100 100]

NCRastrigin11989114119891(119909) = 119899sum

119894=1

(1199102119894 minus 10 cos 2120587119910119894) + 10

119910119894 = 119909119894 10038161003816100381610038161199091198941003816100381610038161003816 lt 12 round(2119909119894)2 |119909119894| ge 12

50 [minus512 512]

T-H camel 11989115 119891(119909) = 211990921 minus 10511990941 + 119909616 + 11990911199092 + 11990922 2 [minus5 5]

Schaffer 11989116 119891(119909) = ( 119899sum119894=1

1199092119894)025 (sin2 (50( 119899sum

119894=1

1199092119894)01) + 1) 10 [minus100 100]

Powell 11989117 119891(119909) = 1198994sum119894=1

[(1199094119894minus3 + 101199094119894minus2)2 + 5 (1199094119894minus1 minus 1199094119894)2 + (1199094119894minus2 minus 21199094119894minus1)4 + 10 (1199094119894minus3 + 1199094119894)4] 70 [5 minus4]

Weierstrass 11989118 119891 (119909) = 119899sum119894=1

(119896maxsum119896=0

[119886119896 cos (2120587119887119896 (119909119894 + 05))]) minus 119899119896maxsum119896=0

[119886119896 cos (212058711988711989605)]119886 = 05 119887 = 3 119896max = 20 60 [minus05 05]

find optimal solution when facing Rastrigin 1198917 Griewank1198918 NCRastrigin 11989114 and Weierstrass 11989118 However in thecase of Rosenbrock 1198919 the convergence precision of COBSAis worse compared to that of ABC and MABC since ABCis just one magnitude order higher when compared withCOBSA on Rosenbrock 1198919 the superiority of ABC andMABC is not very obvious Besides COBSA displays a morestable search process than ABC and MABC owing to a lowerstander deviation on Rosenbrock 1198919

44 Experiment Based on Wilcoxon Signed-Rank Test Acomparative method based on statistical analysis by usingWilcoxon Signed-Rank Test is employed to evaluate theperformance of COBSA and comparison algorithms

Table 4 displays the statistical results using the averagevalues based on 20 independent running times of COBSAand comparison algorithms to handle the objective functionsin Table 1 The result presents that the COBSA displaying abetter statistical property successfully outperforms all of the

6 Mathematical Problems in Engineering

Sphere f1

BSCBS

OBSCOBS

10minus40

10minus30

10minus20

10minus10

100

1010

Fitn

ess

200 400 600 800 10000Generation

Sumsquare f2

BSCBS

OBSCOBS

10minus30

10minus20

10minus10

100

1010

Fitn

ess

200 400 600 800 10000Generation

Alpine f6

BSCBS

OBSCOBS

10minus25

10minus20

10minus15

10minus10

10minus5

100

105

Fitn

ess

200 400 600 800 10000Generation

Schwefel222 f4

BSCBS

OBSCOBS

10minus40

10minus20

100

1060

1040

1020

Fitn

ess

200 400 600 800 10000Generation

Sumpower f3

BSCBS

OBSCOBS

10minus150

10minus100

10minus50

100

1050

Fitn

ess

200 400 600 800 10000Generation

Exponential f5

BSCBS

OBSCOBS

10minus25

10minus20

10minus15

10minus10

10minus5

100

105

Fitn

ess

200 400 600 800 10000Generation

Figure 2 Continued

Mathematical Problems in Engineering 7

Griewank f8

BSCBS

OBSCOBS

10minus20

10minus15

10minus10

10minus5

100

105

Fitn

ess

200 400 600 800 10000Generation

Rastrigin f7

BSCBS

OBSCOBS

10minus15

10minus10

10minus5

100

1010

105

Fitn

ess

200 400 600 800 10000Generation

Ackley f10

BSCBS

OBSCOBS

10minus15

10minus10

10minus5

100

105Fi

tnes

s

200 400 600 800 10000Generation

Rosenbrock f9

BSCBS

OBSCOBS

100

1010

108

106

104

102

Fitn

ess

200 400 600 800 10000Generation

R-H-Ellipsoid f11

BSCBS

OBSCOBS

1010

100

Fitn

ess

10minus30

10minus20

10minus10

200 400 600 800 10000Generation

Schwefel12 f12

BSCBS

OBSCOBS

10minus25

10minus20

10minus15

10minus10

10minus5

100

105

Fitn

ess

200 400 600 800 10000Generation

Figure 2 Continued

8 Mathematical Problems in Engineering

Elliptic f13

BSCBS

OBSCOBS

1030

1010

1020

100

Fitn

ess

10minus20

10minus10

200 400 600 800 10000Generation

NCRastrigin f14

BSCBS

OBSCOBS

10minus15

10minus10

10minus5

100

105

Fitn

ess

200 400 600 800 10000Generation

T-H camel f15

BSCBS

OBSCOBS

100

Fitn

ess

10minus120

10minus40

10minus60

10minus80

10minus100

10minus20

200 400 600 800 10000Generation

Schaffer f16

BSCBS

OBSCOBS

10minus1

10minus2

10minus3

10minus4

100

101

102Fi

tnes

s

200 400 600 800 10000Generation

Powell f17

BSCBS

OBSCOBS

100

105

Fitn

ess

10minus25

10minus20

10minus15

10minus10

10minus5

200 400 600 800 10000Generation

Weierstrass f18

BSCBS

OBSCOBS

10minus15

10minus10

10minus5

100

105

Fitn

ess

200 400 600 800 10000Generation

Figure 2 Convergence performance in fixed iteration

Mathematical Problems in Engineering 9

Table 2 Iteration performance in fixed precision

Function Limitation Algorithm Iterations Success rate Expected iterationsMin Max Mean

1198911 119890 minus 24BSA 1000 1000 1000 0 infinCBSA 845 925 877 1 26310OBSA 870 1000 974 03 97400COBSA 635 734 692 1 20760

1198912 119890 minus 25BSA 1000 1000 1000 0 infinCBSA 938 1000 971 095 30663OBSA 720 1000 957 035 82028COBSA 664 802 742 1 22260

1198913 119890 minus 20BSA 927 1000 993 015 198600CBSA 334 442 381 1 11430OBSA 211 353 288 1 8640COBSA 159 259 223 1 6690

1198914 119890 minus 17BSA 1000 1000 1000 0 infinCBSA 844 981 877 1 26310OBSA 791 1000 961 035 82371COBSA 688 851 747 1 22410

1198915 119890 minus 16BSA 1000 1000 1000 0 infinCBSA 757 837 791 1 23730OBSA 725 1000 952 05 57120COBSA 613 755 675 1 20250

1198916 119890 minus 18BSA 1000 1000 1000 0 infinCBSA 913 1000 954 09 31800OBSA 951 1000 993 02 148950COBSA 711 830 784 1 23520

1198917 119890 minus 5BSA 1000 1000 1000 0 infinCBSA 561 646 599 1 17970OBSA 250 541 453 1 13590COBSA 354 488 422 1 12660

1198918 119890 minus 7BSA 1000 1000 1000 0 infinCBSA 477 558 529 1 15870OBSA 329 615 465 1 13680COBSA 247 469 411 1 12330

1198919 30

BSA 1000 1000 1000 0 infinCBSA 215 298 267 1 8010OBSA 163 349 244 1 7320COBSA 141 241 189 1 5670

11989110 119890 minus 14BSA 1000 1000 1000 0 infinCBSA 958 1000 979 045 65266OBSA 830 1000 989 03 98900COBSA 687 860 784 1 23520

11989111 119890 minus 21BSA 1000 1000 1000 0 infinCBSA 873 947 914 1 27420OBSA 693 1000 927 09 30900COBSA 629 816 710 1 21300

11989112 119890 minus 14BSA 1000 1000 1000 0 infinCBSA 892 1000 986 035 84514OBSA 766 1000 883 09 29433COBSA 532 710 624 1 18720

10 Mathematical Problems in Engineering

Table 2 Continued

Function Limitation Algorithm Iterations Success rate Expected iterationsMin Max Mean

11989113 119890 minus 07BSA 1000 1000 1000 0 infinCBSA 630 700 667 1 20010OBSA 505 1000 821 085 28976COBSA 524 611 566 1 16980

11989114 119890 minus 05BSA 1000 1000 1000 0 infinCBSA 590 790 679 1 20370OBSA 286 960 608 1 18240COBSA 424 556 492 1 14760

11989115 119890 minus 80BSA 1000 1000 1000 0 infinCBSA 869 1000 973 06 48650OBSA 803 1000 991 045 66066COBSA 763 893 786 1 10560

11989116 001

BSA 1000 1000 1000 0 infinCBSA 800 1000 948 07 40628OBSA 596 1000 831 08 31162COBSA 475 667 577 1 17310

11989117 119890 minus 20BSA 1000 1000 1000 0 infinCBSA 956 1000 990 05 59400OBSA 658 1000 854 075 34160COBSA 545 761 689 1 20670

11989118 001

BSA 1000 1000 1000 0 infinCBSA 256 323 294 1 8820OBSA 172 1000 496 085 17505COBSA 218 327 260 1 7800

comparison algorithms based on theWilcoxon Signed-RankTest with a statistical significance value 120572 = 005

According to the above analyses of Sections 41 42 43and 44 conclusion can be summarized safely as followsCOBSA can improve the performance of BSA effectively andit is a feasible computation method

5 The Choice of Variance 119902In Section 31 the variance 119902 of matrix119898 plays an importantrole in affecting the performance of CBSA When 119902 takes0 the matrix 119898 will lose its role When 119902 increases from0 to 1 the fluctuation to population Pop will also increasecorrespondingly However a lower convergence speed willbe displayed owing to the obvious fluctuation to populationPop when 119902 is large Therefore four objective functionsSumsquare 1198912 Schwefel222 1198914 Alpine 1198916 and Ackley 11989110are employed to analyse the effect of variance 119902 The fitnessof function takes the logarithm of 10 to show and observe thecurves obviously

The result is better when the fitness of function issmaller owing to the minimization process shown in thesefunctions We can obtain from Figure 3 that variance 119902affects the result obviously CBSA plays a better performanceon Sumsquare 1198912 and Alpine 1198916 when 119902 = 03 When119902 takes 04 the fitness of Schwefel222 1198914 is better ForAckley 11989110 a lower precision is obtained when 119902 is around

03 Therefore the selective variance 119902 is set at 03 in thisarticle

Besides we can also observe from Figure 3 that theinfluence of matrix 119898 cannot be ignored When variance 119902is 0 the element in matrix119898 is 1 and as a result the matrix119898will lose its role As it can be obtained from Figure 3 a betterperformance is displayed when 119902 is around 03 rather than 0which proves the significance of matrix1198986 Conclusion

In order to solve the problem of slow convergencespeed and low convergence precision shown in BSAthis article proposes an improved algorithm namedCOBSA through employing population control factor tothe variation equation and helping disadvantaged grouphave optimal learning The experiments based on 18benchmark functions prove that COBSA is an effectiveevolutionary algorithm and it is a feasible solution toimprove the convergence speed and convergence precision ofBSA

However COBSA displays a lower convergence precisionin the function Rosenbrock 1198919 when compared with ABCand MABC How to make algorithm display better perfor-mance on functions Rosenbrock 1198919 is the next step in ourwork Besides it is worth applying the algorithm proposed in

Mathematical Problems in Engineering 11

Table3Ex

perim

entb

ased

oncomparis

onalgorithm

Functio

n1198911

11989121198913

11989141198915

11989161198917

11989181198919

1198911011989111

1198911211989113

1198911411989115

1198911611989117

11989118ABC M

ean

925119890minus 11

530119890minus 05

445119890minus 08

016130119890minus 03

0237854

003459

035720119890minus 03

363774119890+ 14

1187589119890minus 18

385562

024Std

311119890minus 10

372119890minus 04

232119890minus 08

007437119890minus 04

0122466

005341

032197119890minus 02

207145119890+ 15

224479119890minus 18

112560

004MABC Mean

729119890minus 16

969119890minus 10

687119890minus 15

276119890minus 02

128119890minus 06

795119890minus 05

397356119890minus03

2306230119890minus 03

193119890minus 05

1707625119890minus 08

190219119890minus 12

2512506

013Std

900119890minus 17

205119890minus 09

249119890minus 14

570119890minus 03

269119890minus 07

733119890minus 05

611001

4462871119890minus 03

994119890minus 06

1014230119890minus 07

148978119890minus 12

0391313

034DSA M

ean

550119890minus 03

146501119890minus 17

769005

308104119890+ 03

10117117

3755135

1044503119890+ 16

4932680119890minus 36

2145701

381Std

630119890minus 03

087119119890minus 16

224002

13316171

0122957

3234796

930571119890+ 16

447305119890minus 35

0385193

068BS

A Mean

001237

309119890minus 17

1158

009757

127119890+ 03

11214134

3498661

012252119890+ 25

6467464119890minus 47

157934

494Std

092128

614119890minus 17

208002

30313081

0052738

2772706

008321119890+ 25

534205119890minus 46

019545

073BA

Mean

917119890minus 07

020484119890minus 07

476119890+ 31

017195672119890+03

1018684

1925794119890minus 06

515119890minus 08

502119890+ 25

40900

456119890minus 12

1181

348365

Std

410119890minus 08

035216119890minus 08

213119890+ 32

016639

19500

45010827

014355119890minus 07

230119890minus 09

225119890+ 25

4700202119890minus 12

268107

1634

COBS

A

Mean

201119890minus 32

323119890minus 28

433119890minus 111

395119890minus 20

251119890minus 22

399119890minus 21

00

2738409119890minus 15

147119890minus 27

101119890 minus 20

138119890minus 18

0750119890minus 104

432119890minus 04

542119890minus 27

0

Std

879119890minus 32

143119890minus 27

147119890minus 110

201119890minus 21

750119890minus 22

119119890minus 20

00

013

110119890minus 15

586119890minus 27

433119890minus 20

616119890minus 17

0313119890minus 103

120119890minus 03

227119890minus 26

0

12 Mathematical Problems in Engineering

Table 4 Statistical test based on Wilcoxon Signed-Rank Test (120572 = 005)Algorithm versus COBSA 119901 value 119877+ 119877minus WinnerABC versus COBSA 250119890 minus 03 16 155 COBSAMABC versus COBSA 210119890 minus 03 15 156 COBSADSA versus COBSA 196119890 minus 04 0 171 COBSABSA versus COBSA 196119890 minus 04 0 171 COBSABA versus COBSA 196119890 minus 04 0 171 COBSA

Somesquare f2

01 02 03 04 05 06 07 08 090q

minus30

minus28

minus26

minus24

minus22

minus20

minus18

minus16

minus14

minus12

Fitn

ess

01 02 03 04 05 06 07 08 090q

minus22

minus20

minus18

minus16

minus14

minus12

minus10

minus8

minus6

Fitn

ess

Schwefel222 f4

Ackley f10

01 02 03 04 05 06 07 08 090q

minus15

minus14

minus13

minus12

minus11

minus10

minus9

minus8

Fitn

ess

Alpine f6

minus22

minus20

minus18

minus16

minus14

minus12

minus10

minus8

Fitn

ess

01 02 03 04 05 06 07 08 090q

Figure 3 The influence of 119902

this article to the blind source separation and hyperspectralimage processing

Conflicts of Interest

The authors declare that there are no conflicts of interestregarding the publication of this paper

Acknowledgments

The work is supported by National Nature Science Foun-dation of China (no 61401307) China Postdoctoral ScienceFoundation (no 2014M561184) Tianjin Research Program

of Application Foundation and Advanced Technology ofChina (no 15JCYBJC17100) and Tianjin Research Programof Science and Technology Commissioner of China (16JCT-PJC48400)

References

[1] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 Perth Australia December 1995

[2] M Dorigo and T Stutzle Ant Colony Optimization MIT PressCombridge MA USA 2004

Mathematical Problems in Engineering 13

[3] J H HollandAdaptation in Natural and Artificial Systems MITPress Combridge MA USA 1992

[4] J J Liang A K Qin P N Suganthan and S BaskarldquoComprehensive learning particle swarm optimizer for globaloptimization of multimodal functionsrdquo IEEE Transactions onEvolutionary Computation vol 10 no 3 pp 281ndash295 2006

[5] K V Price R M Storn and J A Lampinen Differential Evo-lution A Practical Approach to Global Optimization SpringerBerlin Germany 2005

[6] J Zhang and A C Sanderson ldquoJADE Adaptive differentialevolution with optional external archiverdquo IEEE Transactions onEvolutionary Computation vol 13 no 5 pp 945ndash958 2009

[7] P Civicioglu ldquoTransforming geocentric cartesian coordinatesto geodetic coordinates by using differential search algorithmrdquoComputers and Geosciences vol 46 pp 229ndash247 2012

[8] D Karaboga and B Akay ldquoA comparative study of artificial beecolony algorithmrdquo Applied Mathematics and Computation vol214 no 1 pp 108ndash132 2009

[9] X S Yang and S Deb ldquoCuckoo search via Levy flightsrdquo inProceedings of the World Congress on Nature and BiologicallyInspired Computing (NABIC rsquo09) pp 210ndash214 CoimbatoreIndia December 2009

[10] X S Yang ldquoA new metaheuristic bat-inspired algorithmrdquo inNature Inspired Cooperative Strategies for Optimization (NICSOrsquo10) J R Gonzalez D A Pelta C Cruz G Terrazas andN Krasnogor Eds vol 284 of Studies in ComputationalIntelligence pp 65ndash74 Springer Berlin Germany 2010

[11] L Chen and L Y Zhang ldquoBlind signal separation algorithmbased on temporal predictability and differential search algo-rithmrdquo Journal on Communications vol 35 no 6 pp 117ndash1252014

[12] Z C Jia Y Y Xue L Chen Y J Guo and H D Xu ldquoBlindseparation algorithm for hyperspectral image based on thedenoising reduction and the bat optimizationrdquo Acta PhotonicaSinica vol 45 no 5 Article ID 0511001 2016

[13] J H Liang and M Ma ldquoArtificial bee colony algorithm basedresearch on image segmentationrdquo Computer Engineering andApplication vol 48 no 8 pp 194ndash197 2012

[14] P Civicioglu ldquoBacktracking search optimization algorithm fornumerical optimization problemsrdquo Applied Mathematics andComputation vol 219 no 15 pp 8121ndash8144 2013

[15] Z L Zhang and S Y Liu ldquoDecision-theoretic rough set attributereduction based on backtracking search algorithmrdquo ComputerEngineering and Applications vol 52 no 10 pp 71ndash74 2016

[16] X Chen S Y Liu and YWang ldquoEmergency resources schedul-ing based on improved backtracking search optimization algo-rithmrdquo Computer Applications and Software vol 32 no 12 pp235ndash238 2015

[17] M D Li and H Zhao ldquoBacktracking search optimizationalgorithm with comprehensive learning strategyrdquo System Engi-neering and Electronics vol 37 no 4 pp 958ndash963 2015

[18] X J Wang S Y Liu and W K Tian ldquoImproved backtrackingsearch optimization algorithm with new effective mutationscale factor and greedy crossover strategyrdquo Journal of ComputerApplications vol 34 no 9 pp 2543ndash2546 2014

[19] W K Tian S Y Liu and X J Wang ldquoStudy and improvementof backtracking search optimization algorithm based on differ-ential evolutionrdquoApplication Research of Computers vol 32 no6 pp 1653ndash1662 2015

[20] Y Shi and R Eberhart ldquoA modified particle swarm opti-mizerrdquo in Proceedings of the IEEE International Conference

on Evolutionary Computation and IEEE World Congress onComputational Intelligence (Cat No98TH8360) pp 69ndash73Anchorage Alaska USA May 1998

[21] W F Gao and S Y Liu ldquoA modified artificial bee colonyalgorithmrdquo Computers amp Operations Research vol 39 no 3 pp687ndash697 2012

[22] YW Shang and Y H Qiu ldquoA note on the extended Rosenbrockfunctionrdquo Evolutionary Computation vol 14 no 1 pp 119ndash1262006

[23] WHu and Z S Li ldquoA simpler andmore effective particle swarmoptimization algorithmrdquo Journal of Software vol 18 no 4 pp861ndash868 2007

[24] J Derrac S Garcıa D Molina and F Herrera ldquoA practicaltutorial on the use of nonparametric statistical tests as amethodology for comparing evolutionary and swarm intelli-gence algorithmsrdquo Swarm and Evolutionary Computation vol1 no 1 pp 3ndash18 2011

[25] S Garcıa D Molina M Lozano and F Herrera ldquoA study onthe use of non-parametric tests for analyzing the evolutionaryalgorithmsrsquo behaviour a case study on the CECrsquo2005 SpecialSession on Real Parameter Optimizationrdquo Journal of Heuristicsvol 15 no 6 pp 617ndash644 2009


Mathematical Problems in Engineering 9

Table 2 Iteration performance in fixed precision

Function Limitation Algorithm Iterations Success rate Expected iterationsMin Max Mean

1198911 119890 minus 24BSA 1000 1000 1000 0 infinCBSA 845 925 877 1 26310OBSA 870 1000 974 03 97400COBSA 635 734 692 1 20760

1198912 119890 minus 25BSA 1000 1000 1000 0 infinCBSA 938 1000 971 095 30663OBSA 720 1000 957 035 82028COBSA 664 802 742 1 22260

1198913 119890 minus 20BSA 927 1000 993 015 198600CBSA 334 442 381 1 11430OBSA 211 353 288 1 8640COBSA 159 259 223 1 6690

1198914 119890 minus 17BSA 1000 1000 1000 0 infinCBSA 844 981 877 1 26310OBSA 791 1000 961 035 82371COBSA 688 851 747 1 22410

1198915 119890 minus 16BSA 1000 1000 1000 0 infinCBSA 757 837 791 1 23730OBSA 725 1000 952 05 57120COBSA 613 755 675 1 20250

1198916 119890 minus 18BSA 1000 1000 1000 0 infinCBSA 913 1000 954 09 31800OBSA 951 1000 993 02 148950COBSA 711 830 784 1 23520

1198917 119890 minus 5BSA 1000 1000 1000 0 infinCBSA 561 646 599 1 17970OBSA 250 541 453 1 13590COBSA 354 488 422 1 12660

1198918 119890 minus 7BSA 1000 1000 1000 0 infinCBSA 477 558 529 1 15870OBSA 329 615 465 1 13680COBSA 247 469 411 1 12330

1198919 30

BSA 1000 1000 1000 0 infinCBSA 215 298 267 1 8010OBSA 163 349 244 1 7320COBSA 141 241 189 1 5670

11989110 119890 minus 14BSA 1000 1000 1000 0 infinCBSA 958 1000 979 045 65266OBSA 830 1000 989 03 98900COBSA 687 860 784 1 23520

11989111 119890 minus 21BSA 1000 1000 1000 0 infinCBSA 873 947 914 1 27420OBSA 693 1000 927 09 30900COBSA 629 816 710 1 21300

11989112 119890 minus 14BSA 1000 1000 1000 0 infinCBSA 892 1000 986 035 84514OBSA 766 1000 883 09 29433COBSA 532 710 624 1 18720

10 Mathematical Problems in Engineering

Table 2 Continued

Function Limitation Algorithm Iterations Success rate Expected iterationsMin Max Mean

11989113 119890 minus 07BSA 1000 1000 1000 0 infinCBSA 630 700 667 1 20010OBSA 505 1000 821 085 28976COBSA 524 611 566 1 16980

11989114 119890 minus 05BSA 1000 1000 1000 0 infinCBSA 590 790 679 1 20370OBSA 286 960 608 1 18240COBSA 424 556 492 1 14760

11989115 119890 minus 80BSA 1000 1000 1000 0 infinCBSA 869 1000 973 06 48650OBSA 803 1000 991 045 66066COBSA 763 893 786 1 10560

11989116 001

BSA 1000 1000 1000 0 infinCBSA 800 1000 948 07 40628OBSA 596 1000 831 08 31162COBSA 475 667 577 1 17310

11989117 119890 minus 20BSA 1000 1000 1000 0 infinCBSA 956 1000 990 05 59400OBSA 658 1000 854 075 34160COBSA 545 761 689 1 20670

11989118 001

BSA 1000 1000 1000 0 infinCBSA 256 323 294 1 8820OBSA 172 1000 496 085 17505COBSA 218 327 260 1 7800

comparison algorithms based on theWilcoxon Signed-RankTest with a statistical significance value 120572 = 005

According to the above analyses of Sections 4.1, 4.2, 4.3, and 4.4, the conclusion can be safely summarized as follows: COBSA improves the performance of BSA effectively, and it is a feasible computation method.

5. The Choice of Variance q

In Section 3.1, the variance q of matrix m plays an important role in the performance of CBSA. When q is 0, the matrix m loses its role; as q increases from 0 to 1, the fluctuation applied to the population Pop increases correspondingly. However, a large q produces an obvious fluctuation of the population Pop and therefore a lower convergence speed. Four objective functions, Sumsquare f2, Schwefel2.22 f4, Alpine f6, and Ackley f10, are therefore employed to analyse the effect of the variance q. The function fitness is plotted on a log10 scale so that the curves can be observed clearly.

Since these are minimization problems, a smaller fitness value indicates a better result. Figure 3 shows that the variance q affects the result obviously. CBSA performs better on Sumsquare f2 and Alpine f6 when q = 0.3; when q is 0.4, the fitness of Schwefel2.22 f4 is better; and for Ackley f10, a lower fitness is obtained when q is around 0.3. Therefore, the variance q is set to 0.3 in this article.

Besides, we can also observe from Figure 3 that the influence of matrix m cannot be ignored. When the variance q is 0, every element of matrix m is 1, and as a result matrix m loses its role. As Figure 3 shows, a better performance is obtained when q is around 0.3 rather than 0, which proves the significance of matrix m.
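The role of q can be made concrete with a small sketch. Under our reading of Section 3.1 (not reproduced here), the elements of matrix m are assumed to be drawn around 1 with variance q, so q = 0 gives m = 1 and leaves the population unchanged, while larger q produces a stronger fluctuation; the function name perturb_population is hypothetical.

```python
import numpy as np

def perturb_population(pop, q, rng=None):
    """Hypothetical illustration: scale each individual by a matrix m whose
    elements are drawn around 1 with variance q. With q = 0, m is all ones and
    the population is unchanged; larger q means a stronger fluctuation."""
    rng = np.random.default_rng() if rng is None else rng
    m = rng.normal(loc=1.0, scale=np.sqrt(q), size=pop.shape)
    return m * pop

rng = np.random.default_rng(0)
pop = rng.uniform(-5.0, 5.0, size=(30, 10))                     # 30 individuals, 10 dimensions
print(np.abs(perturb_population(pop, 0.0, rng) - pop).max())    # 0.0: m has no effect
print(np.abs(perturb_population(pop, 0.3, rng) - pop).max())    # nonzero fluctuation
```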

6. Conclusion

In order to solve the problems of slow convergence speed and low convergence precision shown by BSA, this article proposes an improved algorithm named COBSA, which applies a population control factor to the variation equation and provides optimal learning for the disadvantaged group. The experiments based on 18 benchmark functions prove that COBSA is an effective evolutionary algorithm and a feasible way to improve the convergence speed and convergence precision of BSA.

However, COBSA displays a lower convergence precision on the function Rosenbrock f9 when compared with ABC and MABC. Making the algorithm perform better on Rosenbrock f9 is the next step in our work. Besides, it is worth applying the algorithm proposed in this article to blind source separation and hyperspectral image processing.


Table 3: Experiment based on the comparison algorithms (mean and standard deviation of ABC, MABC, DSA, BSA, BA, and COBSA on the benchmark functions f1–f18).


Table 4: Statistical test based on the Wilcoxon Signed-Rank Test (α = 0.05).

Algorithm versus COBSA   p value     R+   R−    Winner
ABC versus COBSA         2.50e−03    16   155   COBSA
MABC versus COBSA        2.10e−03    15   156   COBSA
DSA versus COBSA         1.96e−04    0    171   COBSA
BSA versus COBSA         1.96e−04    0    171   COBSA
BA versus COBSA          1.96e−04    0    171   COBSA
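The pairwise comparison reported in Table 4 can be reproduced with SciPy's Wilcoxon signed-rank test. The sketch below uses placeholder per-function errors (not the paper's data) and computes R+ and R− following the convention of Derrac et al. [24]: R+ sums the ranks of the functions on which the first algorithm outperforms COBSA, and R− those on which COBSA is better.

```python
import numpy as np
from scipy.stats import wilcoxon

# Placeholder per-function mean errors on 18 benchmarks (illustrative only).
rng = np.random.default_rng(0)
errors_a = rng.lognormal(mean=-5, sigma=2, size=18)        # a comparison algorithm
errors_cobsa = errors_a * rng.uniform(0.01, 0.9, size=18)  # stand-in for COBSA, usually smaller

stat, p_value = wilcoxon(errors_a, errors_cobsa)

# Rank sums of the signed differences: R+ where algorithm A has the smaller
# error, R- where COBSA has the smaller error.
diff = errors_a - errors_cobsa
ranks = np.argsort(np.argsort(np.abs(diff))) + 1   # ranks 1..n of |diff| (no ties here)
r_plus = ranks[diff < 0].sum()
r_minus = ranks[diff > 0].sum()

print(f"p = {p_value:.3g}, R+ = {r_plus}, R- = {r_minus}")
if p_value < 0.05:
    print("Winner:", "COBSA" if r_minus > r_plus else "algorithm A")
else:
    print("No significant difference at alpha = 0.05")
```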

Figure 3: The influence of q (four panels: CBSA fitness on a log10 scale versus q from 0 to 0.9 for Sumsquare f2, Schwefel2.22 f4, Alpine f6, and Ackley f10).
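The q-sweep behind Figure 3 can be reproduced with a loop of the following shape (a sketch only: cbsa_best_fitness is a stand-in for the CBSA run described earlier in the paper, and the Sumsquare definition below is the usual benchmark form assumed for f2).

```python
import numpy as np
import matplotlib.pyplot as plt

def sumsquare(pop):
    """Usual Sumsquare benchmark: f(x) = sum_i i * x_i^2 (assumed form of f2)."""
    i = np.arange(1, pop.shape[1] + 1)
    return (i * pop**2).sum(axis=1)

def cbsa_best_fitness(objective, dim, q, max_iter=1000):
    """Stand-in for a CBSA run with variance q; replace with the real optimizer."""
    rng = np.random.default_rng()
    pop = rng.uniform(-10, 10, size=(30, dim))
    # ... the evolutionary loop of CBSA would go here ...
    return objective(pop).min()

qs = np.arange(0.0, 1.0, 0.1)
best = [cbsa_best_fitness(sumsquare, dim=30, q=q) for q in qs]
plt.plot(qs, np.log10(best), marker="o")
plt.xlabel("q")
plt.ylabel("fitness (log10)")
plt.title("Sumsquare f2")
plt.show()
```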


Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

The work is supported by the National Natural Science Foundation of China (no. 61401307), the China Postdoctoral Science Foundation (no. 2014M561184), the Tianjin Research Program of Application Foundation and Advanced Technology of China (no. 15JCYBJC17100), and the Tianjin Research Program of Science and Technology Commissioner of China (16JCTPJC48400).

References

[1] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, pp. 1942–1948, Perth, Australia, December 1995.

[2] M. Dorigo and T. Stutzle, Ant Colony Optimization, MIT Press, Cambridge, MA, USA, 2004.

[3] J. H. Holland, Adaptation in Natural and Artificial Systems, MIT Press, Cambridge, MA, USA, 1992.

[4] J. J. Liang, A. K. Qin, P. N. Suganthan, and S. Baskar, "Comprehensive learning particle swarm optimizer for global optimization of multimodal functions," IEEE Transactions on Evolutionary Computation, vol. 10, no. 3, pp. 281–295, 2006.

[5] K. V. Price, R. M. Storn, and J. A. Lampinen, Differential Evolution: A Practical Approach to Global Optimization, Springer, Berlin, Germany, 2005.

[6] J. Zhang and A. C. Sanderson, "JADE: adaptive differential evolution with optional external archive," IEEE Transactions on Evolutionary Computation, vol. 13, no. 5, pp. 945–958, 2009.

[7] P. Civicioglu, "Transforming geocentric cartesian coordinates to geodetic coordinates by using differential search algorithm," Computers and Geosciences, vol. 46, pp. 229–247, 2012.

[8] D. Karaboga and B. Akay, "A comparative study of artificial bee colony algorithm," Applied Mathematics and Computation, vol. 214, no. 1, pp. 108–132, 2009.

[9] X. S. Yang and S. Deb, "Cuckoo search via Lévy flights," in Proceedings of the World Congress on Nature and Biologically Inspired Computing (NaBIC '09), pp. 210–214, Coimbatore, India, December 2009.

[10] X. S. Yang, "A new metaheuristic bat-inspired algorithm," in Nature Inspired Cooperative Strategies for Optimization (NICSO '10), J. R. Gonzalez, D. A. Pelta, C. Cruz, G. Terrazas, and N. Krasnogor, Eds., vol. 284 of Studies in Computational Intelligence, pp. 65–74, Springer, Berlin, Germany, 2010.

[11] L. Chen and L. Y. Zhang, "Blind signal separation algorithm based on temporal predictability and differential search algorithm," Journal on Communications, vol. 35, no. 6, pp. 117–125, 2014.

[12] Z. C. Jia, Y. Y. Xue, L. Chen, Y. J. Guo, and H. D. Xu, "Blind separation algorithm for hyperspectral image based on the denoising reduction and the bat optimization," Acta Photonica Sinica, vol. 45, no. 5, Article ID 0511001, 2016.

[13] J. H. Liang and M. Ma, "Artificial bee colony algorithm based research on image segmentation," Computer Engineering and Application, vol. 48, no. 8, pp. 194–197, 2012.

[14] P. Civicioglu, "Backtracking search optimization algorithm for numerical optimization problems," Applied Mathematics and Computation, vol. 219, no. 15, pp. 8121–8144, 2013.

[15] Z. L. Zhang and S. Y. Liu, "Decision-theoretic rough set attribute reduction based on backtracking search algorithm," Computer Engineering and Applications, vol. 52, no. 10, pp. 71–74, 2016.

[16] X. Chen, S. Y. Liu, and Y. Wang, "Emergency resources scheduling based on improved backtracking search optimization algorithm," Computer Applications and Software, vol. 32, no. 12, pp. 235–238, 2015.

[17] M. D. Li and H. Zhao, "Backtracking search optimization algorithm with comprehensive learning strategy," System Engineering and Electronics, vol. 37, no. 4, pp. 958–963, 2015.

[18] X. J. Wang, S. Y. Liu, and W. K. Tian, "Improved backtracking search optimization algorithm with new effective mutation scale factor and greedy crossover strategy," Journal of Computer Applications, vol. 34, no. 9, pp. 2543–2546, 2014.

[19] W. K. Tian, S. Y. Liu, and X. J. Wang, "Study and improvement of backtracking search optimization algorithm based on differential evolution," Application Research of Computers, vol. 32, no. 6, pp. 1653–1662, 2015.

[20] Y. Shi and R. Eberhart, "A modified particle swarm optimizer," in Proceedings of the IEEE International Conference on Evolutionary Computation and IEEE World Congress on Computational Intelligence (Cat. No. 98TH8360), pp. 69–73, Anchorage, Alaska, USA, May 1998.

[21] W. F. Gao and S. Y. Liu, "A modified artificial bee colony algorithm," Computers & Operations Research, vol. 39, no. 3, pp. 687–697, 2012.

[22] Y. W. Shang and Y. H. Qiu, "A note on the extended Rosenbrock function," Evolutionary Computation, vol. 14, no. 1, pp. 119–126, 2006.

[23] W. Hu and Z. S. Li, "A simpler and more effective particle swarm optimization algorithm," Journal of Software, vol. 18, no. 4, pp. 861–868, 2007.

[24] J. Derrac, S. García, D. Molina, and F. Herrera, "A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms," Swarm and Evolutionary Computation, vol. 1, no. 1, pp. 3–18, 2011.

[25] S. García, D. Molina, M. Lozano, and F. Herrera, "A study on the use of non-parametric tests for analyzing the evolutionary algorithms' behaviour: a case study on the CEC'2005 Special Session on Real Parameter Optimization," Journal of Heuristics, vol. 15, no. 6, pp. 617–644, 2009.
