research article multiobjective particle swarm

18
Research Article Multiobjective Particle Swarm Optimization Based on PAM and Uniform Design Xiaoshu Zhu, Jie Zhang, and Junhong Feng School of Computer Science and Engineering, Guangxi Universities Key Lab of Complex System Optimization and Big Data Processing, Yulin Normal University, Yulin 537000, China Correspondence should be addressed to Jie Zhang; [email protected] and Junhong Feng; [email protected] Received 26 November 2014; Accepted 3 March 2015 Academic Editor: Pandian Vasant Copyright © 2015 Xiaoshu Zhu et al. is is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. In MOPSO (multiobjective particle swarm optimization), to maintain or increase the diversity of the swarm and help an algorithm to jump out of the local optimal solution, PAM (Partitioning Around Medoid) clustering algorithm and uniform design are respectively introduced to maintain the diversity of Pareto optimal solutions and the uniformity of the selected Pareto optimal solutions. In this paper, a novel algorithm, the multiobjective particle swarm optimization based on PAM and uniform design, is proposed. e differences between the proposed algorithm and the others lie in that PAM and uniform design are firstly introduced to MOPSO. e experimental results performing on several test problems illustrate that the proposed algorithm is efficient. 1. Introduction Many real-world optimization problems oſten need to simul- taneously optimize multiple objectives that are incommen- surable and generally conflicting with each other. ey can usually be written as min ∈Ω { 1 () , 2 () , . . . , ()} , (1) where = ( 1 , 2 ,..., ) is a variable vector in a real and -dimensional space, Ω is the feasible solution space, and is the number of the objective functions. Since the pio- neering attempt of Schaffer [1] to solve multiobjective opti- mization problems, many kinds of multiobjective evolution- ary algorithms (MOEAs), ranging from traditional evolution- ary algorithms to newly developed techniques, have been proposed and widely used in different applications [24]. Multiobjective evolutionary algorithms, MOEAs, have become well-known methods for solving the multiobjective optimization problems that are too complex to be solved by exact methods. e main challenge for MOEAs is to be satisfied with three goals at the same time: (1) the Pareto opti- mal solutions are as near to true Pareto front, which means the convergence of MOEAs, (2) the nondominated solutions are evenly scattered along the Pareto front, which means the diversity of MOEAs, and (3) MOEAs obtain Pareto opti- mal solutions in limited evolution times [5]. e particle swarm optimization algorithm, PSO, and MOEAs are both intelligent optimization algorithms. It was proposed by Eberhart and Kennedy in 1995 [6, 7]. It originates from sharing and exchanging of information in the process of searching food among the bird’s individuals. Each individual can benefit from the discovery and flight experience of the others. PSO seems particularly suitable for multiobjective optimization mainly because of the high speed of conver- gence [8, 9]. PAM is one of -medoids clustering algorithms based on partitioning methods. It attempts to divide data objects into partitions. Namely, it can divide a swarm into different subswarms with different features. is paper proposed a novel multiobjective particle swarm optimization based on PAM and uniform design, abbreviated as UKMOPSO. It first uses PAM to partition the data points into several clusters, and then the smallest cluster is implemented crossover operator based on the uniform design to generate some new data points. When the size of the Pareto solution is larger than the size of the external archive, PAM is used to determine which Pareto solution is to be Hindawi Publishing Corporation Mathematical Problems in Engineering Volume 2015, Article ID 126404, 17 pages http://dx.doi.org/10.1155/2015/126404

Upload: others

Post on 16-Jun-2022

1 views

Category:

Documents


0 download

TRANSCRIPT

Page 1: Research Article Multiobjective Particle Swarm

Research ArticleMultiobjective Particle Swarm Optimization Based onPAM and Uniform Design

Xiaoshu Zhu Jie Zhang and Junhong Feng

School of Computer Science and Engineering Guangxi Universities Key Lab of Complex SystemOptimization and Big Data ProcessingYulin Normal University Yulin 537000 China

Correspondence should be addressed to Jie Zhang jgxyzjzj126com and Junhong Feng jgxyfjh126com

Received 26 November 2014 Accepted 3 March 2015

Academic Editor Pandian Vasant

Copyright copy 2015 Xiaoshu Zhu et al This is an open access article distributed under the Creative Commons Attribution Licensewhich permits unrestricted use distribution and reproduction in any medium provided the original work is properly cited

In MOPSO (multiobjective particle swarm optimization) to maintain or increase the diversity of the swarm and help an algorithmto jump out of the local optimal solution PAM (Partitioning Around Medoid) clustering algorithm and uniform design arerespectively introduced to maintain the diversity of Pareto optimal solutions and the uniformity of the selected Pareto optimalsolutions In this paper a novel algorithm the multiobjective particle swarm optimization based on PAM and uniform design isproposedThe differences between the proposed algorithm and the others lie in that PAM and uniform design are firstly introducedto MOPSO The experimental results performing on several test problems illustrate that the proposed algorithm is efficient

1 Introduction

Many real-world optimization problems often need to simul-taneously optimize multiple objectives that are incommen-surable and generally conflicting with each other They canusually be written as

min119883isinΩ

1198911 (119883) 1198912 (119883) 119891119872 (119883) (1)

where 119883 = (1199091 1199092 119909

119873) is a variable vector in a real and

119873-dimensional space Ω is the feasible solution space and119872 is the number of the objective functions Since the pio-neering attempt of Schaffer [1] to solve multiobjective opti-mization problems many kinds of multiobjective evolution-ary algorithms (MOEAs) ranging from traditional evolution-ary algorithms to newly developed techniques have beenproposed and widely used in different applications [2ndash4]

Multiobjective evolutionary algorithms MOEAs havebecome well-known methods for solving the multiobjectiveoptimization problems that are too complex to be solvedby exact methods The main challenge for MOEAs is to besatisfied with three goals at the same time (1) the Pareto opti-mal solutions are as near to true Pareto front which meansthe convergence of MOEAs (2) the nondominated solutions

are evenly scattered along the Pareto front which meansthe diversity of MOEAs and (3) MOEAs obtain Pareto opti-mal solutions in limited evolution times [5]

The particle swarm optimization algorithm PSO andMOEAs are both intelligent optimization algorithms It wasproposed by Eberhart andKennedy in 1995 [6 7] It originatesfrom sharing and exchanging of information in the process ofsearching food among the birdrsquos individuals Each individualcan benefit from the discovery and flight experience of theothers PSO seems particularly suitable for multiobjectiveoptimization mainly because of the high speed of conver-gence [8 9]

PAM is one of 119896-medoids clustering algorithms based onpartitioning methods It attempts to divide data objects into119896 partitions Namely it can divide a swarm into 119896 differentsubswarms with different features

This paper proposed a novel multiobjective particleswarm optimization based on PAM and uniform designabbreviated as UKMOPSO It first uses PAM to partition thedata points into several clusters and then the smallest clusteris implemented crossover operator based on the uniformdesign to generate some new data pointsWhen the size of thePareto solution is larger than the size of the external archivePAM is used to determine which Pareto solution is to be

Hindawi Publishing CorporationMathematical Problems in EngineeringVolume 2015 Article ID 126404 17 pageshttpdxdoiorg1011552015126404

2 Mathematical Problems in Engineering

removed or appended The results of the experimental sim-ulation implemented on several well-known test problemsindicate that the proposed algorithm is efficient

The rest of this paper is organized as follows Section 2states the preliminaries of the proposed method Section 3presents our method in detail Section 4 gives the numericalresults of the proposed method The conclusion of the workis made in Section 5

2 Preliminaries

In this section we describe some concepts concerningparticle swarm optimization multiobjective particle swarmoptimization PAM and uniform design

21 Particle Swarm Optimization Algorithms In 119889-dimen-sional search space the position and velocity of the 119894th parti-cle are respectively represented as 119883

119894= [1199091198941 1199091198942 119909

119894119889]

and 119881119894= [V1198941 V1198942 V

119894119889] The optimal positions of the

119894th particle and the whole swarm namely the individualoptimal and the global optimal are denoted as 119875119887119890119904119905 =[1199011198941 1199011198942 119901

119894119889] and 119866119887119890119904119905 = [119901

1198921 1199011198922 119901

119892119889] respec-

tively The individuals or particles in the swarm update theirvelocities and positions according to the following formulas

V119894119895 (119896 + 1) = 119908 sdot V119894119895 (119896) + 1198881 sdot 1199031 sdot (119875119887119890119904119905119894119895 (119896) minus 119909119894119895 (119896))

+ 1198882times 1199032times (119866119887119890119904119905

119895 (119896) minus 119909119894119895 (119896))

(2)

119909119894119895 (119896 + 1) = 119909119894119895 (119896) + V

119894119895 (119896 + 1) (3)

where the inertia weight coefficient 119908 indicates the ability tomaintain the previous speed the acceleration coefficients 119888

1

and 1198882are used to coordinate the degrees of tracking individ-

ual optimal and global optimal and 1199031and 1199032are two ran-

dom numbers drawn from the uniform distribution on theinterval (0 1)

The update equation of the velocity consists of the previ-ous velocity component a cognitive component and a socialcomponent They are mainly controlled by three parametersthe inertia weight and two acceleration coefficients

From the theoretical analysis for the trajectory of particlesin PSO [10] the trajectory of a particle 119883

119894converges to a

weightedmean of119875119894and119875119892Whenever the particle converges

it will ldquoflyrdquo to the individual best position and the global bestposition According to the update equation the individualbest position of the particle will gradually move closer to theglobal best position Therefore all the particles will convergeonto the global best particlersquos position

22 Multiobjective Particle Swarm Optimization MOPSO isproposed by Coello et al it adopts swarm intelligence tooptimize MOPs and it uses the Pareto optimal set to guidethe particlersquos flight [9]

Particle swarm optimization has been proposed forsolving a large number of single objective problems Manyresearchers are interested in solving multiobjective problems(MOP) using PSO To modify a single objective PSO to

MOPSO a guide must be redefined in order to obtain a setof nondominated solutions (Pareto front) In MOPSO thePareto optimal solutions should be used to determine theguide for each particle How to select suitable local guidesfor attaining both convergence and diversity of solutionsbecomes an important issue

There have been some publications to use PSO to solveMOP A dynamic neighborhood PSO was proposed [11]which optimizes only one objective at a time and uses ascheme similar to lexicographic ordering In addition thisapproach also proposes an unconstrained elite archive nameddominated tree to store the nondominated solutions How-ever it is a difficult issue for this approach to pick up a bestlocal guide from the set of Pareto optimal solutions for eachparticle of the population A strategy for finding suitablelocal guides for each particle was proposed and named Sigmamethod The local guide is explicitly assigned to specifyparticles according to the Sigma value [12] This results inthe desired diversity and convergence but it is still not closeenough to the Pareto front On the other hand an enhancedarchiving technique to maintain the best (nondominated)solutions found during the course of a MO algorithm wasproposed [13] It shows that using archives in PSO for MOproblems will improve their performance directly A paral-lel vector evaluated particle swarm optimization (VEPSO)method for multiobjective problems was proposed [14]which adopted a ring migration topology and PVE system tosimultaneously work 2ndash10 CPUs to find nondominated solu-tions In [9] MOPSO method was proposed It incorporatesPareto dominance and a special mutation operator to solveMO problems [15]

Recently a hybrid multiobjective algorithm combiningboth genetic algorithm (GA) and particle swarm optimiza-tion (PSO) was proposed [16] A multiobjective particleswarm optimization based on self-update and grid strategywas proposed for improving the function of Pareto set [17]A new dynamic self-adaptive multiobjective particle swarmoptimization (DSAMOPSO) method is proposed to solvebinary-state multiobjective reliability redundancy allocationproblems [18] which used a modified nondominated sortinggenetic algorithm (NSGA-II)method and a customized time-variant multiobjective particle swarm optimization methodto generate nondominated solutions

The MOPSO method is becoming more popular due toits simplicity to be implemented and its ability to quicklyconverge to a reasonably acceptable solution for problems inscience and engineering

23 PAM Algorithm There are many clustering methodsavailable in data mining Typical clustering analysis methodsare clustering based on partition hierarchical clusteringclustering based on density clustering based on grid andclustering based on model

The most frequently used clustering methods based onpartition are 119896-means and 119896-medoids In contrast to the 119896-means algorithm 119896-medoids chooses data points as cen-troids which make 119896-medoids method more robust than 119896-means in the presence of noise and outliers The reason isthat 119896-medoids method is less influenced by outliers or other

Mathematical Problems in Engineering 3

Algorithm PAM a 119896-medoids algorithm for partitioning based on medoid or central objectsInput119896 the number of clusters119863 a data set containing 119899 objectsOutput A set of 119896 clustersMethod(1) Arbitrarily choose 119896 objects in119863 as the initial representative objects or seeds(2) Repeat(3) Assign each remaining object to the cluster with the nearest representative object(4) Randomly select a nonrepresentative object 119900 random(5) Compute the total cost 119878 of swapping representative object 119900

119895 with 119900 random

(6) if 119878 lt 0 then swap 119900119895with 119900 random to form the new set of 119896 representative objects

(7) Until no change

Algorithm 1 PAM algorithm

extreme values than 119896-means PAM (Partitioning AroundMedoids) is the first and the most frequently used 119896-medoidsalgorithms It is shown in Algorithm 1

PAM constructs 119896 partitions (clusters) of the givendataset where each partition represents a cluster Each clustermay be represented by a centroid or a cluster representativewhich is some sort of summary description of all the objectscontained in a cluster It needs to determine 119896 partitions for 119899objectsThe process of PAM is by and large as follows Firstlyrandomly select 119896 representative objects and cluster otherobjects to the same group as the representative object accord-ing to the minimum distances between the representativeobject and other objects Then try to replace these represen-tative objects with other nonrepresentative objects in orderto minimize squared error All the possible pairs of objectsare analyzed where one object in each pair is considered arepresentative object and the other is not The total cost ofthe clustering is calculated for each such combination Anobject will be replaced with such an object having mini-mized squared error The set of best objects in each clusterafter iteration forms the representative objects for the nextiteration The final set of representative objects is the respec-tive centroids of the clusters [19ndash22]

24 Uniform Design

241 Uniform Design Themain objective of uniform designis to sample a small set of points from a given set of pointssuch that the sampled points are uniformly scattered [23]

Let there be 119899 factors and 119902 levels per factor When 119899 and119902 are given the uniform design selects 119902 from 119902119899 possiblecombinations such that these 119902 combinations are uniformlyscattered over the space of all possible combinations Theselected 119902 combinations are expressed as a uniform array119880(119899 119902) = [119880

119894119895]119902 times 119899 where 119880

119894119895is the level of the 119895th factor

in the 119894th combination and can be calculated by the followingformula

119880119894119895= (119894120590119895minus1 mod 119902) + 1 (4)

where 120590 is a parameter given in Table 1

242 Improved Generation of Initial Population An algo-rithm for dividing the solution space and an algorithm forgenerating the initial population have been designed [23]However Algorithm 2 in [23] considers only the dividingof the solution space not the dividing of the 119873-dimensionspace This will bring about some serious problems If weassume that in Step 2 119876

0= 5 and 119873 = 10 then 119880(119873119876

0)

is impossible to be generated because 1198760must be larger than

119873 Namely Algorithm 2 in [23] is only suitable for the lowdimensional problem In order to overcome the shortcom-ings the dividing of the119873-dimension space is introduced toimprove the algorithm The improved algorithm can be suit-able for not only low dimensional but also high dimensionalproblem The improved algorithm is shown as follows

Algorithm A (improved generation of initial population)

Step 1 Judge whether1198760is valid or not as it must be found in

the 1st column of Table 1 If it is not valid it stops and showserror messages otherwise it continues

Step 2 Execute Algorithm 1 in [23] to divide [119897 119906] into 119878subspaces [119897(1) 119906(1)] [119897(2) 119906(2)] sdot sdot sdot [119897(119878) 119906(119878)]

Step 31 Judge whether1198760is more than119873 if yes turn to Step

32 otherwise turn to Step 33

Step 32 Quantize each subspace and apply the uniform array119880(119873119876

0) to sample 119876

0points

Step 33 Divide119873-dimension space into lfloor1198731198730rfloor parts where

1198730is an integer more than 1 and less than119876

0and is generally

taken as1198760minus1 Among lfloor119873119873

0rfloorparts the 1st part corresponds

to the dimension from 1 to1198730 and the 2nd part corresponds

to the dimension from 1198730+ 1 to 2 lowast 119873

0 and so forth If

the remainder 119877 of 1198731198730is not equal to 0 then a plus part

corresponds to the dimension from lfloor1198731198730rfloor lowast 119873

0+ 1 to 119873

whose length is surely less than1198730 Repeat to execute Step 34

for each part

Step 34 Quantize each subspace and then apply uniformarray 119880(119873

0 1198760) to sample 119876

0points It is noteworthy that

4 Mathematical Problems in Engineering

Table 1 Values of the parameter 120590 for different number of factorsand different number of levels per factor

Number of levelsper factors Number of factors 120590

5 2sim4 27 2sim6 311 2sim10 7

132 53 4

4sim12 617 2sim16 10

19 2sim3 84sim18 14

232 13sim14 20sim22 7

8sim12 153sim7 15sim19 17

29

2 123 9

4sim7 168sim12 16sim24 8

13sim15 1425sim28 18

31 2 5sim12 20sim30 123sim4 13sim19 22

119880(1198730 1198760) is replaced with 119880(119877119876

0) in the plus part where

the remainder 119877 is equal to119873 minus lfloor1198731198730rfloor lowast 1198730

243 Crossover Operator Based on the Uniform Design Thecrossover operator based on the uniform design acts ontwo parents It quantizes the solution space defined by theseparents into a finite number of points and then it applies theuniformdesign to select a small sample of uniformly scatteredpoints as the potential offspring

For any two parents 1199011and 119901

2 their minimal values

and the maximal values of each dimension are used toform the novel solution space [119897parent 119906parent] Each domain of[119897parent 119906parent] is quantized into 119876

1levels where 119876

1is a pre-

defined prime numberThen the uniform design is applied toselect a sample of points as the potential offspringThe detailsof the algorithm can be referred to [23]

3 The Proposed Algorithm

31 Elitist Selection or Elitism [4] Elitism means that eliteindividuals cannot be excluded from the mating pool of thepopulation A strategy presented can always include the bestindividual of the current population in the next generation inorder to prevent the loss of good solutions found so far Thisstrategy can be extended to copy the best 119899 individuals to thenext generation This is explanation of the elitism In MOPelitism plays an important role

Two strategies are often used to implement elitism Onemaintains elitist solutions in the population the other stores

population paretocreate (pop)Input pop indicates the populationOutput pareto indicates the non-dominated solutionsparetolarr popfor forall chr1 isin popfor forall chr2 isin pareto and chr2 = chr1if chr1 dominate chr2

paretolarr pareto minus chr2end if

end forend forreturn pareto

Algorithm 2 Pseudocode of selecting elitist

elitist solutions into an external secondary list or externalarchive and reintroduces them to the populationThe formercopies all nondominated solutions in the current populationto the next population and then fills the rest of the next popu-lation by selecting from the remaining dominated solutions inthe current populationThe latter uses an external secondarylist or external archive to store the elitist solutions Externallist stores the nondominated solutions found It will beupdated in the next generation by means of removing elitistsolutions dominated by a new solution or adding the newsolution if it is not dominated by any existing elitist solution

The work adopts the second strategy namely storing eli-tist solutions to an external secondary list Its advantage is thatit can preserve and dynamically adjust all the nondominatedsolutions set till the current generation The pseudocodes ofselecting elitist and updating elitist are respectively shown inAlgorithms 2 and 3

32 Selection Mechanism for the Swarm Based on UniformDesign This paper adopts the uniform design to select thebest 119879

1points (119879

1lt 119879) from the 119879 points and acquires their

objectives fit119881 The detailed steps are as follows

Algorithm B

Step 1 Calculate each of119872 objectives for each of the119879 pointsnormalize each of objectives 119891

119894(119909) as follows

ℎ119894119909 =

119891119894 (119909)

max119910isin120595

1003816100381610038161003816119891119894 (119910)1003816100381610038161003816 (5)

where 120595 is a set of points in the current population and ℎ119894119909 is

the normalized objective

Step 2 Apply the uniform design to generate the 1198630weight

vectors 1199081 1199082 119908

1198630 each of them is used to compose

one fitness function by the following formula where 1198630is a

design parameter and it is prime

fit119894= 1199081198941ℎ1 (119909) + 1199081198942ℎ2 (119909) + sdot sdot sdot + 1199081198941198630ℎ1198630 (119909)

1 le 119894 le 1198630

(6)

Mathematical Problems in Engineering 5

population paretoupdate(offspring pareto)Input offspring indicates the offsprings after performing the crossover operator

and mutation operator pareto indicates the non-dominated solutionsOutput pareto indicates the non-dominated solutionsoffspringlarr call paretocreate(offspring)for forall chr1 isin offspring

nondominatedlarr truefor forall chr2 isin paretoif chr1 dominate chr2paretolarr pareto minus chr2

else if chr2 dominate chr1nondominatedlarr false

elsecontinue

end ifend forif nondominatedparetolarr pareto cup chr1

end ifend forreturn pareto

Algorithm 3 Pseudocode of updating elitist

Step 3 Based on each of fitness functions evaluate the qualityof the 119879 points Assume the remainder 119879

11198630is 1198770 For the

first 1198770fitness functions select the best lceil119879

11198630rceil points for

the rest of fitness functions select the best lfloor11987911198630rfloor points

Overall a total of 1198791points are selected The objectives of

these selected points are correspondingly stored into fit119881

33 Selection Mechanism for Gbest Based on PAM InMOPSO 119866119887119890119904119905 plays an important role in guiding the entireswarm toward the global Pareto front [24] In contrast tothe single objective PSO having only one global best 119866119887119890119904119905MOPSO has multiple Pareto optimality solutions which arenondominated each other How to select a suitable 119866119887119890119904119905 foreach of particles fromPareto optimal solutions is one very keyissue

The paper presents the selection mechanism for 119866119887119890119904119905based on PAM as follows

Algorithm C Assume the population and the numbers ofparticles are denoted as pop and119873pop and the Pareto opti-mality and the numbers of Pareto optimality are denoted aspareto and119873119875

Step 1 Acquire the number of cluster according to the fol-lowing formula

119870 = 119870min + (119870max minus 119870min) sdot (119905

max119866) (7)

where 119870min and 119870max respectively denote the minimal andmaximal value of 119870 namely 119870 isin [119870min 119870max] 119905 and max119866indicate the 119905th iteration and the maximal iteration numberThe formula can acquire the linearly increasing number ofcluster so as to accord with the process from the coarse searchto the elaborate search

Step 2 If 119873119875 le 119870 then for each particle in pop find thenearest Pareto optimal solution as its 119866119887119890119904119905 Otherwise turnto Step 3

Step 3 Perform PAM described in Section 23 to partitionPareto and pop into 119870 clusters respectively the clustercentroids of which are denoted as119862

1= 11988811 11988812 119888

1119870 and

1198622= 11988821 11988822 119888

2119870

Step 4 For each 1198882119895isin 1198622 find the nearest 119888

1119894isin 1198621 For

all the particles in the cluster represented by 1198882119895 their 119866119887119890119904119905

randomly takes one of Pareto optimal solutions in the clusterrepresented by 119888

1119894

34 Adjustment of Pareto Optimal Solutions Based on theUniform Design and PAM The number of Pareto optimalsolutions in the external archivewill increasingly enlargewithevolution namely the size of the external archive will becomevery large with the iteration number This will increasethe computation complexity and the execution time of analgorithmTherefore the size of the external archive must becontrolled In the meanwhile Pareto optimal solutions in theexternal archive are possibly close to each other in the objec-tive space If the solutions are close to each other they arenearly the same choice It is desirable to find the Pareto opti-mal solutions scattered uniformly over the Pareto front sothat the decision maker can have a variety of choices There-fore how to control the size of the external archive and selectrepresentative Pareto optimal solutions scattered uniformlyover the Pareto front is a key issue

The paper presents the adjustment algorithm of Paretooptimal solutions based on the uniform design and PAMThe algorithm firstly implements PAM to partition thePareto front into 119870 clusters in the objective space and then

6 Mathematical Problems in Engineering

implements the uniform crossover operator on the minimalcluster so as to generate more Pareto optimality in theminimal cluster finally it keeps all the points in the lesserclusters and discards some points in the larger clusters Thedetailed steps of the algorithm are shown as follows

Algorithm DAssume the Pareto optimality their values of119872functions and the numbers of Pareto optimality are denotedas pareto 119865pareto and119873119875 The size of the external archive isassumed as119873119875 lim The number of clusters is 119870

Step 1 If the numbers of data points in the maximal andthe minimal clusters have too many differences or are bothvery small select all the data points from the minimal clusterand the same number of points from the cluster which isthe nearest to the minimal cluster For each pair of datapoints perform the uniform crossover operator described inSection 243 Update pareto 119865pareto and 119873119875 according toAlgorithm 3

Step 2 If 119873119875 gt 119873119875 lim implement PAM to partition the119865pareto into119870 clusters in the objective space and let119873 div =119873119875 lim 119870 it indicates the average points to select from eachof119870 clustersWe can classify three situations according to thenumber of points in each cluster and119873 div as follows

(i) Situation 1 the numbers of points in all clusters arelarger than or equal to119873 div

(ii) Situation 2 the number of data points is larger thanor equal to119873 div only in one cluster

(iii) Situation 3 the number of the clusters in which thenumber of data points is larger than 119873 div is largerthan 1 and less than119870

Step 3 For Situation 1 sort all clusters according to their con-taining points in ascending order Select 119873 div points fromthe previous 119870 minus 1 clusters and select the remainder points119873119875 limminus119873 div lowast(119870 minus 1) from the last cluster Turn to Step7

Step 4 For Situation 2 keep the points of all the clusters inwhich the number of the points is less than or equal to119873 divand the remainder points are selected from that only clusterTurn to Step 7

Step 5 For Situation 3 keep all the points of all the clustersin which the number of points is less than or equal to119873 divTurn to Step 6 to select the remainder point

Step 6 Recalculate 119873 div and let 119873 div = rem1198751198701015840 whererem119875 and 1198701015840 indicate the number of the remainder pointsand clusters According to the new 119873 div turn to Step 3 orStep 4

Step 7 Save the selected 119873119875 lim points and terminate thealgorithm

35 Thoughts on the Proposed Algorithm In MOP for 119872objectives 119891

1 1198912 119891

119872and any two points 119909 and 119910 in the

feasible solution space Ω if each objective is satisfied with119891119894(119909) le 119891

119894(119910) and at least one objective is satisfied with

119891119894(119909) lt 119891

119894(119910) namely 119909 is at least as good as 119910with respect to

all the objectives and 119909 is strictly better than 119910 with respectto at least one objective then we say that 119909 dominates 119910 Ifno other solution is strictly better than 119909 then 119909 is called anondominated solution or Pareto optimal solution

In single objective problems there is only one objective tobe optimized Therefore the global best (119866119887119890119904119905) of the wholeswarm is only one In multiobjective problems there aremultiple consistent or conflicting objectives to be optimizedTherefore there exist very large or infinite numbers of solu-tions which cannot dominate each other These solutions arenondominated solutions or Pareto optimal solutions There-fore 119866119887119890119904119905 of the whole swarm is more than one How toselect suitable 119866119887119890119904119905 is a very key issue

When it is not possible to find all these solutions it maybe desirable to find as many solutions as possible in order toprovide more choices to the decision maker However if thesolutions are close to each other they are nearly of the samechoice It is desirable to find the Pareto optimal solutionsscattered uniformly over the Pareto front so that the decisionmaker can have a variety of choices The paper introducesuniform design to ensure that the acquired solutions scatteruniformly over the Pareto front in the objective space

For the MOPSO the diversity of the population is a veryimportant factor It has a key impact on the convergence ofan algorithm and uniformly distribution of Pareto optimalsolution It can effectively get rid of the premature of thealgorithmsThe paper introduces PAM clustering algorithmsto maintain the diversity of the population The particles inthe same cluster have similar features whereas ones in differ-ent clusters have dissimilar features Thus choosing particlesfrom different clusters may necessarily increase the diversityof the population

36 Steps of the Proposed Algorithm

Step 1 According to the population size 119873pop determinethe number of subintervals 119878 and the population size 119876

0in

each subinterval such that 1198760lowast 119878 is more than 119873pop 119876

0is

a prime and must exist in Table 1 Execute Algorithm A inSection 242 to generate a temporary population containing1198760lowast 119878 ge 119873pop points

Step 2 PerformAlgorithm B described in Section 32 to selectthe best 119873pop points from the 119876

0lowast 119878 points as the initial

population pop and acquire their objectives fit119881

Step 3 Initialize the speed and119875119887119890119904119905 of the particle 119894 by119881[119894] =pop[119894] and 119875119887119890119904119905[119894] = pop[119894] 119891119875119887119890119904119905[119894] = fit119881[119894] where 119894 =1 sdot sdot sdot 119873pop 119891119875119887119890119904119905[119894] denotes objectives of 119875119887119890119904119905[119894]

Step 4 According to Algorithm 2 select elitist or Paretooptimal solutions from pop and store them to the externalsecondary list pareto the size of which is assumed as119873119875

Step 5 Choose the suitable119866119887119890119904119905 for each of the particles frompareto in terms of Algorithm C described in Section 33

Mathematical Problems in Engineering 7

Step 6 Update the position pop and velocity 119881 for each ofthe particles respectively using formulas (2) and (3) whereformula (2) is modified as follows

V119894119895 (119896 + 1) = 119908 sdot V119894119895 (119896)

+ 1198881sdot 1199031sdot (119875119887119890119904119905

119894119895 (119896) minus 119909119894119895 (119896))

+ 1198882times 1199032times (119866119887119890119904119905

119894119895 (119896) minus 119909119894119895 (119896))

(8)

Step 7 For the 119895th dimensional variable of the 119894th particle ifit goes beyond its lower boundary or upper boundary then ittakes the 119895th dimensional value of its corresponding bound-ary and the 119895th dimensional value of its velocity takes theopposite value

Step 8 Calculate each of119872 objectives for each of the particlesand update them into fit119881

Step 9 Update 119875119887119890119904119905 of each particle as followsFor the 119894th particle if fit119881[119894] dominates 119891119875119887119890119904119905[119894] then

let 119891119875119887119890119904119905[119894] = fit119881[119894] and 119875119887119890119904119905[119894] = pop[119894] otherwise119875119887119890119904119905[119894] and119891119875119887119890119904119905[119894] are kept unchanged If neither of themis dominated by the other then randomly select one of themas 119891119875119887119890119904119905[119894] and update its corresponding 119875119887119890119904119905[119894]

Step 10 Update the external archive pareto storing Paretooptimality and its size119873119875 according to Algorithm 3

Step 11 Implement Algorithm D described in Section 34 toadjust the Pareto optimal solutions such that the number ofPareto optimal solutions is less than or equal to the size of theexternal archive119873119875 lim

Step 12 If the stop criterion is satisfied terminate algorithmotherwise turn to Step 5 and continue

4 Numerical Results

In order to evaluate the performances of the proposedalgorithm we compare it with two outstanding algorithmsUMOGA [23] and NSGA-II [25]

41 Test Problems Several well-known multiobjective testfunctions are used to test the performance of the proposedalgorithm

Biobjective problem FON [25 26] KUR [25 27]ZDT1 ZDT2 ZDT3 and ZDT6 [25 28ndash31]Three-objective problems DTLZ1 and DTLZ [2 32]

The definitions of them are as followsFON is defined as

1198911 (119909) = 1 minus exp(minus

3

sum119894=1

(119909119894minus1

radic3)2

)

1198912 (119909) = 1 minus exp(minus

3

sum119894=1

(119909119894+1

radic3)2

)

(9)

where 119909119894isin [minus4 4]

This test function has a nonconvex Pareto optimal frontKUR is defined as

1198911 (119909) =

119899minus1

sum119894=1

(minus10 exp (minus02 sdot radic1199092119894+ 1199092119894+1))

1198912 (119909) =

119899

sum119894=1

(1003816100381610038161003816119909119894100381610038161003816100381608+ 5 sdot sin1199093

119894)

(10)

where 119899 = 3 and 119909119894isin [minus5 5]

This test function has a nonconvex Pareto optimal frontin which there are three discontinuous regions

ZDT1 is defined as1198911(1199091) = 1199091

119892 (1199092 119909

119899) = 1 +

9sum119899119894=2119909119894

(119899 minus 1)

ℎ (1198911 119892) = 1 minus radic

1198911

119892

(11)

where 119899 = 30 and 119909119894isin [0 1]

This test function has a convex Pareto optimal frontZDT2 is defined as

1198911(1199091) = 1199091

119892 (1199092 119909

119899) = 1 +

9sum119899119894=2119909119894

(119899 minus 1)

ℎ (1198911 119892) = 1 minus (

1198911

119892)2

(12)

where 119899 = 30 and 119909119894isin [0 1]

This test function has a nonconvex Pareto optimal frontZDT3 is defined as

1198911(1199091) = 1199091

119892 (1199092 119909

119899) = 1 +

9sum119899119894=2119909119894

(119899 minus 1)

ℎ (1198911 119892) = 1 minus radic

1198911

119892minus (

1198911

119892) sin (10120587119891

1)

(13)

where 119899 = 30 and 119909119894isin [0 1]

This test function represents the discreteness feature ItsPareto optimal front consists of several noncontiguous con-vex parts The introduction of the sine function in ℎ causesdiscontinuity in the Pareto optimal front

ZDT6 is defined as

1198911(1199091) = 1 minus exp (minus4119909

1) sin6 (6120587119909

1)

119892 (1199092 119909

119899) = 1 + 9(

sum119899119894=2119909119894

(119899 minus 1))025

ℎ (1198911 119892) = 1 minus (

1198911

119892)2

(14)

where 119899 = 10 and 119909119894isin [0 1]

8 Mathematical Problems in Engineering

This test function has a nonconvex Pareto optimal frontIt includes two difficulties caused by the nonuniformityof the search space Firstly Pareto optimal solutions arenonuniformly distributed along the global Pareto optimalfront (the front is biased for solutions in which 119891

1(1199091) is near

one) Secondly the density of the solutions is the lowest nearthe Pareto optimal front and the highest away from the front

DTLZ1 is defined as

1198911 (119909) = 0511990911199092 (1 + 119892 (119909))

1198912 (119909) = 051199091 (1 minus 1199092) (1 + 119892 (119909))

1198913 (119909) = 05 (1 minus 1199091) (1 + 119892 (119909))

119892 (119909) = 100

sdot [5 +119899

sum119894=3

((119909119894minus 05)

2minus cos (20120587 (119909

119894minus 05)))]

(15)

where 119899 = 7 and 119909119894isin [0 1]

This test function has difficulty in the convergence to thePareto optimal hyperplane and contains (115minus1) local Paretooptimal fronts

DTLZ2 is defined as

1198911 (119909) = (1 + 119892 (119909)) cos (051205871199091) cos (051205871199092)

1198912 (119909) = (1 + 119892 (119909)) cos (051205871199091) sin (051205871199092)

1198913 (119909) = (1 + 119892 (119909)) sin (051205871199091)

119892 (119909) =119899

sum119894=3

(119909119894minus 05)

2

(16)

where 119899 = 12 and 119909119894isin [0 1]

The Pareto optimal solutions of this test function must lieinside the first octant of the unit sphere in a three-objectiveplot It is more difficult than DTLZ1

42 Parameter Values The parameter values of the proposedalgorithm UKMOPSO are adopted as follows

(i) Parameters for PSO the linearly descending inertiaweight [33 34] is within the interval [01 1] and theacceleration coefficients 119888

1and 1198882are both taken as 2

(ii) Parameter for PAM the minimal and maximal num-bers of clusters are 119870min = 2 and119870max = 10

(iii) Population size the population size119873pop is 200(iv) Parameters for the uniform design the number of

subintervals 119878 is 64 the number of the sample pointsor the population size of each subinterval 119876

0is 31 set

1198630= 31 and 119876

1= 5

(v) Stopping condition the algorithm terminates if thenumber of iterations is larger than the given maximalgenerations of 20

All the parameter values in UMOGA [23] are set equalto the original values in [23] the different and additional

parameter values between UMOGA and UKMOPSO are asfollows the number of subintervals 119878 is 16 119865 = 119873 (thenumber of variables)119863

1= 7 119901

119898= 002

NSGA-II in [25] adopted a population size of 100 acrossover probability of 08 a mutation probability of 1119899(where 119899 is the number of variables) and maximal genera-tions of 250 In order to make the comparisons fair we haveused population size of 100 and maximal generations of 40 inNSGA-II so that the total number of function evaluations inNSGA-II UKMOPSO and UMOGA is the same

43 Performance Metric Based on the assumption that thetrue Pareto front of a test problem is known many kinds ofperformance metrics have been proposed and used by manyresearchers such as [2 9 24 25 29 35ndash40] Three of themare adopted in the paper to compare the performance of theproposed algorithm with them The first metric is 119862 metric[29 38] It is taken as the quantitative metric of the solutionquality and often used to show that the outcomes of one algo-rithm dominating the outcomes of another algorithm It isdefined as

119862 (119860 119861) =|119887 isin 119861 exist119886 isin 119860 119886 ≺ 119887|

|119861| (17)

where 119886 ≺ 119887 represents 119887 being dominated by 119886 119860 representsthe Pareto front obtained by Algorithm 119860 and 119861 representsthe Pareto front obtained by Algorithm 119861 in a typical runFurthermore |119861| represents the number of elements of 119861

The metric value 119862(119860 119861) = 1 means that all points in 119861are dominated by or equal to points in 119860 In contrast 119862(119860119861) = 0means that none of the points in 119861 are covered by theones in 119860

The second metric is the IGD metric (namely InvertedGenerational Distance) [39ndash41] It is the mean value of thedistances from the set of solutions uniformly distributed inthe true Pareto front to the set of solutions obtained by analgorithm in objective space Let 119875lowast be a set of uniformly dis-tributed points along the true PF (Pareto Front) Let 119860 be anapproximate set to the PF the average distance from119875lowast to119860 isdefined as

IGD (119860 119875lowast) =sumVisin119875lowast 119889 (V 119860)

|119875lowast| (18)

where 119889(V 119860) is the minimum Euclidean distance between Vand the points in 119860 If |119875lowast| is large enough to represent thePF very well IGD(119860 119875lowast) could measure both the diversityand convergence of 119860 in a sense To have a low value ofIGD(119860 119875lowast) the set119860must be very close to the PF and cannotmiss any part of the whole PF

The smaller the IGDvalue for the set of obtained solutionsis the better the performance of the algorithm will be

The third measure is the measure of maximum spread(MS) [2 24 35] which is proposed in [42 43] It canmeasure how well the true Pareto front (PFtrue) is coveredby the discovered Pareto front (PFknown) through hyperboxes

Mathematical Problems in Engineering 9

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1100

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

152

True Pareto frontUMOGA

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

140

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

f2(x)

f1(x)

f2(x)

Figure 1 Pareto fronts obtained by different algorithms for test problem FON

formed by the extreme function values observed in the PFtrueand PFknown It is defined as

MS = radic 1

119872

119872

sum119894=1

120575119894

120575119894= (

min (119891max119894 119865max119894) minusmax (119891min

119894 119865min119894)

119865max119894

minus 119865min119894

)

2

(19)

where 119872 is the number of objectives 119891max119894

and 119891min119894

arethe maximum and minimum values of the 119894th objective inPFknown respectively and 119865

max119894

and 119865min119894

are the maximumand minimum values of the 119894th objective in PFtrue respec-tively

Note that if 119891min119894

ge 119865max119894

then 120575119894= 0 Algorithms with

largerMS values are desirable andMS = 1means that the truePareto front is totally covered by the obtained Pareto front

44 Results For each test problem we perform the proposedalgorithm (called UKMOPSO) for 30 independent runs andcompare its performance with UMOGA [23] and NSGA-II[25] The values of several metrics the 119862metric IGDmetricand MS metric are shown in Tables 2 3 4 and 5 The Paretofronts obtained by several algorithms implemented on severaltest functions are illustrated in Figures 1ndash6

For brevity 119862(UKMOPSOUMOGA) 119862(UMOGAUKMOPSO) 119862(UKMOPSONSGA-II) and 119862(NSGA-IIUKMOPSO) are respectively marked as 119862

12 11986221 11986213 and

11986231

10 Mathematical Problems in Engineering

0100

UKMOPSOTrue Pareto front

14

UMOGATrue Pareto front

40

NSGA-IITrue Pareto front

f1(x)

f2(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14

f1(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14 minus13 minus12minus12

minus10

minus8

minus6

minus4

minus2

0

f2(x)

f1(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14minus12

minus10

minus8

minus6

minus4

minus2

0

f2(x)

minus12

minus10

minus8

minus6

minus4

minus2

Figure 2 Pareto fronts obtained by different algorithms for test problem KUR

Table 2 Comparison of C metric between UKMOPSO andUMOGA

C (UKMOPSO UMOGA) C (UMOGA UKMOPSO)FON 0769 001KUR 0929 001ZDT1 012 013ZDT2 0063 003ZDT3 0139 0098ZDT6 0063 001DTLZ1 1 0DTLZ2 1 0

Table 3 Comparison of Cmetric between UKMOPSO and NSGA-II

C (UKMOPSO NSGA-II) C (NSGA-II UKMOPSO)FON 0025 001KUR 0075 011ZDT1 1 0ZDT2 1 0ZDT3 1 0ZDT6 1 0DTLZ1 0 013DTLZ2 0025 006

Mathematical Problems in Engineering 11

0

02

04

06

08

1

12

14100 76

0

05

1

15

2

25

325

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

f1(x)

f2(x)

f2(x)

Figure 3 Pareto fronts obtained by different algorithms for test problem ZDT1

Table 4 Comparison of IGD metric

UKMOPSO UMOGA NSGA-IIFON 00072 00432 03707KUR 00668 11576 12323ZDT1 00067 00265 13933ZDT2 00067 00727 21409ZDT3 01964 02151 02816ZDT6 00033 00139 59223DTLZ1 74827 22965 54182DTLZ2 01123 04319 02724

Table 5 Comparison of maximum spread (MS) metric

UKMOPSO UMOGA NSGA-IIFON 09769 09754 02368KUR 09789 09580 03414ZDT1 09429 09984 05495ZDT2 09999 09986 00019ZDT3 09276 09275 07963ZDT6 1 09974 03738DTLZ1 1 1 06588DTLZ2 1 1 07125

12 Mathematical Problems in Engineering

0

02

04

06

08

1

12

14100 9

0

05

1

15

2

25

38

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x) f1(x)

f2(x)

f2(x)

f2(x)

f1(x)

Figure 4 Pareto fronts obtained by different algorithms for test problem ZDT2

As shown in Table 2 for the test functions Fon and KUR11986212= 0769 and 119862

12= 0929 mean that 768 and 929 of

the solutions obtained by UMOGA are dominated by thoseobtained by UKMOPSO Similarly 119862

21= 001means that 1

of the solutions obtained by UKMOPSO are dominated bythose obtained by UMOGA For DTLZ1 and DTLZ2119862

12= 1

and 11986221= 0 mean that all solutions obtained by UMOGA

are dominated by those obtained by UKMOPSO but noneof solutions from UKMOPSO is dominated by those fromUMOGA It means that the solution quality via UKMOPSOis much better than that via UMOGA for the above testfunctions For ZDT1 ZDT2 ZDT3 and ZDT6 the valuesof the 119862 metrics do not have too many differences betweenUKMOPSO and UMOGA It means the solution qualities viaboth are almost identical

FromTable 3 we can see that for ZDT1 ZDT2 ZDT3 andZDT6 119862

13= 1 and 119862

31= 0mean that all solutions obtained

byNSGA-II are dominated by those obtained byUKMOPSObut none of solutions from UKMOPSO is dominated bythose from NSGA-II It means that the solution quality viaUKMOPSO is much better than that via NSGA-II for theabove test functions For the rest of the test functions thesolution qualities via both are almost identical

In Table 4 for all test problems the IGD values ofUKMOPSO are the smallest among UKMOPSO UMOGAand NSGA-II This means the PF found by UKMOPSO isthe nearest to the true PF compared with the PF obtainedby the other two algorithms namely the performance ofUKMOPSO is the best in the three algorithms The IGDvalues of UMOGA are all larger than those of NSGA-II for

Mathematical Problems in Engineering 13

0

02

04

06

08

1 92 86

0

05

1

15

2

25

3

35

4

45 40

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

f2(x)

minus08

minus06

minus04

minus02

0

02

04

06

08

1

f2(x)

f2(x)

minus08

minus05

minus06

minus04

minus02

UKMOPSO UMOGA

NSGA-II

Figure 5 Pareto fronts obtained by different algorithms for test problem ZDT3

all test functions It means that the PF found by UMOGA iscloser to the true PF than the PF obtained by NSGA-II

From Table 5 we can see that all the MS values ofUKMOPSO are almost close to 1 for each test function Itmeans that almost all the true PF is totally covered by the PFobtained by UKMOPSO UMOGA is similar to UKMOPSOThe MS values of NSGA-II are much lesser than UKMOPSOand UMOGA especially MS = 00019 for ZDT2

Figures 1ndash6 all demonstrate that the PF found byUKMOPSO is the nearest to the true PF and scatters mostuniformly in those obtained by three algorithms Most of thepoints in the PF found by UKMOPSO overlap the pointsin the true PF It means the solution quality obtained byUKMOPSO is very high

45 Influence of the Uniform Crossover and PAM In orderto find out the influence of the uniform crossover on the

proposed algorithm we compare the distribution and num-ber of Pareto optimal solutions before and after performingthe uniform crossover One of the simulating results on thetest function ZDT1 is shown in Figure 7

From Figure 7 it can be seen that before and afterperforming the uniform crossover the number of data pointsin the 1st cluster and the 2nd cluster varies from 7 to 15and from 19 to 26 respectively namely many new Paretooptimal solutions having not been found before performingthe uniform crossover are generated and the differences ofthe data points between two clusters have been decreasedThis will directly influence the uniformity of the PF andacquire more uniform Pareto solutions

PAM is used to determine which Pareto solutions are tobe removed fromor be inserted into the external archiveThisis to maintain the diversity of Pareto optimal solutions The

14 Mathematical Problems in Engineering

100 16

0

05

1

15

2

25

3

35

4

45

540

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUKMOPSO

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

f2(x)

f2(x)

f1(x)

Figure 6 Pareto fronts obtained by different algorithms for test problem ZDT6

diversity can be computed by the ldquodistance-to-average-pointrdquomeasure [44] defined as

diversity (119878) = 1

|119878|

|119878|

sum119894=1

radic119873

sum119895=1

(119901119894119895minus 119901119895)2

(20)

where 119878 is the population |119878| is the swarm size 119873 is thedimensionality of the problem 119901

119894119895is the 1198951015840th value of the 1198941015840th

particle and 119901119895is the 1198951015840th value of the average point 119901

If the number of Pareto optimality is less than or equalto the size of the external archive PAM has no influenceon the proposed algorithm Otherwise it is used to selectthe different type of Pareto optimality from several clustersTherefore The diversity of Pareto optimality will certainlyincreaseWemonitor the diversity of Pareto optimality before

and after performing PAMonZDT1 at a certain timeThe val-ues are respectively 8762 and 11752 This fully demonstratesthat PAM can improve the diversity of Pareto optimality

5 Conclusion and Future Work

In this paper a multiobjective particle swarm optimizationbased on PAM and uniform design is presented It firstlyimplements PAM to partition the Pareto front into119870 clustersin the objective space and then it implements the uniformcrossover operator on the minimal cluster so as to generatemore Pareto optimality in the minimal cluster When thesize of the Pareto solution is larger than that of the externalarchive PAM is used to determine which Pareto solutions areto be removed from or be inserted into the external archive

Mathematical Problems in Engineering 15

0 01 02 03 04 05 06 07 08 09

19

10

01

02

03

04

05

06

077

08

09

1

The 1st clusterThe 2nd cluster

26

(a)

41

0 01 02 03 04 05 06

26

07 08 09 10

01

02

03

04

051506

07

08

09

1

The 1st clusterThe 2nd cluster

(b)

Figure 7 Pareto optimal solutions before (a) and after (b) performing the uniform crossover

Finally it keeps all the points in the lesser clusters anddiscards some points in the larger clusters This can ensurethat each of the clusters will contain approximately the samedata points Therefore the diversity of the Pareto solutionswill increase and they can scatter uniformly over the Paretofront The results of the experimental simulation performedon several well-known test problems indicate that theproposed algorithm obviously outperforms the other twoalgorithms

This algorithm is going on for further enhancement andimprovement One attempt is to use a more efficient orapproximate clustering algorithm to speed up the executiontime of this algorithm Another attempt is to extend its appli-cation scopes

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This research was supported by Guangxi Natural ScienceFoundation (no 2013GXNSFAA019337) Guangxi Universi-ties Key Project of Science and Technology Research (noKY2015ZD099) Key project of Guangxi Education Depart-ment (no 2013ZD055) Scientific Research Staring Founda-tion for the PHD Scholars of Yulin Normal University (noG2014005) Special Project of Yulin Normal University (no2012YJZX04) and Key Project of Yulin Normal University(no 2014YJZD05)The authors are grateful to the anonymousreferee for a careful checking of the details and for helpfulcomments that improved this paper

References

[1] J D Schaffer ldquoMultiple objective optimization with vector eval-uated genetic algorithmsrdquo in Proceedings of the 1st InternationalConference on Genetic Algorithms pp 93ndash100 1985

[2] L Tang and X Wang ldquoA hybrid multiobjective evolutionaryalgorithm for multiobjective optimization problemsrdquo IEEETransactions on Evolutionary Computation vol 17 no 1 pp 20ndash45 2013

[3] K Deb S Agrawal A Pratap and T Meyarivan ldquoA fast elitistnon-dominated sorting genetic algorithm for multi-objectiveoptimizationNSGA-IIrdquo inParallel ProblemSolving fromNaturePPSN VI vol 1917 of Lecture Notes in Computer Science pp849ndash858 Springer Berlin Germany 2000

[4] J Zhang Y Wang and J Feng ldquoAttribute index and uniformdesign based multiobjective association rule mining with evo-lutionary algorithmrdquo The Scientific World Journal vol 2013Article ID 259347 16 pages 2013

[5] S Jiang and Z Cai ldquoFaster convergence and higher hypervol-ume for multi-objective evolutionary algorithms by orthogonaland uniform designrdquo in Advances in Computation and Intelli-gence vol 6382 of Lecture Notes in Computer Science pp 312ndash328 Springer Berlin Germany 2010

[6] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995

[7] R Eberhart and J Kennedy ldquoA new optimizer using particleswarm theoryrdquo in Proceedings of the 6th International Sympo-sium on Micro Machine and Human Science pp 39ndash43 IEEENagoya Japan October 1995

[8] L Yang ldquoA fast and elitist multi-objective particle swarm algo-rithm NSPSOrdquo in Proceedings of the IEEE International Con-ference on Granular Computing (GRC rsquo08) pp 470ndash475 August2008

[9] C A C Coello G T Pulido and M S Lechuga ldquoHandlingmultiple objectives with particle swarm optimizationrdquo IEEE

16 Mathematical Problems in Engineering

Transactions on Evolutionary Computation vol 8 no 3 pp256ndash279 2004

[10] M Clerc and J Kennedy ldquoThe particle swarm-explosion sta-bility and convergence in a multidimensional complex spacerdquoIEEE Transactions on Evolutionary Computation vol 6 no 1pp 58ndash73 2002

[11] H Xiaohui and R Eberhart ldquoMultiobjective optimization usingdynamic neighborhood particle swarm optimizationrdquo in Pro-ceedings of the Congress on Evolutionary Computation (CEC rsquo02)vol 2 pp 1677ndash1681 May 2002

[12] S Mostaghim and J Teich ldquoStrategies for finding goodlocal guides in multi-objective particle swarm optimization(MOPSO)rdquo in Proceedings of the IEEE Swarm IntelligenceSymposium (SIS rsquo03) pp 26ndash33 Indianapolis Ind USA April2003

[13] T Bartz-Beielstein P Limbourg J Mehnen K Schmitt K EParsopoulos and M N Vrahatis ldquoParticle swarm optimizersfor Pareto optimizationwith enhanced archiving techniquesrdquo inProceedings of the Congress on Evolutionary Computation (CECrsquo03) vol 3 pp 1780ndash1787 December 2003

[14] K E Parsopoulos D K Tasoulis and M N Vrahatis ldquoMulti-objective optimization using parallel vector evaluated particleswarm optimizationrdquo in Proceedings of the IASTED Interna-tional Conference on Artificial Intelligence and Applications (AIArsquo04) vol 2 pp 823ndash828 February 2004

[15] S-J Tsai T-Y Sun C-C Liu S-T Hsieh W-C Wu and S-YChiu ldquoAn improved multi-objective particle swarm optimizerformulti-objective problemsrdquo Expert Systems with Applicationsvol 37 no 8 pp 5872ndash5886 2010

[16] A A Mousa M A El-Shorbagy and W F Abd-El-WahedldquoLocal search based hybrid particle swarm optimization algo-rithm for multiobjective optimizationrdquo Swarm and Evolution-ary Computation vol 3 pp 1ndash14 2012

[17] J Wang W Liu W Zhang and B Yang ldquoMulti-objective parti-cle swarm optimization based on self-update and grid strategyrdquoin Proceedings of the 2012 International Conference on Informa-tion Technology and Software Engineering vol 211 of LectureNotes in Electrical Engineering pp 869ndash876 Springer BerlinGermany 2013

[18] K Khalili-Damghani A-R Abtahi and M Tavana ldquoA newmulti-objective particle swarmoptimizationmethod for solvingreliability redundancy allocation problemsrdquo Reliability Engi-neering amp System Safety vol 111 pp 58ndash75 2013

[19] J Zhang Y Wang and J Feng ldquoParallel particle swarm opti-mization based on PAMrdquo in Proceedings of the 2nd InternationalConference on Information Engineering and Computer Science(ICIECS rsquo10) Wuhan China December 2010

[20] J Han and M Kamber Data Mining Concepts and TechniquesMorgan Kaufmann 2006

[21] L F Ibrahim ldquoUsing of clustering algorithm CWSP-PAM forrural network planningrdquo in Proceedings of the 3rd InternationalConference on Information Technology and Applications (ICITArsquo05) vol 1 pp 280ndash283 IEEE July 2005

[22] P Chopra J Kang and J Lee ldquoUsing gene pair combinationsto improve the accuracy of the PAM classifierrdquo in Proceedingsof the IEEE International Conference on Bioinformatics andBiomedicine (BIBM rsquo09) pp 174ndash177 November 2009

[23] Y-W Leung and Y Wang ldquoMultiobjective programming usinguniform design and genetic algorithmrdquo IEEE Transactions onSystems Man and Cybernetics Part C Applications and Reviewsvol 30 no 3 pp 293ndash304 2000

[24] D Liu K C Tan C K Goh and W K Ho ldquoA multiobjectivememetic algorithm based on particle swarm optimizationrdquoIEEE Transactions on Systems Man and Cybernetics Part BCybernetics vol 37 no 1 pp 42ndash50 2007

[25] K Deb A Pratap S Agarwal and T Meyarivan ldquoA fast andelitist multiobjective genetic algorithm NSGA-IIrdquo IEEE Trans-actions on Evolutionary Computation vol 6 no 2 pp 182ndash1972002

[26] C M Fonseca and P J Fleming ldquoMultiobjective optimizationandmultiple constraint handling with evolutionary algorithmsII Application examplerdquo IEEE Transactions on Systems Manand Cybernetics Part A Systems and Humans vol 28 no 1 pp38ndash47 1998

[27] F Kursawe ldquoA variant of evolution strategies for vector opti-mizationrdquo in Parallel Problem Solving from Nature vol 496 ofLecture Notes in Computer Science pp 193ndash197 Springer BerlinGermany 1991

[28] E Zitzler K Deb and LThiele ldquoComparison of multiobjectiveevolutionary algorithms empirical resultsrdquo Evolutionary Com-putation vol 8 no 2 pp 173ndash195 2000

[29] H-L Liu Y Wang and Y-M Cheung ldquoA Multi-objective evo-lutionary algorithm usingmin-max strategy and sphere coordi-nate transformationrdquo Intelligent Automation amp Soft Computingvol 15 no 3 pp 361ndash384 2009

[30] N Chakraborti B S Kumar V S Babu S Moitra andA Mukhopadhyay ldquoA new multi-objective genetic algorithmapplied to hot-rolling processrdquoAppliedMathematicalModellingvol 32 no 9 pp 1781ndash1789 2008

[31] K Deb ldquoMulti-objective genetic algorithms problem difficul-ties and construction of test problemsrdquo Evolutionary Computa-tion vol 7 no 3 pp 205ndash230 1999

[32] K Deb L Thiele M Laumanns and E Zitzler ldquoScalable testproblems for evolutionarymultiobjective optimizationrdquo in Evo-lutionary Multiobjective Optimization Advanced Informationand Knowledge Processing pp 105ndash145 Springer London UK2005

[33] Y Shi and R Eberhart ldquoModified particle swarm optimizerrdquoin Proceedings of the IEEE World Congress on ComputationalIntelligence pp 69ndash73 May 1998

[34] C S Feng S Cong and X Y Feng ldquoA new adaptive inertiaweight strategy in particle swarm optimizationrdquo in Procedingsof the IEEE Congress on Evolutionary Computation (CEC rsquo07)pp 4186ndash4190 IEEE Singapore September 2007

[35] C K Goh K C Tan D S Liu and S C Chiam ldquoA competitiveand cooperative co-evolutionary approach to multi-objectiveparticle swarm optimization algorithm designrdquo European Jour-nal of Operational Research vol 202 no 1 pp 42ndash54 2010

[36] P Cheng J-S Pan L Li Y Tang and C Huang ldquoA surveyof performance assessment for multiobjective optimizersrdquo inProceedings of the 4th International Conference on Genetic andEvolutionary Computing (ICGEC rsquo10) pp 341ndash345 December2010

[37] E Zitzler L Thiele M Laumanns C M Fonseca and V GDa Fonseca ldquoPerformance assessment of multiobjective opti-mizers an analysis and reviewrdquo IEEE Transactions on Evolution-ary Computation vol 7 no 2 pp 117ndash132 2003

[38] E Zitzler and L Thiele ldquoMultiobjective evolutionary algo-rithms a comparative case study and the strength Paretoapproachrdquo IEEETransactions onEvolutionaryComputation vol3 no 4 pp 257ndash271 1999

[39] H Li and Q Zhang ldquoMultiobjective optimization problemswith complicated pareto sets MOEAD and NSGA-IIrdquo IEEE

Mathematical Problems in Engineering 17

Transactions on Evolutionary Computation vol 13 no 2 pp284ndash302 2009

[40] Q Zhang A Zhou S Zhao P N Suganthan W Liu andS Tiwari ldquoMultiobjective optimization test instances for theCEC 2009 special session and competitionrdquo Working ReportDepartment of Computing and Electronic Systems Universityof Essex 2008

[41] Q Zhang A Zhou and Y Jin ldquoRM-MEDA a regularity model-based multiobjective estimation of distribution algorithmrdquoIEEE Transactions on Evolutionary Computation vol 12 no 1pp 41ndash63 2008

[42] C-S Tsou H-H Fang H-H Chang and C-H Kao ldquoAnimproved particle swarmPareto optimizerwith local search andclusteringrdquo in Simulated Evolution and Learning vol 4247 ofLecture Notes in Computer Science pp 400ndash407 2006

[43] U Wickramasinghe and X Li ldquoChoosing leaders for multiob-jective PSO algorithms using differential evolutionrdquo in Proceed-ing of the 7th International Conference Simulated Evolution andLearning vol 5361 of Lecture Notes in Computer Science pp249ndash258 Springer Berlin Germany 2008

[44] T Krink J S Vesterstrom and J Riget ldquoParticle swarm opti-misation with spatial particle extensionrdquo in Proceedings of theCongress on Evolutionary Computation (CEC rsquo02) pp 1474ndash1479 May 2002

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 2: Research Article Multiobjective Particle Swarm

2 Mathematical Problems in Engineering

removed or appended The results of the experimental sim-ulation implemented on several well-known test problemsindicate that the proposed algorithm is efficient

The rest of this paper is organized as follows Section 2states the preliminaries of the proposed method Section 3presents our method in detail Section 4 gives the numericalresults of the proposed method The conclusion of the workis made in Section 5

2 Preliminaries

In this section we describe some concepts concerningparticle swarm optimization multiobjective particle swarmoptimization PAM and uniform design

21 Particle Swarm Optimization Algorithms In 119889-dimen-sional search space the position and velocity of the 119894th parti-cle are respectively represented as 119883

119894= [1199091198941 1199091198942 119909

119894119889]

and 119881119894= [V1198941 V1198942 V

119894119889] The optimal positions of the

119894th particle and the whole swarm namely the individualoptimal and the global optimal are denoted as 119875119887119890119904119905 =[1199011198941 1199011198942 119901

119894119889] and 119866119887119890119904119905 = [119901

1198921 1199011198922 119901

119892119889] respec-

tively The individuals or particles in the swarm update theirvelocities and positions according to the following formulas

V119894119895 (119896 + 1) = 119908 sdot V119894119895 (119896) + 1198881 sdot 1199031 sdot (119875119887119890119904119905119894119895 (119896) minus 119909119894119895 (119896))

+ 1198882times 1199032times (119866119887119890119904119905

119895 (119896) minus 119909119894119895 (119896))

(2)

119909119894119895 (119896 + 1) = 119909119894119895 (119896) + V

119894119895 (119896 + 1) (3)

where the inertia weight coefficient 119908 indicates the ability tomaintain the previous speed the acceleration coefficients 119888

1

and 1198882are used to coordinate the degrees of tracking individ-

ual optimal and global optimal and 1199031and 1199032are two ran-

dom numbers drawn from the uniform distribution on theinterval (0 1)

The update equation of the velocity consists of the previ-ous velocity component a cognitive component and a socialcomponent They are mainly controlled by three parametersthe inertia weight and two acceleration coefficients

From the theoretical analysis for the trajectory of particlesin PSO [10] the trajectory of a particle 119883

119894converges to a

weightedmean of119875119894and119875119892Whenever the particle converges

it will ldquoflyrdquo to the individual best position and the global bestposition According to the update equation the individualbest position of the particle will gradually move closer to theglobal best position Therefore all the particles will convergeonto the global best particlersquos position

22 Multiobjective Particle Swarm Optimization MOPSO isproposed by Coello et al it adopts swarm intelligence tooptimize MOPs and it uses the Pareto optimal set to guidethe particlersquos flight [9]

Particle swarm optimization has been proposed forsolving a large number of single objective problems Manyresearchers are interested in solving multiobjective problems(MOP) using PSO To modify a single objective PSO to

MOPSO a guide must be redefined in order to obtain a setof nondominated solutions (Pareto front) In MOPSO thePareto optimal solutions should be used to determine theguide for each particle How to select suitable local guidesfor attaining both convergence and diversity of solutionsbecomes an important issue

There have been some publications to use PSO to solveMOP A dynamic neighborhood PSO was proposed [11]which optimizes only one objective at a time and uses ascheme similar to lexicographic ordering In addition thisapproach also proposes an unconstrained elite archive nameddominated tree to store the nondominated solutions How-ever it is a difficult issue for this approach to pick up a bestlocal guide from the set of Pareto optimal solutions for eachparticle of the population A strategy for finding suitablelocal guides for each particle was proposed and named Sigmamethod The local guide is explicitly assigned to specifyparticles according to the Sigma value [12] This results inthe desired diversity and convergence but it is still not closeenough to the Pareto front On the other hand an enhancedarchiving technique to maintain the best (nondominated)solutions found during the course of a MO algorithm wasproposed [13] It shows that using archives in PSO for MOproblems will improve their performance directly A paral-lel vector evaluated particle swarm optimization (VEPSO)method for multiobjective problems was proposed [14]which adopted a ring migration topology and PVE system tosimultaneously work 2ndash10 CPUs to find nondominated solu-tions In [9] MOPSO method was proposed It incorporatesPareto dominance and a special mutation operator to solveMO problems [15]

Recently a hybrid multiobjective algorithm combiningboth genetic algorithm (GA) and particle swarm optimiza-tion (PSO) was proposed [16] A multiobjective particleswarm optimization based on self-update and grid strategywas proposed for improving the function of Pareto set [17]A new dynamic self-adaptive multiobjective particle swarmoptimization (DSAMOPSO) method is proposed to solvebinary-state multiobjective reliability redundancy allocationproblems [18] which used a modified nondominated sortinggenetic algorithm (NSGA-II)method and a customized time-variant multiobjective particle swarm optimization methodto generate nondominated solutions

The MOPSO method is becoming more popular due toits simplicity to be implemented and its ability to quicklyconverge to a reasonably acceptable solution for problems inscience and engineering

23 PAM Algorithm There are many clustering methodsavailable in data mining Typical clustering analysis methodsare clustering based on partition hierarchical clusteringclustering based on density clustering based on grid andclustering based on model

The most frequently used clustering methods based onpartition are 119896-means and 119896-medoids In contrast to the 119896-means algorithm 119896-medoids chooses data points as cen-troids which make 119896-medoids method more robust than 119896-means in the presence of noise and outliers The reason isthat 119896-medoids method is less influenced by outliers or other

Mathematical Problems in Engineering 3

Algorithm PAM a 119896-medoids algorithm for partitioning based on medoid or central objectsInput119896 the number of clusters119863 a data set containing 119899 objectsOutput A set of 119896 clustersMethod(1) Arbitrarily choose 119896 objects in119863 as the initial representative objects or seeds(2) Repeat(3) Assign each remaining object to the cluster with the nearest representative object(4) Randomly select a nonrepresentative object 119900 random(5) Compute the total cost 119878 of swapping representative object 119900

119895 with 119900 random

(6) if 119878 lt 0 then swap 119900119895with 119900 random to form the new set of 119896 representative objects

(7) Until no change

Algorithm 1 PAM algorithm

extreme values than 119896-means PAM (Partitioning AroundMedoids) is the first and the most frequently used 119896-medoidsalgorithms It is shown in Algorithm 1

PAM constructs 119896 partitions (clusters) of the givendataset where each partition represents a cluster Each clustermay be represented by a centroid or a cluster representativewhich is some sort of summary description of all the objectscontained in a cluster It needs to determine 119896 partitions for 119899objectsThe process of PAM is by and large as follows Firstlyrandomly select 119896 representative objects and cluster otherobjects to the same group as the representative object accord-ing to the minimum distances between the representativeobject and other objects Then try to replace these represen-tative objects with other nonrepresentative objects in orderto minimize squared error All the possible pairs of objectsare analyzed where one object in each pair is considered arepresentative object and the other is not The total cost ofthe clustering is calculated for each such combination Anobject will be replaced with such an object having mini-mized squared error The set of best objects in each clusterafter iteration forms the representative objects for the nextiteration The final set of representative objects is the respec-tive centroids of the clusters [19ndash22]

24 Uniform Design

241 Uniform Design Themain objective of uniform designis to sample a small set of points from a given set of pointssuch that the sampled points are uniformly scattered [23]

Let there be 119899 factors and 119902 levels per factor When 119899 and119902 are given the uniform design selects 119902 from 119902119899 possiblecombinations such that these 119902 combinations are uniformlyscattered over the space of all possible combinations Theselected 119902 combinations are expressed as a uniform array119880(119899 119902) = [119880

119894119895]119902 times 119899 where 119880

119894119895is the level of the 119895th factor

in the 119894th combination and can be calculated by the followingformula

119880119894119895= (119894120590119895minus1 mod 119902) + 1 (4)

where 120590 is a parameter given in Table 1

242 Improved Generation of Initial Population An algo-rithm for dividing the solution space and an algorithm forgenerating the initial population have been designed [23]However Algorithm 2 in [23] considers only the dividingof the solution space not the dividing of the 119873-dimensionspace This will bring about some serious problems If weassume that in Step 2 119876

0= 5 and 119873 = 10 then 119880(119873119876

0)

is impossible to be generated because 1198760must be larger than

119873 Namely Algorithm 2 in [23] is only suitable for the lowdimensional problem In order to overcome the shortcom-ings the dividing of the119873-dimension space is introduced toimprove the algorithm The improved algorithm can be suit-able for not only low dimensional but also high dimensionalproblem The improved algorithm is shown as follows

Algorithm A (improved generation of initial population)

Step 1 Judge whether1198760is valid or not as it must be found in

the 1st column of Table 1 If it is not valid it stops and showserror messages otherwise it continues

Step 2 Execute Algorithm 1 in [23] to divide [119897 119906] into 119878subspaces [119897(1) 119906(1)] [119897(2) 119906(2)] sdot sdot sdot [119897(119878) 119906(119878)]

Step 31 Judge whether1198760is more than119873 if yes turn to Step

32 otherwise turn to Step 33

Step 32 Quantize each subspace and apply the uniform array119880(119873119876

0) to sample 119876

0points

Step 33 Divide119873-dimension space into lfloor1198731198730rfloor parts where

1198730is an integer more than 1 and less than119876

0and is generally

taken as1198760minus1 Among lfloor119873119873

0rfloorparts the 1st part corresponds

to the dimension from 1 to1198730 and the 2nd part corresponds

to the dimension from 1198730+ 1 to 2 lowast 119873

0 and so forth If

the remainder 119877 of 1198731198730is not equal to 0 then a plus part

corresponds to the dimension from lfloor1198731198730rfloor lowast 119873

0+ 1 to 119873

whose length is surely less than1198730 Repeat to execute Step 34

for each part

Step 34 Quantize each subspace and then apply uniformarray 119880(119873

0 1198760) to sample 119876

0points It is noteworthy that

4 Mathematical Problems in Engineering

Table 1 Values of the parameter 120590 for different number of factorsand different number of levels per factor

Number of levelsper factors Number of factors 120590

5 2sim4 27 2sim6 311 2sim10 7

132 53 4

4sim12 617 2sim16 10

19 2sim3 84sim18 14

232 13sim14 20sim22 7

8sim12 153sim7 15sim19 17

29

2 123 9

4sim7 168sim12 16sim24 8

13sim15 1425sim28 18

31 2 5sim12 20sim30 123sim4 13sim19 22

119880(1198730 1198760) is replaced with 119880(119877119876

0) in the plus part where

the remainder 119877 is equal to119873 minus lfloor1198731198730rfloor lowast 1198730

243 Crossover Operator Based on the Uniform Design Thecrossover operator based on the uniform design acts ontwo parents It quantizes the solution space defined by theseparents into a finite number of points and then it applies theuniformdesign to select a small sample of uniformly scatteredpoints as the potential offspring

For any two parents 1199011and 119901

2 their minimal values

and the maximal values of each dimension are used toform the novel solution space [119897parent 119906parent] Each domain of[119897parent 119906parent] is quantized into 119876

1levels where 119876

1is a pre-

defined prime numberThen the uniform design is applied toselect a sample of points as the potential offspringThe detailsof the algorithm can be referred to [23]

3 The Proposed Algorithm

31 Elitist Selection or Elitism [4] Elitism means that eliteindividuals cannot be excluded from the mating pool of thepopulation A strategy presented can always include the bestindividual of the current population in the next generation inorder to prevent the loss of good solutions found so far Thisstrategy can be extended to copy the best 119899 individuals to thenext generation This is explanation of the elitism In MOPelitism plays an important role

Two strategies are often used to implement elitism Onemaintains elitist solutions in the population the other stores

population paretocreate (pop)Input pop indicates the populationOutput pareto indicates the non-dominated solutionsparetolarr popfor forall chr1 isin popfor forall chr2 isin pareto and chr2 = chr1if chr1 dominate chr2

paretolarr pareto minus chr2end if

end forend forreturn pareto

Algorithm 2 Pseudocode of selecting elitist

elitist solutions into an external secondary list or externalarchive and reintroduces them to the populationThe formercopies all nondominated solutions in the current populationto the next population and then fills the rest of the next popu-lation by selecting from the remaining dominated solutions inthe current populationThe latter uses an external secondarylist or external archive to store the elitist solutions Externallist stores the nondominated solutions found It will beupdated in the next generation by means of removing elitistsolutions dominated by a new solution or adding the newsolution if it is not dominated by any existing elitist solution

The work adopts the second strategy namely storing eli-tist solutions to an external secondary list Its advantage is thatit can preserve and dynamically adjust all the nondominatedsolutions set till the current generation The pseudocodes ofselecting elitist and updating elitist are respectively shown inAlgorithms 2 and 3

32 Selection Mechanism for the Swarm Based on UniformDesign This paper adopts the uniform design to select thebest 119879

1points (119879

1lt 119879) from the 119879 points and acquires their

objectives fit119881 The detailed steps are as follows

Algorithm B

Step 1 Calculate each of119872 objectives for each of the119879 pointsnormalize each of objectives 119891

119894(119909) as follows

ℎ119894119909 =

119891119894 (119909)

max119910isin120595

1003816100381610038161003816119891119894 (119910)1003816100381610038161003816 (5)

where 120595 is a set of points in the current population and ℎ119894119909 is

the normalized objective

Step 2 Apply the uniform design to generate the 1198630weight

vectors 1199081 1199082 119908

1198630 each of them is used to compose

one fitness function by the following formula where 1198630is a

design parameter and it is prime

fit119894= 1199081198941ℎ1 (119909) + 1199081198942ℎ2 (119909) + sdot sdot sdot + 1199081198941198630ℎ1198630 (119909)

1 le 119894 le 1198630

(6)

Mathematical Problems in Engineering 5

population paretoupdate(offspring pareto)Input offspring indicates the offsprings after performing the crossover operator

and mutation operator pareto indicates the non-dominated solutionsOutput pareto indicates the non-dominated solutionsoffspringlarr call paretocreate(offspring)for forall chr1 isin offspring

nondominatedlarr truefor forall chr2 isin paretoif chr1 dominate chr2paretolarr pareto minus chr2

else if chr2 dominate chr1nondominatedlarr false

elsecontinue

end ifend forif nondominatedparetolarr pareto cup chr1

end ifend forreturn pareto

Algorithm 3 Pseudocode of updating elitist

Step 3 Based on each of fitness functions evaluate the qualityof the 119879 points Assume the remainder 119879

11198630is 1198770 For the

first 1198770fitness functions select the best lceil119879

11198630rceil points for

the rest of fitness functions select the best lfloor11987911198630rfloor points

Overall a total of 1198791points are selected The objectives of

these selected points are correspondingly stored into fit119881

33 Selection Mechanism for Gbest Based on PAM InMOPSO 119866119887119890119904119905 plays an important role in guiding the entireswarm toward the global Pareto front [24] In contrast tothe single objective PSO having only one global best 119866119887119890119904119905MOPSO has multiple Pareto optimality solutions which arenondominated each other How to select a suitable 119866119887119890119904119905 foreach of particles fromPareto optimal solutions is one very keyissue

The paper presents the selection mechanism for 119866119887119890119904119905based on PAM as follows

Algorithm C Assume the population and the numbers ofparticles are denoted as pop and119873pop and the Pareto opti-mality and the numbers of Pareto optimality are denoted aspareto and119873119875

Step 1 Acquire the number of cluster according to the fol-lowing formula

119870 = 119870min + (119870max minus 119870min) sdot (119905

max119866) (7)

where 119870min and 119870max respectively denote the minimal andmaximal value of 119870 namely 119870 isin [119870min 119870max] 119905 and max119866indicate the 119905th iteration and the maximal iteration numberThe formula can acquire the linearly increasing number ofcluster so as to accord with the process from the coarse searchto the elaborate search

Step 2 If 119873119875 le 119870 then for each particle in pop find thenearest Pareto optimal solution as its 119866119887119890119904119905 Otherwise turnto Step 3

Step 3 Perform PAM described in Section 23 to partitionPareto and pop into 119870 clusters respectively the clustercentroids of which are denoted as119862

1= 11988811 11988812 119888

1119870 and

1198622= 11988821 11988822 119888

2119870

Step 4 For each 1198882119895isin 1198622 find the nearest 119888

1119894isin 1198621 For

all the particles in the cluster represented by 1198882119895 their 119866119887119890119904119905

randomly takes one of Pareto optimal solutions in the clusterrepresented by 119888

1119894

34 Adjustment of Pareto Optimal Solutions Based on theUniform Design and PAM The number of Pareto optimalsolutions in the external archivewill increasingly enlargewithevolution namely the size of the external archive will becomevery large with the iteration number This will increasethe computation complexity and the execution time of analgorithmTherefore the size of the external archive must becontrolled In the meanwhile Pareto optimal solutions in theexternal archive are possibly close to each other in the objec-tive space If the solutions are close to each other they arenearly the same choice It is desirable to find the Pareto opti-mal solutions scattered uniformly over the Pareto front sothat the decision maker can have a variety of choices There-fore how to control the size of the external archive and selectrepresentative Pareto optimal solutions scattered uniformlyover the Pareto front is a key issue

The paper presents the adjustment algorithm of Paretooptimal solutions based on the uniform design and PAMThe algorithm firstly implements PAM to partition thePareto front into 119870 clusters in the objective space and then

6 Mathematical Problems in Engineering

implements the uniform crossover operator on the minimalcluster so as to generate more Pareto optimality in theminimal cluster finally it keeps all the points in the lesserclusters and discards some points in the larger clusters Thedetailed steps of the algorithm are shown as follows

Algorithm DAssume the Pareto optimality their values of119872functions and the numbers of Pareto optimality are denotedas pareto 119865pareto and119873119875 The size of the external archive isassumed as119873119875 lim The number of clusters is 119870

Step 1 If the numbers of data points in the maximal andthe minimal clusters have too many differences or are bothvery small select all the data points from the minimal clusterand the same number of points from the cluster which isthe nearest to the minimal cluster For each pair of datapoints perform the uniform crossover operator described inSection 243 Update pareto 119865pareto and 119873119875 according toAlgorithm 3

Step 2 If 119873119875 gt 119873119875 lim implement PAM to partition the119865pareto into119870 clusters in the objective space and let119873 div =119873119875 lim 119870 it indicates the average points to select from eachof119870 clustersWe can classify three situations according to thenumber of points in each cluster and119873 div as follows

(i) Situation 1 the numbers of points in all clusters arelarger than or equal to119873 div

(ii) Situation 2 the number of data points is larger thanor equal to119873 div only in one cluster

(iii) Situation 3 the number of the clusters in which thenumber of data points is larger than 119873 div is largerthan 1 and less than119870

Step 3 For Situation 1 sort all clusters according to their con-taining points in ascending order Select 119873 div points fromthe previous 119870 minus 1 clusters and select the remainder points119873119875 limminus119873 div lowast(119870 minus 1) from the last cluster Turn to Step7

Step 4 For Situation 2 keep the points of all the clusters inwhich the number of the points is less than or equal to119873 divand the remainder points are selected from that only clusterTurn to Step 7

Step 5 For Situation 3 keep all the points of all the clustersin which the number of points is less than or equal to119873 divTurn to Step 6 to select the remainder point

Step 6 Recalculate 119873 div and let 119873 div = rem1198751198701015840 whererem119875 and 1198701015840 indicate the number of the remainder pointsand clusters According to the new 119873 div turn to Step 3 orStep 4

Step 7 Save the selected 119873119875 lim points and terminate thealgorithm

35 Thoughts on the Proposed Algorithm In MOP for 119872objectives 119891

1 1198912 119891

119872and any two points 119909 and 119910 in the

feasible solution space Ω if each objective is satisfied with119891119894(119909) le 119891

119894(119910) and at least one objective is satisfied with

119891119894(119909) lt 119891

119894(119910) namely 119909 is at least as good as 119910with respect to

all the objectives and 119909 is strictly better than 119910 with respectto at least one objective then we say that 119909 dominates 119910 Ifno other solution is strictly better than 119909 then 119909 is called anondominated solution or Pareto optimal solution

In single objective problems there is only one objective tobe optimized Therefore the global best (119866119887119890119904119905) of the wholeswarm is only one In multiobjective problems there aremultiple consistent or conflicting objectives to be optimizedTherefore there exist very large or infinite numbers of solu-tions which cannot dominate each other These solutions arenondominated solutions or Pareto optimal solutions There-fore 119866119887119890119904119905 of the whole swarm is more than one How toselect suitable 119866119887119890119904119905 is a very key issue

When it is not possible to find all these solutions it maybe desirable to find as many solutions as possible in order toprovide more choices to the decision maker However if thesolutions are close to each other they are nearly of the samechoice It is desirable to find the Pareto optimal solutionsscattered uniformly over the Pareto front so that the decisionmaker can have a variety of choices The paper introducesuniform design to ensure that the acquired solutions scatteruniformly over the Pareto front in the objective space

For the MOPSO the diversity of the population is a veryimportant factor It has a key impact on the convergence ofan algorithm and uniformly distribution of Pareto optimalsolution It can effectively get rid of the premature of thealgorithmsThe paper introduces PAM clustering algorithmsto maintain the diversity of the population The particles inthe same cluster have similar features whereas ones in differ-ent clusters have dissimilar features Thus choosing particlesfrom different clusters may necessarily increase the diversityof the population

36 Steps of the Proposed Algorithm

Step 1 According to the population size 119873pop determinethe number of subintervals 119878 and the population size 119876

0in

each subinterval such that 1198760lowast 119878 is more than 119873pop 119876

0is

a prime and must exist in Table 1 Execute Algorithm A inSection 242 to generate a temporary population containing1198760lowast 119878 ge 119873pop points

Step 2 PerformAlgorithm B described in Section 32 to selectthe best 119873pop points from the 119876

0lowast 119878 points as the initial

population pop and acquire their objectives fit119881

Step 3 Initialize the speed and119875119887119890119904119905 of the particle 119894 by119881[119894] =pop[119894] and 119875119887119890119904119905[119894] = pop[119894] 119891119875119887119890119904119905[119894] = fit119881[119894] where 119894 =1 sdot sdot sdot 119873pop 119891119875119887119890119904119905[119894] denotes objectives of 119875119887119890119904119905[119894]

Step 4 According to Algorithm 2 select elitist or Paretooptimal solutions from pop and store them to the externalsecondary list pareto the size of which is assumed as119873119875

Step 5 Choose the suitable119866119887119890119904119905 for each of the particles frompareto in terms of Algorithm C described in Section 33

Mathematical Problems in Engineering 7

Step 6 Update the position pop and velocity 119881 for each ofthe particles respectively using formulas (2) and (3) whereformula (2) is modified as follows

V119894119895 (119896 + 1) = 119908 sdot V119894119895 (119896)

+ 1198881sdot 1199031sdot (119875119887119890119904119905

119894119895 (119896) minus 119909119894119895 (119896))

+ 1198882times 1199032times (119866119887119890119904119905

119894119895 (119896) minus 119909119894119895 (119896))

(8)

Step 7 For the 119895th dimensional variable of the 119894th particle ifit goes beyond its lower boundary or upper boundary then ittakes the 119895th dimensional value of its corresponding bound-ary and the 119895th dimensional value of its velocity takes theopposite value

Step 8 Calculate each of119872 objectives for each of the particlesand update them into fit119881

Step 9 Update 119875119887119890119904119905 of each particle as followsFor the 119894th particle if fit119881[119894] dominates 119891119875119887119890119904119905[119894] then

let 119891119875119887119890119904119905[119894] = fit119881[119894] and 119875119887119890119904119905[119894] = pop[119894] otherwise119875119887119890119904119905[119894] and119891119875119887119890119904119905[119894] are kept unchanged If neither of themis dominated by the other then randomly select one of themas 119891119875119887119890119904119905[119894] and update its corresponding 119875119887119890119904119905[119894]

Step 10 Update the external archive pareto storing Paretooptimality and its size119873119875 according to Algorithm 3

Step 11 Implement Algorithm D described in Section 34 toadjust the Pareto optimal solutions such that the number ofPareto optimal solutions is less than or equal to the size of theexternal archive119873119875 lim

Step 12 If the stop criterion is satisfied terminate algorithmotherwise turn to Step 5 and continue

4 Numerical Results

In order to evaluate the performances of the proposedalgorithm we compare it with two outstanding algorithmsUMOGA [23] and NSGA-II [25]

41 Test Problems Several well-known multiobjective testfunctions are used to test the performance of the proposedalgorithm

Biobjective problem FON [25 26] KUR [25 27]ZDT1 ZDT2 ZDT3 and ZDT6 [25 28ndash31]Three-objective problems DTLZ1 and DTLZ [2 32]

The definitions of them are as followsFON is defined as

1198911 (119909) = 1 minus exp(minus

3

sum119894=1

(119909119894minus1

radic3)2

)

1198912 (119909) = 1 minus exp(minus

3

sum119894=1

(119909119894+1

radic3)2

)

(9)

where 119909119894isin [minus4 4]

This test function has a nonconvex Pareto optimal frontKUR is defined as

1198911 (119909) =

119899minus1

sum119894=1

(minus10 exp (minus02 sdot radic1199092119894+ 1199092119894+1))

1198912 (119909) =

119899

sum119894=1

(1003816100381610038161003816119909119894100381610038161003816100381608+ 5 sdot sin1199093

119894)

(10)

where 119899 = 3 and 119909119894isin [minus5 5]

This test function has a nonconvex Pareto optimal frontin which there are three discontinuous regions

ZDT1 is defined as1198911(1199091) = 1199091

119892 (1199092 119909

119899) = 1 +

9sum119899119894=2119909119894

(119899 minus 1)

ℎ (1198911 119892) = 1 minus radic

1198911

119892

(11)

where 119899 = 30 and 119909119894isin [0 1]

This test function has a convex Pareto optimal frontZDT2 is defined as

1198911(1199091) = 1199091

119892 (1199092 119909

119899) = 1 +

9sum119899119894=2119909119894

(119899 minus 1)

ℎ (1198911 119892) = 1 minus (

1198911

119892)2

(12)

where 119899 = 30 and 119909119894isin [0 1]

This test function has a nonconvex Pareto optimal frontZDT3 is defined as

1198911(1199091) = 1199091

119892 (1199092 119909

119899) = 1 +

9sum119899119894=2119909119894

(119899 minus 1)

ℎ (1198911 119892) = 1 minus radic

1198911

119892minus (

1198911

119892) sin (10120587119891

1)

(13)

where 119899 = 30 and 119909119894isin [0 1]

This test function represents the discreteness feature ItsPareto optimal front consists of several noncontiguous con-vex parts The introduction of the sine function in ℎ causesdiscontinuity in the Pareto optimal front

ZDT6 is defined as

1198911(1199091) = 1 minus exp (minus4119909

1) sin6 (6120587119909

1)

119892 (1199092 119909

119899) = 1 + 9(

sum119899119894=2119909119894

(119899 minus 1))025

ℎ (1198911 119892) = 1 minus (

1198911

119892)2

(14)

where 119899 = 10 and 119909119894isin [0 1]

8 Mathematical Problems in Engineering

This test function has a nonconvex Pareto optimal frontIt includes two difficulties caused by the nonuniformityof the search space Firstly Pareto optimal solutions arenonuniformly distributed along the global Pareto optimalfront (the front is biased for solutions in which 119891

1(1199091) is near

one) Secondly the density of the solutions is the lowest nearthe Pareto optimal front and the highest away from the front

DTLZ1 is defined as

1198911 (119909) = 0511990911199092 (1 + 119892 (119909))

1198912 (119909) = 051199091 (1 minus 1199092) (1 + 119892 (119909))

1198913 (119909) = 05 (1 minus 1199091) (1 + 119892 (119909))

119892 (119909) = 100

sdot [5 +119899

sum119894=3

((119909119894minus 05)

2minus cos (20120587 (119909

119894minus 05)))]

(15)

where 119899 = 7 and 119909119894isin [0 1]

This test function has difficulty in the convergence to thePareto optimal hyperplane and contains (115minus1) local Paretooptimal fronts

DTLZ2 is defined as

1198911 (119909) = (1 + 119892 (119909)) cos (051205871199091) cos (051205871199092)

1198912 (119909) = (1 + 119892 (119909)) cos (051205871199091) sin (051205871199092)

1198913 (119909) = (1 + 119892 (119909)) sin (051205871199091)

119892 (119909) =119899

sum119894=3

(119909119894minus 05)

2

(16)

where 119899 = 12 and 119909119894isin [0 1]

The Pareto optimal solutions of this test function must lieinside the first octant of the unit sphere in a three-objectiveplot It is more difficult than DTLZ1

42 Parameter Values The parameter values of the proposedalgorithm UKMOPSO are adopted as follows

(i) Parameters for PSO the linearly descending inertiaweight [33 34] is within the interval [01 1] and theacceleration coefficients 119888

1and 1198882are both taken as 2

(ii) Parameter for PAM the minimal and maximal num-bers of clusters are 119870min = 2 and119870max = 10

(iii) Population size the population size119873pop is 200(iv) Parameters for the uniform design the number of

subintervals 119878 is 64 the number of the sample pointsor the population size of each subinterval 119876

0is 31 set

1198630= 31 and 119876

1= 5

(v) Stopping condition the algorithm terminates if thenumber of iterations is larger than the given maximalgenerations of 20

All the parameter values in UMOGA [23] are set equalto the original values in [23] the different and additional

parameter values between UMOGA and UKMOPSO are asfollows the number of subintervals 119878 is 16 119865 = 119873 (thenumber of variables)119863

1= 7 119901

119898= 002

NSGA-II in [25] adopted a population size of 100 acrossover probability of 08 a mutation probability of 1119899(where 119899 is the number of variables) and maximal genera-tions of 250 In order to make the comparisons fair we haveused population size of 100 and maximal generations of 40 inNSGA-II so that the total number of function evaluations inNSGA-II UKMOPSO and UMOGA is the same

43 Performance Metric Based on the assumption that thetrue Pareto front of a test problem is known many kinds ofperformance metrics have been proposed and used by manyresearchers such as [2 9 24 25 29 35ndash40] Three of themare adopted in the paper to compare the performance of theproposed algorithm with them The first metric is 119862 metric[29 38] It is taken as the quantitative metric of the solutionquality and often used to show that the outcomes of one algo-rithm dominating the outcomes of another algorithm It isdefined as

119862 (119860 119861) =|119887 isin 119861 exist119886 isin 119860 119886 ≺ 119887|

|119861| (17)

where 119886 ≺ 119887 represents 119887 being dominated by 119886 119860 representsthe Pareto front obtained by Algorithm 119860 and 119861 representsthe Pareto front obtained by Algorithm 119861 in a typical runFurthermore |119861| represents the number of elements of 119861

The metric value 119862(119860 119861) = 1 means that all points in 119861are dominated by or equal to points in 119860 In contrast 119862(119860119861) = 0means that none of the points in 119861 are covered by theones in 119860

The second metric is the IGD metric (namely InvertedGenerational Distance) [39ndash41] It is the mean value of thedistances from the set of solutions uniformly distributed inthe true Pareto front to the set of solutions obtained by analgorithm in objective space Let 119875lowast be a set of uniformly dis-tributed points along the true PF (Pareto Front) Let 119860 be anapproximate set to the PF the average distance from119875lowast to119860 isdefined as

IGD (119860 119875lowast) =sumVisin119875lowast 119889 (V 119860)

|119875lowast| (18)

where 119889(V 119860) is the minimum Euclidean distance between Vand the points in 119860 If |119875lowast| is large enough to represent thePF very well IGD(119860 119875lowast) could measure both the diversityand convergence of 119860 in a sense To have a low value ofIGD(119860 119875lowast) the set119860must be very close to the PF and cannotmiss any part of the whole PF

The smaller the IGDvalue for the set of obtained solutionsis the better the performance of the algorithm will be

The third measure is the measure of maximum spread(MS) [2 24 35] which is proposed in [42 43] It canmeasure how well the true Pareto front (PFtrue) is coveredby the discovered Pareto front (PFknown) through hyperboxes

Mathematical Problems in Engineering 9

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1100

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

152

True Pareto frontUMOGA

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

140

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

f2(x)

f1(x)

f2(x)

Figure 1 Pareto fronts obtained by different algorithms for test problem FON

formed by the extreme function values observed in the PFtrueand PFknown It is defined as

MS = radic 1

119872

119872

sum119894=1

120575119894

120575119894= (

min (119891max119894 119865max119894) minusmax (119891min

119894 119865min119894)

119865max119894

minus 119865min119894

)

2

(19)

where 119872 is the number of objectives 119891max119894

and 119891min119894

arethe maximum and minimum values of the 119894th objective inPFknown respectively and 119865

max119894

and 119865min119894

are the maximumand minimum values of the 119894th objective in PFtrue respec-tively

Note that if 119891min119894

ge 119865max119894

then 120575119894= 0 Algorithms with

largerMS values are desirable andMS = 1means that the truePareto front is totally covered by the obtained Pareto front

44 Results For each test problem we perform the proposedalgorithm (called UKMOPSO) for 30 independent runs andcompare its performance with UMOGA [23] and NSGA-II[25] The values of several metrics the 119862metric IGDmetricand MS metric are shown in Tables 2 3 4 and 5 The Paretofronts obtained by several algorithms implemented on severaltest functions are illustrated in Figures 1ndash6

For brevity 119862(UKMOPSOUMOGA) 119862(UMOGAUKMOPSO) 119862(UKMOPSONSGA-II) and 119862(NSGA-IIUKMOPSO) are respectively marked as 119862

12 11986221 11986213 and

11986231

10 Mathematical Problems in Engineering

0100

UKMOPSOTrue Pareto front

14

UMOGATrue Pareto front

40

NSGA-IITrue Pareto front

f1(x)

f2(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14

f1(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14 minus13 minus12minus12

minus10

minus8

minus6

minus4

minus2

0

f2(x)

f1(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14minus12

minus10

minus8

minus6

minus4

minus2

0

f2(x)

minus12

minus10

minus8

minus6

minus4

minus2

Figure 2 Pareto fronts obtained by different algorithms for test problem KUR

Table 2 Comparison of C metric between UKMOPSO andUMOGA

C (UKMOPSO UMOGA) C (UMOGA UKMOPSO)FON 0769 001KUR 0929 001ZDT1 012 013ZDT2 0063 003ZDT3 0139 0098ZDT6 0063 001DTLZ1 1 0DTLZ2 1 0

Table 3 Comparison of Cmetric between UKMOPSO and NSGA-II

C (UKMOPSO NSGA-II) C (NSGA-II UKMOPSO)FON 0025 001KUR 0075 011ZDT1 1 0ZDT2 1 0ZDT3 1 0ZDT6 1 0DTLZ1 0 013DTLZ2 0025 006

Mathematical Problems in Engineering 11

0

02

04

06

08

1

12

14100 76

0

05

1

15

2

25

325

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

f1(x)

f2(x)

f2(x)

Figure 3 Pareto fronts obtained by different algorithms for test problem ZDT1

Table 4 Comparison of IGD metric

UKMOPSO UMOGA NSGA-IIFON 00072 00432 03707KUR 00668 11576 12323ZDT1 00067 00265 13933ZDT2 00067 00727 21409ZDT3 01964 02151 02816ZDT6 00033 00139 59223DTLZ1 74827 22965 54182DTLZ2 01123 04319 02724

Table 5 Comparison of maximum spread (MS) metric

UKMOPSO UMOGA NSGA-IIFON 09769 09754 02368KUR 09789 09580 03414ZDT1 09429 09984 05495ZDT2 09999 09986 00019ZDT3 09276 09275 07963ZDT6 1 09974 03738DTLZ1 1 1 06588DTLZ2 1 1 07125

12 Mathematical Problems in Engineering

0

02

04

06

08

1

12

14100 9

0

05

1

15

2

25

38

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x) f1(x)

f2(x)

f2(x)

f2(x)

f1(x)

Figure 4 Pareto fronts obtained by different algorithms for test problem ZDT2

As shown in Table 2 for the test functions Fon and KUR11986212= 0769 and 119862

12= 0929 mean that 768 and 929 of

the solutions obtained by UMOGA are dominated by thoseobtained by UKMOPSO Similarly 119862

21= 001means that 1

of the solutions obtained by UKMOPSO are dominated bythose obtained by UMOGA For DTLZ1 and DTLZ2119862

12= 1

and 11986221= 0 mean that all solutions obtained by UMOGA

are dominated by those obtained by UKMOPSO but noneof solutions from UKMOPSO is dominated by those fromUMOGA It means that the solution quality via UKMOPSOis much better than that via UMOGA for the above testfunctions For ZDT1 ZDT2 ZDT3 and ZDT6 the valuesof the 119862 metrics do not have too many differences betweenUKMOPSO and UMOGA It means the solution qualities viaboth are almost identical

FromTable 3 we can see that for ZDT1 ZDT2 ZDT3 andZDT6 119862

13= 1 and 119862

31= 0mean that all solutions obtained

byNSGA-II are dominated by those obtained byUKMOPSObut none of solutions from UKMOPSO is dominated bythose from NSGA-II It means that the solution quality viaUKMOPSO is much better than that via NSGA-II for theabove test functions For the rest of the test functions thesolution qualities via both are almost identical

In Table 4 for all test problems the IGD values ofUKMOPSO are the smallest among UKMOPSO UMOGAand NSGA-II This means the PF found by UKMOPSO isthe nearest to the true PF compared with the PF obtainedby the other two algorithms namely the performance ofUKMOPSO is the best in the three algorithms The IGDvalues of UMOGA are all larger than those of NSGA-II for

Mathematical Problems in Engineering 13

0

02

04

06

08

1 92 86

0

05

1

15

2

25

3

35

4

45 40

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

f2(x)

minus08

minus06

minus04

minus02

0

02

04

06

08

1

f2(x)

f2(x)

minus08

minus05

minus06

minus04

minus02

UKMOPSO UMOGA

NSGA-II

Figure 5 Pareto fronts obtained by different algorithms for test problem ZDT3

all test functions It means that the PF found by UMOGA iscloser to the true PF than the PF obtained by NSGA-II

From Table 5 we can see that all the MS values ofUKMOPSO are almost close to 1 for each test function Itmeans that almost all the true PF is totally covered by the PFobtained by UKMOPSO UMOGA is similar to UKMOPSOThe MS values of NSGA-II are much lesser than UKMOPSOand UMOGA especially MS = 00019 for ZDT2

Figures 1ndash6 all demonstrate that the PF found byUKMOPSO is the nearest to the true PF and scatters mostuniformly in those obtained by three algorithms Most of thepoints in the PF found by UKMOPSO overlap the pointsin the true PF It means the solution quality obtained byUKMOPSO is very high

45 Influence of the Uniform Crossover and PAM In orderto find out the influence of the uniform crossover on the

proposed algorithm we compare the distribution and num-ber of Pareto optimal solutions before and after performingthe uniform crossover One of the simulating results on thetest function ZDT1 is shown in Figure 7

From Figure 7 it can be seen that before and afterperforming the uniform crossover the number of data pointsin the 1st cluster and the 2nd cluster varies from 7 to 15and from 19 to 26 respectively namely many new Paretooptimal solutions having not been found before performingthe uniform crossover are generated and the differences ofthe data points between two clusters have been decreasedThis will directly influence the uniformity of the PF andacquire more uniform Pareto solutions

PAM is used to determine which Pareto solutions are tobe removed fromor be inserted into the external archiveThisis to maintain the diversity of Pareto optimal solutions The

14 Mathematical Problems in Engineering

100 16

0

05

1

15

2

25

3

35

4

45

540

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUKMOPSO

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

f2(x)

f2(x)

f1(x)

Figure 6 Pareto fronts obtained by different algorithms for test problem ZDT6

diversity can be computed by the ldquodistance-to-average-pointrdquomeasure [44] defined as

diversity (119878) = 1

|119878|

|119878|

sum119894=1

radic119873

sum119895=1

(119901119894119895minus 119901119895)2

(20)

where 119878 is the population |119878| is the swarm size 119873 is thedimensionality of the problem 119901

119894119895is the 1198951015840th value of the 1198941015840th

particle and 119901119895is the 1198951015840th value of the average point 119901

If the number of Pareto optimality is less than or equalto the size of the external archive PAM has no influenceon the proposed algorithm Otherwise it is used to selectthe different type of Pareto optimality from several clustersTherefore The diversity of Pareto optimality will certainlyincreaseWemonitor the diversity of Pareto optimality before

and after performing PAMonZDT1 at a certain timeThe val-ues are respectively 8762 and 11752 This fully demonstratesthat PAM can improve the diversity of Pareto optimality

5 Conclusion and Future Work

In this paper a multiobjective particle swarm optimizationbased on PAM and uniform design is presented It firstlyimplements PAM to partition the Pareto front into119870 clustersin the objective space and then it implements the uniformcrossover operator on the minimal cluster so as to generatemore Pareto optimality in the minimal cluster When thesize of the Pareto solution is larger than that of the externalarchive PAM is used to determine which Pareto solutions areto be removed from or be inserted into the external archive

Mathematical Problems in Engineering 15

0 01 02 03 04 05 06 07 08 09

19

10

01

02

03

04

05

06

077

08

09

1

The 1st clusterThe 2nd cluster

26

(a)

41

0 01 02 03 04 05 06

26

07 08 09 10

01

02

03

04

051506

07

08

09

1

The 1st clusterThe 2nd cluster

(b)

Figure 7 Pareto optimal solutions before (a) and after (b) performing the uniform crossover

Finally it keeps all the points in the lesser clusters anddiscards some points in the larger clusters This can ensurethat each of the clusters will contain approximately the samedata points Therefore the diversity of the Pareto solutionswill increase and they can scatter uniformly over the Paretofront The results of the experimental simulation performedon several well-known test problems indicate that theproposed algorithm obviously outperforms the other twoalgorithms

This algorithm is going on for further enhancement andimprovement One attempt is to use a more efficient orapproximate clustering algorithm to speed up the executiontime of this algorithm Another attempt is to extend its appli-cation scopes

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This research was supported by Guangxi Natural ScienceFoundation (no 2013GXNSFAA019337) Guangxi Universi-ties Key Project of Science and Technology Research (noKY2015ZD099) Key project of Guangxi Education Depart-ment (no 2013ZD055) Scientific Research Staring Founda-tion for the PHD Scholars of Yulin Normal University (noG2014005) Special Project of Yulin Normal University (no2012YJZX04) and Key Project of Yulin Normal University(no 2014YJZD05)The authors are grateful to the anonymousreferee for a careful checking of the details and for helpfulcomments that improved this paper

References

[1] J D Schaffer ldquoMultiple objective optimization with vector eval-uated genetic algorithmsrdquo in Proceedings of the 1st InternationalConference on Genetic Algorithms pp 93ndash100 1985

[2] L Tang and X Wang ldquoA hybrid multiobjective evolutionaryalgorithm for multiobjective optimization problemsrdquo IEEETransactions on Evolutionary Computation vol 17 no 1 pp 20ndash45 2013

[3] K Deb S Agrawal A Pratap and T Meyarivan ldquoA fast elitistnon-dominated sorting genetic algorithm for multi-objectiveoptimizationNSGA-IIrdquo inParallel ProblemSolving fromNaturePPSN VI vol 1917 of Lecture Notes in Computer Science pp849ndash858 Springer Berlin Germany 2000

[4] J Zhang Y Wang and J Feng ldquoAttribute index and uniformdesign based multiobjective association rule mining with evo-lutionary algorithmrdquo The Scientific World Journal vol 2013Article ID 259347 16 pages 2013

[5] S Jiang and Z Cai ldquoFaster convergence and higher hypervol-ume for multi-objective evolutionary algorithms by orthogonaland uniform designrdquo in Advances in Computation and Intelli-gence vol 6382 of Lecture Notes in Computer Science pp 312ndash328 Springer Berlin Germany 2010

[6] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995

[7] R Eberhart and J Kennedy ldquoA new optimizer using particleswarm theoryrdquo in Proceedings of the 6th International Sympo-sium on Micro Machine and Human Science pp 39ndash43 IEEENagoya Japan October 1995

[8] L Yang ldquoA fast and elitist multi-objective particle swarm algo-rithm NSPSOrdquo in Proceedings of the IEEE International Con-ference on Granular Computing (GRC rsquo08) pp 470ndash475 August2008

[9] C A C Coello G T Pulido and M S Lechuga ldquoHandlingmultiple objectives with particle swarm optimizationrdquo IEEE

16 Mathematical Problems in Engineering

Transactions on Evolutionary Computation vol 8 no 3 pp256ndash279 2004

[10] M Clerc and J Kennedy ldquoThe particle swarm-explosion sta-bility and convergence in a multidimensional complex spacerdquoIEEE Transactions on Evolutionary Computation vol 6 no 1pp 58ndash73 2002

[11] H Xiaohui and R Eberhart ldquoMultiobjective optimization usingdynamic neighborhood particle swarm optimizationrdquo in Pro-ceedings of the Congress on Evolutionary Computation (CEC rsquo02)vol 2 pp 1677ndash1681 May 2002

[12] S Mostaghim and J Teich ldquoStrategies for finding goodlocal guides in multi-objective particle swarm optimization(MOPSO)rdquo in Proceedings of the IEEE Swarm IntelligenceSymposium (SIS rsquo03) pp 26ndash33 Indianapolis Ind USA April2003

[13] T Bartz-Beielstein P Limbourg J Mehnen K Schmitt K EParsopoulos and M N Vrahatis ldquoParticle swarm optimizersfor Pareto optimizationwith enhanced archiving techniquesrdquo inProceedings of the Congress on Evolutionary Computation (CECrsquo03) vol 3 pp 1780ndash1787 December 2003

[14] K E Parsopoulos D K Tasoulis and M N Vrahatis ldquoMulti-objective optimization using parallel vector evaluated particleswarm optimizationrdquo in Proceedings of the IASTED Interna-tional Conference on Artificial Intelligence and Applications (AIArsquo04) vol 2 pp 823ndash828 February 2004

[15] S-J Tsai T-Y Sun C-C Liu S-T Hsieh W-C Wu and S-YChiu ldquoAn improved multi-objective particle swarm optimizerformulti-objective problemsrdquo Expert Systems with Applicationsvol 37 no 8 pp 5872ndash5886 2010

[16] A A Mousa M A El-Shorbagy and W F Abd-El-WahedldquoLocal search based hybrid particle swarm optimization algo-rithm for multiobjective optimizationrdquo Swarm and Evolution-ary Computation vol 3 pp 1ndash14 2012

[17] J Wang W Liu W Zhang and B Yang ldquoMulti-objective parti-cle swarm optimization based on self-update and grid strategyrdquoin Proceedings of the 2012 International Conference on Informa-tion Technology and Software Engineering vol 211 of LectureNotes in Electrical Engineering pp 869ndash876 Springer BerlinGermany 2013

[18] K Khalili-Damghani A-R Abtahi and M Tavana ldquoA newmulti-objective particle swarmoptimizationmethod for solvingreliability redundancy allocation problemsrdquo Reliability Engi-neering amp System Safety vol 111 pp 58ndash75 2013

[19] J Zhang Y Wang and J Feng ldquoParallel particle swarm opti-mization based on PAMrdquo in Proceedings of the 2nd InternationalConference on Information Engineering and Computer Science(ICIECS rsquo10) Wuhan China December 2010

[20] J Han and M Kamber Data Mining Concepts and TechniquesMorgan Kaufmann 2006

[21] L F Ibrahim ldquoUsing of clustering algorithm CWSP-PAM forrural network planningrdquo in Proceedings of the 3rd InternationalConference on Information Technology and Applications (ICITArsquo05) vol 1 pp 280ndash283 IEEE July 2005

[22] P Chopra J Kang and J Lee ldquoUsing gene pair combinationsto improve the accuracy of the PAM classifierrdquo in Proceedingsof the IEEE International Conference on Bioinformatics andBiomedicine (BIBM rsquo09) pp 174ndash177 November 2009

[23] Y-W Leung and Y Wang ldquoMultiobjective programming usinguniform design and genetic algorithmrdquo IEEE Transactions onSystems Man and Cybernetics Part C Applications and Reviewsvol 30 no 3 pp 293ndash304 2000

[24] D Liu K C Tan C K Goh and W K Ho ldquoA multiobjectivememetic algorithm based on particle swarm optimizationrdquoIEEE Transactions on Systems Man and Cybernetics Part BCybernetics vol 37 no 1 pp 42ndash50 2007

[25] K Deb A Pratap S Agarwal and T Meyarivan ldquoA fast andelitist multiobjective genetic algorithm NSGA-IIrdquo IEEE Trans-actions on Evolutionary Computation vol 6 no 2 pp 182ndash1972002

[26] C M Fonseca and P J Fleming ldquoMultiobjective optimizationandmultiple constraint handling with evolutionary algorithmsII Application examplerdquo IEEE Transactions on Systems Manand Cybernetics Part A Systems and Humans vol 28 no 1 pp38ndash47 1998

[27] F Kursawe ldquoA variant of evolution strategies for vector opti-mizationrdquo in Parallel Problem Solving from Nature vol 496 ofLecture Notes in Computer Science pp 193ndash197 Springer BerlinGermany 1991

[28] E Zitzler K Deb and LThiele ldquoComparison of multiobjectiveevolutionary algorithms empirical resultsrdquo Evolutionary Com-putation vol 8 no 2 pp 173ndash195 2000

[29] H-L Liu Y Wang and Y-M Cheung ldquoA Multi-objective evo-lutionary algorithm usingmin-max strategy and sphere coordi-nate transformationrdquo Intelligent Automation amp Soft Computingvol 15 no 3 pp 361ndash384 2009

[30] N Chakraborti B S Kumar V S Babu S Moitra andA Mukhopadhyay ldquoA new multi-objective genetic algorithmapplied to hot-rolling processrdquoAppliedMathematicalModellingvol 32 no 9 pp 1781ndash1789 2008

[31] K Deb ldquoMulti-objective genetic algorithms problem difficul-ties and construction of test problemsrdquo Evolutionary Computa-tion vol 7 no 3 pp 205ndash230 1999

[32] K Deb L Thiele M Laumanns and E Zitzler ldquoScalable testproblems for evolutionarymultiobjective optimizationrdquo in Evo-lutionary Multiobjective Optimization Advanced Informationand Knowledge Processing pp 105ndash145 Springer London UK2005

[33] Y Shi and R Eberhart ldquoModified particle swarm optimizerrdquoin Proceedings of the IEEE World Congress on ComputationalIntelligence pp 69ndash73 May 1998

[34] C S Feng S Cong and X Y Feng ldquoA new adaptive inertiaweight strategy in particle swarm optimizationrdquo in Procedingsof the IEEE Congress on Evolutionary Computation (CEC rsquo07)pp 4186ndash4190 IEEE Singapore September 2007

[35] C K Goh K C Tan D S Liu and S C Chiam ldquoA competitiveand cooperative co-evolutionary approach to multi-objectiveparticle swarm optimization algorithm designrdquo European Jour-nal of Operational Research vol 202 no 1 pp 42ndash54 2010

[36] P Cheng J-S Pan L Li Y Tang and C Huang ldquoA surveyof performance assessment for multiobjective optimizersrdquo inProceedings of the 4th International Conference on Genetic andEvolutionary Computing (ICGEC rsquo10) pp 341ndash345 December2010

[37] E Zitzler L Thiele M Laumanns C M Fonseca and V GDa Fonseca ldquoPerformance assessment of multiobjective opti-mizers an analysis and reviewrdquo IEEE Transactions on Evolution-ary Computation vol 7 no 2 pp 117ndash132 2003

[38] E Zitzler and L Thiele ldquoMultiobjective evolutionary algo-rithms a comparative case study and the strength Paretoapproachrdquo IEEETransactions onEvolutionaryComputation vol3 no 4 pp 257ndash271 1999

[39] H Li and Q Zhang ldquoMultiobjective optimization problemswith complicated pareto sets MOEAD and NSGA-IIrdquo IEEE

Mathematical Problems in Engineering 17

Transactions on Evolutionary Computation vol 13 no 2 pp284ndash302 2009

[40] Q Zhang A Zhou S Zhao P N Suganthan W Liu andS Tiwari ldquoMultiobjective optimization test instances for theCEC 2009 special session and competitionrdquo Working ReportDepartment of Computing and Electronic Systems Universityof Essex 2008

[41] Q Zhang A Zhou and Y Jin ldquoRM-MEDA a regularity model-based multiobjective estimation of distribution algorithmrdquoIEEE Transactions on Evolutionary Computation vol 12 no 1pp 41ndash63 2008

[42] C-S Tsou H-H Fang H-H Chang and C-H Kao ldquoAnimproved particle swarmPareto optimizerwith local search andclusteringrdquo in Simulated Evolution and Learning vol 4247 ofLecture Notes in Computer Science pp 400ndash407 2006

[43] U Wickramasinghe and X Li ldquoChoosing leaders for multiob-jective PSO algorithms using differential evolutionrdquo in Proceed-ing of the 7th International Conference Simulated Evolution andLearning vol 5361 of Lecture Notes in Computer Science pp249ndash258 Springer Berlin Germany 2008

[44] T Krink J S Vesterstrom and J Riget ldquoParticle swarm opti-misation with spatial particle extensionrdquo in Proceedings of theCongress on Evolutionary Computation (CEC rsquo02) pp 1474ndash1479 May 2002

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 3: Research Article Multiobjective Particle Swarm

Mathematical Problems in Engineering 3

Algorithm PAM a 119896-medoids algorithm for partitioning based on medoid or central objectsInput119896 the number of clusters119863 a data set containing 119899 objectsOutput A set of 119896 clustersMethod(1) Arbitrarily choose 119896 objects in119863 as the initial representative objects or seeds(2) Repeat(3) Assign each remaining object to the cluster with the nearest representative object(4) Randomly select a nonrepresentative object 119900 random(5) Compute the total cost 119878 of swapping representative object 119900

119895 with 119900 random

(6) if 119878 lt 0 then swap 119900119895with 119900 random to form the new set of 119896 representative objects

(7) Until no change

Algorithm 1 PAM algorithm

extreme values than 119896-means PAM (Partitioning AroundMedoids) is the first and the most frequently used 119896-medoidsalgorithms It is shown in Algorithm 1

PAM constructs 119896 partitions (clusters) of the givendataset where each partition represents a cluster Each clustermay be represented by a centroid or a cluster representativewhich is some sort of summary description of all the objectscontained in a cluster It needs to determine 119896 partitions for 119899objectsThe process of PAM is by and large as follows Firstlyrandomly select 119896 representative objects and cluster otherobjects to the same group as the representative object accord-ing to the minimum distances between the representativeobject and other objects Then try to replace these represen-tative objects with other nonrepresentative objects in orderto minimize squared error All the possible pairs of objectsare analyzed where one object in each pair is considered arepresentative object and the other is not The total cost ofthe clustering is calculated for each such combination Anobject will be replaced with such an object having mini-mized squared error The set of best objects in each clusterafter iteration forms the representative objects for the nextiteration The final set of representative objects is the respec-tive centroids of the clusters [19ndash22]

24 Uniform Design

241 Uniform Design Themain objective of uniform designis to sample a small set of points from a given set of pointssuch that the sampled points are uniformly scattered [23]

Let there be 119899 factors and 119902 levels per factor When 119899 and119902 are given the uniform design selects 119902 from 119902119899 possiblecombinations such that these 119902 combinations are uniformlyscattered over the space of all possible combinations Theselected 119902 combinations are expressed as a uniform array119880(119899 119902) = [119880

119894119895]119902 times 119899 where 119880

119894119895is the level of the 119895th factor

in the 119894th combination and can be calculated by the followingformula

119880119894119895= (119894120590119895minus1 mod 119902) + 1 (4)

where 120590 is a parameter given in Table 1

242 Improved Generation of Initial Population An algo-rithm for dividing the solution space and an algorithm forgenerating the initial population have been designed [23]However Algorithm 2 in [23] considers only the dividingof the solution space not the dividing of the 119873-dimensionspace This will bring about some serious problems If weassume that in Step 2 119876

0= 5 and 119873 = 10 then 119880(119873119876

0)

is impossible to be generated because 1198760must be larger than

119873 Namely Algorithm 2 in [23] is only suitable for the lowdimensional problem In order to overcome the shortcom-ings the dividing of the119873-dimension space is introduced toimprove the algorithm The improved algorithm can be suit-able for not only low dimensional but also high dimensionalproblem The improved algorithm is shown as follows

Algorithm A (improved generation of initial population)

Step 1 Judge whether1198760is valid or not as it must be found in

the 1st column of Table 1 If it is not valid it stops and showserror messages otherwise it continues

Step 2 Execute Algorithm 1 in [23] to divide [119897 119906] into 119878subspaces [119897(1) 119906(1)] [119897(2) 119906(2)] sdot sdot sdot [119897(119878) 119906(119878)]

Step 31 Judge whether1198760is more than119873 if yes turn to Step

32 otherwise turn to Step 33

Step 32 Quantize each subspace and apply the uniform array119880(119873119876

0) to sample 119876

0points

Step 33 Divide119873-dimension space into lfloor1198731198730rfloor parts where

1198730is an integer more than 1 and less than119876

0and is generally

taken as1198760minus1 Among lfloor119873119873

0rfloorparts the 1st part corresponds

to the dimension from 1 to1198730 and the 2nd part corresponds

to the dimension from 1198730+ 1 to 2 lowast 119873

0 and so forth If

the remainder 119877 of 1198731198730is not equal to 0 then a plus part

corresponds to the dimension from lfloor1198731198730rfloor lowast 119873

0+ 1 to 119873

whose length is surely less than1198730 Repeat to execute Step 34

for each part

Step 34 Quantize each subspace and then apply uniformarray 119880(119873

0 1198760) to sample 119876

0points It is noteworthy that

4 Mathematical Problems in Engineering

Table 1 Values of the parameter 120590 for different number of factorsand different number of levels per factor

Number of levelsper factors Number of factors 120590

5 2sim4 27 2sim6 311 2sim10 7

132 53 4

4sim12 617 2sim16 10

19 2sim3 84sim18 14

232 13sim14 20sim22 7

8sim12 153sim7 15sim19 17

29

2 123 9

4sim7 168sim12 16sim24 8

13sim15 1425sim28 18

31 2 5sim12 20sim30 123sim4 13sim19 22

119880(1198730 1198760) is replaced with 119880(119877119876

0) in the plus part where

the remainder 119877 is equal to119873 minus lfloor1198731198730rfloor lowast 1198730

243 Crossover Operator Based on the Uniform Design Thecrossover operator based on the uniform design acts ontwo parents It quantizes the solution space defined by theseparents into a finite number of points and then it applies theuniformdesign to select a small sample of uniformly scatteredpoints as the potential offspring

For any two parents 1199011and 119901

2 their minimal values

and the maximal values of each dimension are used toform the novel solution space [119897parent 119906parent] Each domain of[119897parent 119906parent] is quantized into 119876

1levels where 119876

1is a pre-

defined prime numberThen the uniform design is applied toselect a sample of points as the potential offspringThe detailsof the algorithm can be referred to [23]

3 The Proposed Algorithm

31 Elitist Selection or Elitism [4] Elitism means that eliteindividuals cannot be excluded from the mating pool of thepopulation A strategy presented can always include the bestindividual of the current population in the next generation inorder to prevent the loss of good solutions found so far Thisstrategy can be extended to copy the best 119899 individuals to thenext generation This is explanation of the elitism In MOPelitism plays an important role

Two strategies are often used to implement elitism Onemaintains elitist solutions in the population the other stores

population paretocreate (pop)Input pop indicates the populationOutput pareto indicates the non-dominated solutionsparetolarr popfor forall chr1 isin popfor forall chr2 isin pareto and chr2 = chr1if chr1 dominate chr2

paretolarr pareto minus chr2end if

end forend forreturn pareto

Algorithm 2 Pseudocode of selecting elitist

elitist solutions into an external secondary list or externalarchive and reintroduces them to the populationThe formercopies all nondominated solutions in the current populationto the next population and then fills the rest of the next popu-lation by selecting from the remaining dominated solutions inthe current populationThe latter uses an external secondarylist or external archive to store the elitist solutions Externallist stores the nondominated solutions found It will beupdated in the next generation by means of removing elitistsolutions dominated by a new solution or adding the newsolution if it is not dominated by any existing elitist solution

The work adopts the second strategy namely storing eli-tist solutions to an external secondary list Its advantage is thatit can preserve and dynamically adjust all the nondominatedsolutions set till the current generation The pseudocodes ofselecting elitist and updating elitist are respectively shown inAlgorithms 2 and 3

32 Selection Mechanism for the Swarm Based on UniformDesign This paper adopts the uniform design to select thebest 119879

1points (119879

1lt 119879) from the 119879 points and acquires their

objectives fit119881 The detailed steps are as follows

Algorithm B

Step 1 Calculate each of119872 objectives for each of the119879 pointsnormalize each of objectives 119891

119894(119909) as follows

ℎ119894119909 =

119891119894 (119909)

max119910isin120595

1003816100381610038161003816119891119894 (119910)1003816100381610038161003816 (5)

where 120595 is a set of points in the current population and ℎ119894119909 is

the normalized objective

Step 2 Apply the uniform design to generate the 1198630weight

vectors 1199081 1199082 119908

1198630 each of them is used to compose

one fitness function by the following formula where 1198630is a

design parameter and it is prime

fit119894= 1199081198941ℎ1 (119909) + 1199081198942ℎ2 (119909) + sdot sdot sdot + 1199081198941198630ℎ1198630 (119909)

1 le 119894 le 1198630

(6)

Mathematical Problems in Engineering 5

population paretoupdate(offspring pareto)Input offspring indicates the offsprings after performing the crossover operator

and mutation operator pareto indicates the non-dominated solutionsOutput pareto indicates the non-dominated solutionsoffspringlarr call paretocreate(offspring)for forall chr1 isin offspring

nondominatedlarr truefor forall chr2 isin paretoif chr1 dominate chr2paretolarr pareto minus chr2

else if chr2 dominate chr1nondominatedlarr false

elsecontinue

end ifend forif nondominatedparetolarr pareto cup chr1

end ifend forreturn pareto

Algorithm 3 Pseudocode of updating elitist

Step 3 Based on each of fitness functions evaluate the qualityof the 119879 points Assume the remainder 119879

11198630is 1198770 For the

first 1198770fitness functions select the best lceil119879

11198630rceil points for

the rest of fitness functions select the best lfloor11987911198630rfloor points

Overall a total of 1198791points are selected The objectives of

these selected points are correspondingly stored into fit119881

33 Selection Mechanism for Gbest Based on PAM InMOPSO 119866119887119890119904119905 plays an important role in guiding the entireswarm toward the global Pareto front [24] In contrast tothe single objective PSO having only one global best 119866119887119890119904119905MOPSO has multiple Pareto optimality solutions which arenondominated each other How to select a suitable 119866119887119890119904119905 foreach of particles fromPareto optimal solutions is one very keyissue

The paper presents the selection mechanism for 119866119887119890119904119905based on PAM as follows

Algorithm C Assume the population and the numbers ofparticles are denoted as pop and119873pop and the Pareto opti-mality and the numbers of Pareto optimality are denoted aspareto and119873119875

Step 1 Acquire the number of cluster according to the fol-lowing formula

119870 = 119870min + (119870max minus 119870min) sdot (119905

max119866) (7)

where 119870min and 119870max respectively denote the minimal andmaximal value of 119870 namely 119870 isin [119870min 119870max] 119905 and max119866indicate the 119905th iteration and the maximal iteration numberThe formula can acquire the linearly increasing number ofcluster so as to accord with the process from the coarse searchto the elaborate search

Step 2 If 119873119875 le 119870 then for each particle in pop find thenearest Pareto optimal solution as its 119866119887119890119904119905 Otherwise turnto Step 3

Step 3 Perform PAM described in Section 23 to partitionPareto and pop into 119870 clusters respectively the clustercentroids of which are denoted as119862

1= 11988811 11988812 119888

1119870 and

1198622= 11988821 11988822 119888

2119870

Step 4 For each 1198882119895isin 1198622 find the nearest 119888

1119894isin 1198621 For

all the particles in the cluster represented by 1198882119895 their 119866119887119890119904119905

randomly takes one of Pareto optimal solutions in the clusterrepresented by 119888

1119894

34 Adjustment of Pareto Optimal Solutions Based on theUniform Design and PAM The number of Pareto optimalsolutions in the external archivewill increasingly enlargewithevolution namely the size of the external archive will becomevery large with the iteration number This will increasethe computation complexity and the execution time of analgorithmTherefore the size of the external archive must becontrolled In the meanwhile Pareto optimal solutions in theexternal archive are possibly close to each other in the objec-tive space If the solutions are close to each other they arenearly the same choice It is desirable to find the Pareto opti-mal solutions scattered uniformly over the Pareto front sothat the decision maker can have a variety of choices There-fore how to control the size of the external archive and selectrepresentative Pareto optimal solutions scattered uniformlyover the Pareto front is a key issue

The paper presents the adjustment algorithm of Paretooptimal solutions based on the uniform design and PAMThe algorithm firstly implements PAM to partition thePareto front into 119870 clusters in the objective space and then

6 Mathematical Problems in Engineering

implements the uniform crossover operator on the minimalcluster so as to generate more Pareto optimality in theminimal cluster finally it keeps all the points in the lesserclusters and discards some points in the larger clusters Thedetailed steps of the algorithm are shown as follows

Algorithm DAssume the Pareto optimality their values of119872functions and the numbers of Pareto optimality are denotedas pareto 119865pareto and119873119875 The size of the external archive isassumed as119873119875 lim The number of clusters is 119870

Step 1 If the numbers of data points in the maximal andthe minimal clusters have too many differences or are bothvery small select all the data points from the minimal clusterand the same number of points from the cluster which isthe nearest to the minimal cluster For each pair of datapoints perform the uniform crossover operator described inSection 243 Update pareto 119865pareto and 119873119875 according toAlgorithm 3

Step 2 If 119873119875 gt 119873119875 lim implement PAM to partition the119865pareto into119870 clusters in the objective space and let119873 div =119873119875 lim 119870 it indicates the average points to select from eachof119870 clustersWe can classify three situations according to thenumber of points in each cluster and119873 div as follows

(i) Situation 1 the numbers of points in all clusters arelarger than or equal to119873 div

(ii) Situation 2 the number of data points is larger thanor equal to119873 div only in one cluster

(iii) Situation 3 the number of the clusters in which thenumber of data points is larger than 119873 div is largerthan 1 and less than119870

Step 3 For Situation 1 sort all clusters according to their con-taining points in ascending order Select 119873 div points fromthe previous 119870 minus 1 clusters and select the remainder points119873119875 limminus119873 div lowast(119870 minus 1) from the last cluster Turn to Step7

Step 4 For Situation 2 keep the points of all the clusters inwhich the number of the points is less than or equal to119873 divand the remainder points are selected from that only clusterTurn to Step 7

Step 5 For Situation 3 keep all the points of all the clustersin which the number of points is less than or equal to119873 divTurn to Step 6 to select the remainder point

Step 6 Recalculate 119873 div and let 119873 div = rem1198751198701015840 whererem119875 and 1198701015840 indicate the number of the remainder pointsand clusters According to the new 119873 div turn to Step 3 orStep 4

Step 7 Save the selected 119873119875 lim points and terminate thealgorithm

35 Thoughts on the Proposed Algorithm In MOP for 119872objectives 119891

1 1198912 119891

119872and any two points 119909 and 119910 in the

feasible solution space Ω if each objective is satisfied with119891119894(119909) le 119891

119894(119910) and at least one objective is satisfied with

119891119894(119909) lt 119891

119894(119910) namely 119909 is at least as good as 119910with respect to

all the objectives and 119909 is strictly better than 119910 with respectto at least one objective then we say that 119909 dominates 119910 Ifno other solution is strictly better than 119909 then 119909 is called anondominated solution or Pareto optimal solution

In single objective problems there is only one objective tobe optimized Therefore the global best (119866119887119890119904119905) of the wholeswarm is only one In multiobjective problems there aremultiple consistent or conflicting objectives to be optimizedTherefore there exist very large or infinite numbers of solu-tions which cannot dominate each other These solutions arenondominated solutions or Pareto optimal solutions There-fore 119866119887119890119904119905 of the whole swarm is more than one How toselect suitable 119866119887119890119904119905 is a very key issue

When it is not possible to find all these solutions it maybe desirable to find as many solutions as possible in order toprovide more choices to the decision maker However if thesolutions are close to each other they are nearly of the samechoice It is desirable to find the Pareto optimal solutionsscattered uniformly over the Pareto front so that the decisionmaker can have a variety of choices The paper introducesuniform design to ensure that the acquired solutions scatteruniformly over the Pareto front in the objective space

For the MOPSO the diversity of the population is a veryimportant factor It has a key impact on the convergence ofan algorithm and uniformly distribution of Pareto optimalsolution It can effectively get rid of the premature of thealgorithmsThe paper introduces PAM clustering algorithmsto maintain the diversity of the population The particles inthe same cluster have similar features whereas ones in differ-ent clusters have dissimilar features Thus choosing particlesfrom different clusters may necessarily increase the diversityof the population

36 Steps of the Proposed Algorithm

Step 1 According to the population size 119873pop determinethe number of subintervals 119878 and the population size 119876

0in

each subinterval such that 1198760lowast 119878 is more than 119873pop 119876

0is

a prime and must exist in Table 1 Execute Algorithm A inSection 242 to generate a temporary population containing1198760lowast 119878 ge 119873pop points

Step 2 PerformAlgorithm B described in Section 32 to selectthe best 119873pop points from the 119876

0lowast 119878 points as the initial

population pop and acquire their objectives fit119881

Step 3 Initialize the speed and119875119887119890119904119905 of the particle 119894 by119881[119894] =pop[119894] and 119875119887119890119904119905[119894] = pop[119894] 119891119875119887119890119904119905[119894] = fit119881[119894] where 119894 =1 sdot sdot sdot 119873pop 119891119875119887119890119904119905[119894] denotes objectives of 119875119887119890119904119905[119894]

Step 4 According to Algorithm 2 select elitist or Paretooptimal solutions from pop and store them to the externalsecondary list pareto the size of which is assumed as119873119875

Step 5 Choose the suitable119866119887119890119904119905 for each of the particles frompareto in terms of Algorithm C described in Section 33

Mathematical Problems in Engineering 7

Step 6 Update the position pop and velocity 119881 for each ofthe particles respectively using formulas (2) and (3) whereformula (2) is modified as follows

V119894119895 (119896 + 1) = 119908 sdot V119894119895 (119896)

+ 1198881sdot 1199031sdot (119875119887119890119904119905

119894119895 (119896) minus 119909119894119895 (119896))

+ 1198882times 1199032times (119866119887119890119904119905

119894119895 (119896) minus 119909119894119895 (119896))

(8)

Step 7 For the 119895th dimensional variable of the 119894th particle ifit goes beyond its lower boundary or upper boundary then ittakes the 119895th dimensional value of its corresponding bound-ary and the 119895th dimensional value of its velocity takes theopposite value

Step 8 Calculate each of119872 objectives for each of the particlesand update them into fit119881

Step 9 Update 119875119887119890119904119905 of each particle as followsFor the 119894th particle if fit119881[119894] dominates 119891119875119887119890119904119905[119894] then

let 119891119875119887119890119904119905[119894] = fit119881[119894] and 119875119887119890119904119905[119894] = pop[119894] otherwise119875119887119890119904119905[119894] and119891119875119887119890119904119905[119894] are kept unchanged If neither of themis dominated by the other then randomly select one of themas 119891119875119887119890119904119905[119894] and update its corresponding 119875119887119890119904119905[119894]

Step 10 Update the external archive pareto storing Paretooptimality and its size119873119875 according to Algorithm 3

Step 11 Implement Algorithm D described in Section 34 toadjust the Pareto optimal solutions such that the number ofPareto optimal solutions is less than or equal to the size of theexternal archive119873119875 lim

Step 12 If the stop criterion is satisfied terminate algorithmotherwise turn to Step 5 and continue

4 Numerical Results

In order to evaluate the performances of the proposedalgorithm we compare it with two outstanding algorithmsUMOGA [23] and NSGA-II [25]

41 Test Problems Several well-known multiobjective testfunctions are used to test the performance of the proposedalgorithm

Biobjective problem FON [25 26] KUR [25 27]ZDT1 ZDT2 ZDT3 and ZDT6 [25 28ndash31]Three-objective problems DTLZ1 and DTLZ [2 32]

The definitions of them are as followsFON is defined as

1198911 (119909) = 1 minus exp(minus

3

sum119894=1

(119909119894minus1

radic3)2

)

1198912 (119909) = 1 minus exp(minus

3

sum119894=1

(119909119894+1

radic3)2

)

(9)

where 119909119894isin [minus4 4]

This test function has a nonconvex Pareto optimal frontKUR is defined as

1198911 (119909) =

119899minus1

sum119894=1

(minus10 exp (minus02 sdot radic1199092119894+ 1199092119894+1))

1198912 (119909) =

119899

sum119894=1

(1003816100381610038161003816119909119894100381610038161003816100381608+ 5 sdot sin1199093

119894)

(10)

where 119899 = 3 and 119909119894isin [minus5 5]

This test function has a nonconvex Pareto optimal frontin which there are three discontinuous regions

ZDT1 is defined as1198911(1199091) = 1199091

119892 (1199092 119909

119899) = 1 +

9sum119899119894=2119909119894

(119899 minus 1)

ℎ (1198911 119892) = 1 minus radic

1198911

119892

(11)

where 119899 = 30 and 119909119894isin [0 1]

This test function has a convex Pareto optimal frontZDT2 is defined as

1198911(1199091) = 1199091

119892 (1199092 119909

119899) = 1 +

9sum119899119894=2119909119894

(119899 minus 1)

ℎ (1198911 119892) = 1 minus (

1198911

119892)2

(12)

where 119899 = 30 and 119909119894isin [0 1]

This test function has a nonconvex Pareto optimal frontZDT3 is defined as

1198911(1199091) = 1199091

119892 (1199092 119909

119899) = 1 +

9sum119899119894=2119909119894

(119899 minus 1)

ℎ (1198911 119892) = 1 minus radic

1198911

119892minus (

1198911

119892) sin (10120587119891

1)

(13)

where 119899 = 30 and 119909119894isin [0 1]

This test function represents the discreteness feature ItsPareto optimal front consists of several noncontiguous con-vex parts The introduction of the sine function in ℎ causesdiscontinuity in the Pareto optimal front

ZDT6 is defined as

1198911(1199091) = 1 minus exp (minus4119909

1) sin6 (6120587119909

1)

119892 (1199092 119909

119899) = 1 + 9(

sum119899119894=2119909119894

(119899 minus 1))025

ℎ (1198911 119892) = 1 minus (

1198911

119892)2

(14)

where 119899 = 10 and 119909119894isin [0 1]

8 Mathematical Problems in Engineering

This test function has a nonconvex Pareto optimal frontIt includes two difficulties caused by the nonuniformityof the search space Firstly Pareto optimal solutions arenonuniformly distributed along the global Pareto optimalfront (the front is biased for solutions in which 119891

1(1199091) is near

one) Secondly the density of the solutions is the lowest nearthe Pareto optimal front and the highest away from the front

DTLZ1 is defined as

1198911 (119909) = 0511990911199092 (1 + 119892 (119909))

1198912 (119909) = 051199091 (1 minus 1199092) (1 + 119892 (119909))

1198913 (119909) = 05 (1 minus 1199091) (1 + 119892 (119909))

119892 (119909) = 100

sdot [5 +119899

sum119894=3

((119909119894minus 05)

2minus cos (20120587 (119909

119894minus 05)))]

(15)

where 119899 = 7 and 119909119894isin [0 1]

This test function has difficulty in the convergence to thePareto optimal hyperplane and contains (115minus1) local Paretooptimal fronts

DTLZ2 is defined as

1198911 (119909) = (1 + 119892 (119909)) cos (051205871199091) cos (051205871199092)

1198912 (119909) = (1 + 119892 (119909)) cos (051205871199091) sin (051205871199092)

1198913 (119909) = (1 + 119892 (119909)) sin (051205871199091)

119892 (119909) =119899

sum119894=3

(119909119894minus 05)

2

(16)

where 119899 = 12 and 119909119894isin [0 1]

The Pareto optimal solutions of this test function must lieinside the first octant of the unit sphere in a three-objectiveplot It is more difficult than DTLZ1

42 Parameter Values The parameter values of the proposedalgorithm UKMOPSO are adopted as follows

(i) Parameters for PSO the linearly descending inertiaweight [33 34] is within the interval [01 1] and theacceleration coefficients 119888

1and 1198882are both taken as 2

(ii) Parameter for PAM the minimal and maximal num-bers of clusters are 119870min = 2 and119870max = 10

(iii) Population size the population size119873pop is 200(iv) Parameters for the uniform design the number of

subintervals 119878 is 64 the number of the sample pointsor the population size of each subinterval 119876

0is 31 set

1198630= 31 and 119876

1= 5

(v) Stopping condition the algorithm terminates if thenumber of iterations is larger than the given maximalgenerations of 20

All the parameter values in UMOGA [23] are set equalto the original values in [23] the different and additional

parameter values between UMOGA and UKMOPSO are asfollows the number of subintervals 119878 is 16 119865 = 119873 (thenumber of variables)119863

1= 7 119901

119898= 002

NSGA-II in [25] adopted a population size of 100 acrossover probability of 08 a mutation probability of 1119899(where 119899 is the number of variables) and maximal genera-tions of 250 In order to make the comparisons fair we haveused population size of 100 and maximal generations of 40 inNSGA-II so that the total number of function evaluations inNSGA-II UKMOPSO and UMOGA is the same

43 Performance Metric Based on the assumption that thetrue Pareto front of a test problem is known many kinds ofperformance metrics have been proposed and used by manyresearchers such as [2 9 24 25 29 35ndash40] Three of themare adopted in the paper to compare the performance of theproposed algorithm with them The first metric is 119862 metric[29 38] It is taken as the quantitative metric of the solutionquality and often used to show that the outcomes of one algo-rithm dominating the outcomes of another algorithm It isdefined as

119862 (119860 119861) =|119887 isin 119861 exist119886 isin 119860 119886 ≺ 119887|

|119861| (17)

where 119886 ≺ 119887 represents 119887 being dominated by 119886 119860 representsthe Pareto front obtained by Algorithm 119860 and 119861 representsthe Pareto front obtained by Algorithm 119861 in a typical runFurthermore |119861| represents the number of elements of 119861

The metric value 119862(119860 119861) = 1 means that all points in 119861are dominated by or equal to points in 119860 In contrast 119862(119860119861) = 0means that none of the points in 119861 are covered by theones in 119860

The second metric is the IGD metric (namely InvertedGenerational Distance) [39ndash41] It is the mean value of thedistances from the set of solutions uniformly distributed inthe true Pareto front to the set of solutions obtained by analgorithm in objective space Let 119875lowast be a set of uniformly dis-tributed points along the true PF (Pareto Front) Let 119860 be anapproximate set to the PF the average distance from119875lowast to119860 isdefined as

IGD (119860 119875lowast) =sumVisin119875lowast 119889 (V 119860)

|119875lowast| (18)

where 119889(V 119860) is the minimum Euclidean distance between Vand the points in 119860 If |119875lowast| is large enough to represent thePF very well IGD(119860 119875lowast) could measure both the diversityand convergence of 119860 in a sense To have a low value ofIGD(119860 119875lowast) the set119860must be very close to the PF and cannotmiss any part of the whole PF

The smaller the IGDvalue for the set of obtained solutionsis the better the performance of the algorithm will be

The third measure is the measure of maximum spread(MS) [2 24 35] which is proposed in [42 43] It canmeasure how well the true Pareto front (PFtrue) is coveredby the discovered Pareto front (PFknown) through hyperboxes

Mathematical Problems in Engineering 9

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1100

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

152

True Pareto frontUMOGA

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

140

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

f2(x)

f1(x)

f2(x)

Figure 1 Pareto fronts obtained by different algorithms for test problem FON

formed by the extreme function values observed in the PFtrueand PFknown It is defined as

MS = radic 1

119872

119872

sum119894=1

120575119894

120575119894= (

min (119891max119894 119865max119894) minusmax (119891min

119894 119865min119894)

119865max119894

minus 119865min119894

)

2

(19)

where 119872 is the number of objectives 119891max119894

and 119891min119894

arethe maximum and minimum values of the 119894th objective inPFknown respectively and 119865

max119894

and 119865min119894

are the maximumand minimum values of the 119894th objective in PFtrue respec-tively

Note that if 119891min119894

ge 119865max119894

then 120575119894= 0 Algorithms with

largerMS values are desirable andMS = 1means that the truePareto front is totally covered by the obtained Pareto front

44 Results For each test problem we perform the proposedalgorithm (called UKMOPSO) for 30 independent runs andcompare its performance with UMOGA [23] and NSGA-II[25] The values of several metrics the 119862metric IGDmetricand MS metric are shown in Tables 2 3 4 and 5 The Paretofronts obtained by several algorithms implemented on severaltest functions are illustrated in Figures 1ndash6

For brevity 119862(UKMOPSOUMOGA) 119862(UMOGAUKMOPSO) 119862(UKMOPSONSGA-II) and 119862(NSGA-IIUKMOPSO) are respectively marked as 119862

12 11986221 11986213 and

11986231

10 Mathematical Problems in Engineering

0100

UKMOPSOTrue Pareto front

14

UMOGATrue Pareto front

40

NSGA-IITrue Pareto front

f1(x)

f2(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14

f1(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14 minus13 minus12minus12

minus10

minus8

minus6

minus4

minus2

0

f2(x)

f1(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14minus12

minus10

minus8

minus6

minus4

minus2

0

f2(x)

minus12

minus10

minus8

minus6

minus4

minus2

Figure 2 Pareto fronts obtained by different algorithms for test problem KUR

Table 2 Comparison of C metric between UKMOPSO andUMOGA

C (UKMOPSO UMOGA) C (UMOGA UKMOPSO)FON 0769 001KUR 0929 001ZDT1 012 013ZDT2 0063 003ZDT3 0139 0098ZDT6 0063 001DTLZ1 1 0DTLZ2 1 0

Table 3 Comparison of Cmetric between UKMOPSO and NSGA-II

C (UKMOPSO NSGA-II) C (NSGA-II UKMOPSO)FON 0025 001KUR 0075 011ZDT1 1 0ZDT2 1 0ZDT3 1 0ZDT6 1 0DTLZ1 0 013DTLZ2 0025 006

Mathematical Problems in Engineering 11

0

02

04

06

08

1

12

14100 76

0

05

1

15

2

25

325

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

f1(x)

f2(x)

f2(x)

Figure 3 Pareto fronts obtained by different algorithms for test problem ZDT1

Table 4 Comparison of IGD metric

UKMOPSO UMOGA NSGA-IIFON 00072 00432 03707KUR 00668 11576 12323ZDT1 00067 00265 13933ZDT2 00067 00727 21409ZDT3 01964 02151 02816ZDT6 00033 00139 59223DTLZ1 74827 22965 54182DTLZ2 01123 04319 02724

Table 5 Comparison of maximum spread (MS) metric

UKMOPSO UMOGA NSGA-IIFON 09769 09754 02368KUR 09789 09580 03414ZDT1 09429 09984 05495ZDT2 09999 09986 00019ZDT3 09276 09275 07963ZDT6 1 09974 03738DTLZ1 1 1 06588DTLZ2 1 1 07125

12 Mathematical Problems in Engineering

0

02

04

06

08

1

12

14100 9

0

05

1

15

2

25

38

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x) f1(x)

f2(x)

f2(x)

f2(x)

f1(x)

Figure 4 Pareto fronts obtained by different algorithms for test problem ZDT2

As shown in Table 2 for the test functions Fon and KUR11986212= 0769 and 119862

12= 0929 mean that 768 and 929 of

the solutions obtained by UMOGA are dominated by thoseobtained by UKMOPSO Similarly 119862

21= 001means that 1

of the solutions obtained by UKMOPSO are dominated bythose obtained by UMOGA For DTLZ1 and DTLZ2119862

12= 1

and 11986221= 0 mean that all solutions obtained by UMOGA

are dominated by those obtained by UKMOPSO but noneof solutions from UKMOPSO is dominated by those fromUMOGA It means that the solution quality via UKMOPSOis much better than that via UMOGA for the above testfunctions For ZDT1 ZDT2 ZDT3 and ZDT6 the valuesof the 119862 metrics do not have too many differences betweenUKMOPSO and UMOGA It means the solution qualities viaboth are almost identical

FromTable 3 we can see that for ZDT1 ZDT2 ZDT3 andZDT6 119862

13= 1 and 119862

31= 0mean that all solutions obtained

byNSGA-II are dominated by those obtained byUKMOPSObut none of solutions from UKMOPSO is dominated bythose from NSGA-II It means that the solution quality viaUKMOPSO is much better than that via NSGA-II for theabove test functions For the rest of the test functions thesolution qualities via both are almost identical

In Table 4 for all test problems the IGD values ofUKMOPSO are the smallest among UKMOPSO UMOGAand NSGA-II This means the PF found by UKMOPSO isthe nearest to the true PF compared with the PF obtainedby the other two algorithms namely the performance ofUKMOPSO is the best in the three algorithms The IGDvalues of UMOGA are all larger than those of NSGA-II for

Mathematical Problems in Engineering 13

0

02

04

06

08

1 92 86

0

05

1

15

2

25

3

35

4

45 40

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

f2(x)

minus08

minus06

minus04

minus02

0

02

04

06

08

1

f2(x)

f2(x)

minus08

minus05

minus06

minus04

minus02

UKMOPSO UMOGA

NSGA-II

Figure 5 Pareto fronts obtained by different algorithms for test problem ZDT3

all test functions It means that the PF found by UMOGA iscloser to the true PF than the PF obtained by NSGA-II

From Table 5 we can see that all the MS values ofUKMOPSO are almost close to 1 for each test function Itmeans that almost all the true PF is totally covered by the PFobtained by UKMOPSO UMOGA is similar to UKMOPSOThe MS values of NSGA-II are much lesser than UKMOPSOand UMOGA especially MS = 00019 for ZDT2

Figures 1ndash6 all demonstrate that the PF found byUKMOPSO is the nearest to the true PF and scatters mostuniformly in those obtained by three algorithms Most of thepoints in the PF found by UKMOPSO overlap the pointsin the true PF It means the solution quality obtained byUKMOPSO is very high

45 Influence of the Uniform Crossover and PAM In orderto find out the influence of the uniform crossover on the

proposed algorithm we compare the distribution and num-ber of Pareto optimal solutions before and after performingthe uniform crossover One of the simulating results on thetest function ZDT1 is shown in Figure 7

From Figure 7 it can be seen that before and afterperforming the uniform crossover the number of data pointsin the 1st cluster and the 2nd cluster varies from 7 to 15and from 19 to 26 respectively namely many new Paretooptimal solutions having not been found before performingthe uniform crossover are generated and the differences ofthe data points between two clusters have been decreasedThis will directly influence the uniformity of the PF andacquire more uniform Pareto solutions

PAM is used to determine which Pareto solutions are tobe removed fromor be inserted into the external archiveThisis to maintain the diversity of Pareto optimal solutions The

14 Mathematical Problems in Engineering

100 16

0

05

1

15

2

25

3

35

4

45

540

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUKMOPSO

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

f2(x)

f2(x)

f1(x)

Figure 6 Pareto fronts obtained by different algorithms for test problem ZDT6

diversity can be computed by the ldquodistance-to-average-pointrdquomeasure [44] defined as

diversity (119878) = 1

|119878|

|119878|

sum119894=1

radic119873

sum119895=1

(119901119894119895minus 119901119895)2

(20)

where 119878 is the population |119878| is the swarm size 119873 is thedimensionality of the problem 119901

119894119895is the 1198951015840th value of the 1198941015840th

particle and 119901119895is the 1198951015840th value of the average point 119901

If the number of Pareto optimality is less than or equalto the size of the external archive PAM has no influenceon the proposed algorithm Otherwise it is used to selectthe different type of Pareto optimality from several clustersTherefore The diversity of Pareto optimality will certainlyincreaseWemonitor the diversity of Pareto optimality before

and after performing PAMonZDT1 at a certain timeThe val-ues are respectively 8762 and 11752 This fully demonstratesthat PAM can improve the diversity of Pareto optimality

5 Conclusion and Future Work

In this paper a multiobjective particle swarm optimizationbased on PAM and uniform design is presented It firstlyimplements PAM to partition the Pareto front into119870 clustersin the objective space and then it implements the uniformcrossover operator on the minimal cluster so as to generatemore Pareto optimality in the minimal cluster When thesize of the Pareto solution is larger than that of the externalarchive PAM is used to determine which Pareto solutions areto be removed from or be inserted into the external archive

Mathematical Problems in Engineering 15

0 01 02 03 04 05 06 07 08 09

19

10

01

02

03

04

05

06

077

08

09

1

The 1st clusterThe 2nd cluster

26

(a)

41

0 01 02 03 04 05 06

26

07 08 09 10

01

02

03

04

051506

07

08

09

1

The 1st clusterThe 2nd cluster

(b)

Figure 7 Pareto optimal solutions before (a) and after (b) performing the uniform crossover

Finally it keeps all the points in the lesser clusters anddiscards some points in the larger clusters This can ensurethat each of the clusters will contain approximately the samedata points Therefore the diversity of the Pareto solutionswill increase and they can scatter uniformly over the Paretofront The results of the experimental simulation performedon several well-known test problems indicate that theproposed algorithm obviously outperforms the other twoalgorithms

This algorithm is going on for further enhancement andimprovement One attempt is to use a more efficient orapproximate clustering algorithm to speed up the executiontime of this algorithm Another attempt is to extend its appli-cation scopes

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This research was supported by Guangxi Natural ScienceFoundation (no 2013GXNSFAA019337) Guangxi Universi-ties Key Project of Science and Technology Research (noKY2015ZD099) Key project of Guangxi Education Depart-ment (no 2013ZD055) Scientific Research Staring Founda-tion for the PHD Scholars of Yulin Normal University (noG2014005) Special Project of Yulin Normal University (no2012YJZX04) and Key Project of Yulin Normal University(no 2014YJZD05)The authors are grateful to the anonymousreferee for a careful checking of the details and for helpfulcomments that improved this paper

References

[1] J D Schaffer ldquoMultiple objective optimization with vector eval-uated genetic algorithmsrdquo in Proceedings of the 1st InternationalConference on Genetic Algorithms pp 93ndash100 1985

[2] L Tang and X Wang ldquoA hybrid multiobjective evolutionaryalgorithm for multiobjective optimization problemsrdquo IEEETransactions on Evolutionary Computation vol 17 no 1 pp 20ndash45 2013

[3] K Deb S Agrawal A Pratap and T Meyarivan ldquoA fast elitistnon-dominated sorting genetic algorithm for multi-objectiveoptimizationNSGA-IIrdquo inParallel ProblemSolving fromNaturePPSN VI vol 1917 of Lecture Notes in Computer Science pp849ndash858 Springer Berlin Germany 2000

[4] J Zhang Y Wang and J Feng ldquoAttribute index and uniformdesign based multiobjective association rule mining with evo-lutionary algorithmrdquo The Scientific World Journal vol 2013Article ID 259347 16 pages 2013

[5] S Jiang and Z Cai ldquoFaster convergence and higher hypervol-ume for multi-objective evolutionary algorithms by orthogonaland uniform designrdquo in Advances in Computation and Intelli-gence vol 6382 of Lecture Notes in Computer Science pp 312ndash328 Springer Berlin Germany 2010

[6] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995

[7] R Eberhart and J Kennedy ldquoA new optimizer using particleswarm theoryrdquo in Proceedings of the 6th International Sympo-sium on Micro Machine and Human Science pp 39ndash43 IEEENagoya Japan October 1995

[8] L Yang ldquoA fast and elitist multi-objective particle swarm algo-rithm NSPSOrdquo in Proceedings of the IEEE International Con-ference on Granular Computing (GRC rsquo08) pp 470ndash475 August2008

[9] C A C Coello G T Pulido and M S Lechuga ldquoHandlingmultiple objectives with particle swarm optimizationrdquo IEEE

16 Mathematical Problems in Engineering

Transactions on Evolutionary Computation vol 8 no 3 pp256ndash279 2004

[10] M Clerc and J Kennedy ldquoThe particle swarm-explosion sta-bility and convergence in a multidimensional complex spacerdquoIEEE Transactions on Evolutionary Computation vol 6 no 1pp 58ndash73 2002

[11] H Xiaohui and R Eberhart ldquoMultiobjective optimization usingdynamic neighborhood particle swarm optimizationrdquo in Pro-ceedings of the Congress on Evolutionary Computation (CEC rsquo02)vol 2 pp 1677ndash1681 May 2002

[12] S Mostaghim and J Teich ldquoStrategies for finding goodlocal guides in multi-objective particle swarm optimization(MOPSO)rdquo in Proceedings of the IEEE Swarm IntelligenceSymposium (SIS rsquo03) pp 26ndash33 Indianapolis Ind USA April2003

[13] T Bartz-Beielstein P Limbourg J Mehnen K Schmitt K EParsopoulos and M N Vrahatis ldquoParticle swarm optimizersfor Pareto optimizationwith enhanced archiving techniquesrdquo inProceedings of the Congress on Evolutionary Computation (CECrsquo03) vol 3 pp 1780ndash1787 December 2003

[14] K E Parsopoulos D K Tasoulis and M N Vrahatis ldquoMulti-objective optimization using parallel vector evaluated particleswarm optimizationrdquo in Proceedings of the IASTED Interna-tional Conference on Artificial Intelligence and Applications (AIArsquo04) vol 2 pp 823ndash828 February 2004

[15] S-J Tsai T-Y Sun C-C Liu S-T Hsieh W-C Wu and S-YChiu ldquoAn improved multi-objective particle swarm optimizerformulti-objective problemsrdquo Expert Systems with Applicationsvol 37 no 8 pp 5872ndash5886 2010

[16] A A Mousa M A El-Shorbagy and W F Abd-El-WahedldquoLocal search based hybrid particle swarm optimization algo-rithm for multiobjective optimizationrdquo Swarm and Evolution-ary Computation vol 3 pp 1ndash14 2012

[17] J Wang W Liu W Zhang and B Yang ldquoMulti-objective parti-cle swarm optimization based on self-update and grid strategyrdquoin Proceedings of the 2012 International Conference on Informa-tion Technology and Software Engineering vol 211 of LectureNotes in Electrical Engineering pp 869ndash876 Springer BerlinGermany 2013

[18] K Khalili-Damghani A-R Abtahi and M Tavana ldquoA newmulti-objective particle swarmoptimizationmethod for solvingreliability redundancy allocation problemsrdquo Reliability Engi-neering amp System Safety vol 111 pp 58ndash75 2013

[19] J Zhang Y Wang and J Feng ldquoParallel particle swarm opti-mization based on PAMrdquo in Proceedings of the 2nd InternationalConference on Information Engineering and Computer Science(ICIECS rsquo10) Wuhan China December 2010

[20] J Han and M Kamber Data Mining Concepts and TechniquesMorgan Kaufmann 2006

[21] L F Ibrahim ldquoUsing of clustering algorithm CWSP-PAM forrural network planningrdquo in Proceedings of the 3rd InternationalConference on Information Technology and Applications (ICITArsquo05) vol 1 pp 280ndash283 IEEE July 2005

[22] P Chopra J Kang and J Lee ldquoUsing gene pair combinationsto improve the accuracy of the PAM classifierrdquo in Proceedingsof the IEEE International Conference on Bioinformatics andBiomedicine (BIBM rsquo09) pp 174ndash177 November 2009

[23] Y-W Leung and Y Wang ldquoMultiobjective programming usinguniform design and genetic algorithmrdquo IEEE Transactions onSystems Man and Cybernetics Part C Applications and Reviewsvol 30 no 3 pp 293ndash304 2000

[24] D Liu K C Tan C K Goh and W K Ho ldquoA multiobjectivememetic algorithm based on particle swarm optimizationrdquoIEEE Transactions on Systems Man and Cybernetics Part BCybernetics vol 37 no 1 pp 42ndash50 2007

[25] K Deb A Pratap S Agarwal and T Meyarivan ldquoA fast andelitist multiobjective genetic algorithm NSGA-IIrdquo IEEE Trans-actions on Evolutionary Computation vol 6 no 2 pp 182ndash1972002

[26] C M Fonseca and P J Fleming ldquoMultiobjective optimizationandmultiple constraint handling with evolutionary algorithmsII Application examplerdquo IEEE Transactions on Systems Manand Cybernetics Part A Systems and Humans vol 28 no 1 pp38ndash47 1998

[27] F Kursawe ldquoA variant of evolution strategies for vector opti-mizationrdquo in Parallel Problem Solving from Nature vol 496 ofLecture Notes in Computer Science pp 193ndash197 Springer BerlinGermany 1991

[28] E Zitzler K Deb and LThiele ldquoComparison of multiobjectiveevolutionary algorithms empirical resultsrdquo Evolutionary Com-putation vol 8 no 2 pp 173ndash195 2000

[29] H-L Liu Y Wang and Y-M Cheung ldquoA Multi-objective evo-lutionary algorithm usingmin-max strategy and sphere coordi-nate transformationrdquo Intelligent Automation amp Soft Computingvol 15 no 3 pp 361ndash384 2009

[30] N Chakraborti B S Kumar V S Babu S Moitra andA Mukhopadhyay ldquoA new multi-objective genetic algorithmapplied to hot-rolling processrdquoAppliedMathematicalModellingvol 32 no 9 pp 1781ndash1789 2008

[31] K Deb ldquoMulti-objective genetic algorithms problem difficul-ties and construction of test problemsrdquo Evolutionary Computa-tion vol 7 no 3 pp 205ndash230 1999

[32] K Deb L Thiele M Laumanns and E Zitzler ldquoScalable testproblems for evolutionarymultiobjective optimizationrdquo in Evo-lutionary Multiobjective Optimization Advanced Informationand Knowledge Processing pp 105ndash145 Springer London UK2005

[33] Y Shi and R Eberhart ldquoModified particle swarm optimizerrdquoin Proceedings of the IEEE World Congress on ComputationalIntelligence pp 69ndash73 May 1998

[34] C S Feng S Cong and X Y Feng ldquoA new adaptive inertiaweight strategy in particle swarm optimizationrdquo in Procedingsof the IEEE Congress on Evolutionary Computation (CEC rsquo07)pp 4186ndash4190 IEEE Singapore September 2007

[35] C K Goh K C Tan D S Liu and S C Chiam ldquoA competitiveand cooperative co-evolutionary approach to multi-objectiveparticle swarm optimization algorithm designrdquo European Jour-nal of Operational Research vol 202 no 1 pp 42ndash54 2010

[36] P Cheng J-S Pan L Li Y Tang and C Huang ldquoA surveyof performance assessment for multiobjective optimizersrdquo inProceedings of the 4th International Conference on Genetic andEvolutionary Computing (ICGEC rsquo10) pp 341ndash345 December2010

[37] E Zitzler L Thiele M Laumanns C M Fonseca and V GDa Fonseca ldquoPerformance assessment of multiobjective opti-mizers an analysis and reviewrdquo IEEE Transactions on Evolution-ary Computation vol 7 no 2 pp 117ndash132 2003

[38] E Zitzler and L Thiele ldquoMultiobjective evolutionary algo-rithms a comparative case study and the strength Paretoapproachrdquo IEEETransactions onEvolutionaryComputation vol3 no 4 pp 257ndash271 1999

[39] H Li and Q Zhang ldquoMultiobjective optimization problemswith complicated pareto sets MOEAD and NSGA-IIrdquo IEEE

Mathematical Problems in Engineering 17

Transactions on Evolutionary Computation vol 13 no 2 pp284ndash302 2009

[40] Q Zhang A Zhou S Zhao P N Suganthan W Liu andS Tiwari ldquoMultiobjective optimization test instances for theCEC 2009 special session and competitionrdquo Working ReportDepartment of Computing and Electronic Systems Universityof Essex 2008

[41] Q Zhang A Zhou and Y Jin ldquoRM-MEDA a regularity model-based multiobjective estimation of distribution algorithmrdquoIEEE Transactions on Evolutionary Computation vol 12 no 1pp 41ndash63 2008

[42] C-S Tsou H-H Fang H-H Chang and C-H Kao ldquoAnimproved particle swarmPareto optimizerwith local search andclusteringrdquo in Simulated Evolution and Learning vol 4247 ofLecture Notes in Computer Science pp 400ndash407 2006

[43] U Wickramasinghe and X Li ldquoChoosing leaders for multiob-jective PSO algorithms using differential evolutionrdquo in Proceed-ing of the 7th International Conference Simulated Evolution andLearning vol 5361 of Lecture Notes in Computer Science pp249ndash258 Springer Berlin Germany 2008

[44] T Krink J S Vesterstrom and J Riget ldquoParticle swarm opti-misation with spatial particle extensionrdquo in Proceedings of theCongress on Evolutionary Computation (CEC rsquo02) pp 1474ndash1479 May 2002

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 4: Research Article Multiobjective Particle Swarm

4 Mathematical Problems in Engineering

Table 1 Values of the parameter 120590 for different number of factorsand different number of levels per factor

Number of levelsper factors Number of factors 120590

5 2sim4 27 2sim6 311 2sim10 7

132 53 4

4sim12 617 2sim16 10

19 2sim3 84sim18 14

232 13sim14 20sim22 7

8sim12 153sim7 15sim19 17

29

2 123 9

4sim7 168sim12 16sim24 8

13sim15 1425sim28 18

31 2 5sim12 20sim30 123sim4 13sim19 22

119880(1198730 1198760) is replaced with 119880(119877119876

0) in the plus part where

the remainder 119877 is equal to119873 minus lfloor1198731198730rfloor lowast 1198730

243 Crossover Operator Based on the Uniform Design Thecrossover operator based on the uniform design acts ontwo parents It quantizes the solution space defined by theseparents into a finite number of points and then it applies theuniformdesign to select a small sample of uniformly scatteredpoints as the potential offspring

For any two parents 1199011and 119901

2 their minimal values

and the maximal values of each dimension are used toform the novel solution space [119897parent 119906parent] Each domain of[119897parent 119906parent] is quantized into 119876

1levels where 119876

1is a pre-

defined prime numberThen the uniform design is applied toselect a sample of points as the potential offspringThe detailsof the algorithm can be referred to [23]

3 The Proposed Algorithm

31 Elitist Selection or Elitism [4] Elitism means that eliteindividuals cannot be excluded from the mating pool of thepopulation A strategy presented can always include the bestindividual of the current population in the next generation inorder to prevent the loss of good solutions found so far Thisstrategy can be extended to copy the best 119899 individuals to thenext generation This is explanation of the elitism In MOPelitism plays an important role

Two strategies are often used to implement elitism Onemaintains elitist solutions in the population the other stores

population paretocreate (pop)Input pop indicates the populationOutput pareto indicates the non-dominated solutionsparetolarr popfor forall chr1 isin popfor forall chr2 isin pareto and chr2 = chr1if chr1 dominate chr2

paretolarr pareto minus chr2end if

end forend forreturn pareto

Algorithm 2 Pseudocode of selecting elitist

elitist solutions into an external secondary list or externalarchive and reintroduces them to the populationThe formercopies all nondominated solutions in the current populationto the next population and then fills the rest of the next popu-lation by selecting from the remaining dominated solutions inthe current populationThe latter uses an external secondarylist or external archive to store the elitist solutions Externallist stores the nondominated solutions found It will beupdated in the next generation by means of removing elitistsolutions dominated by a new solution or adding the newsolution if it is not dominated by any existing elitist solution

The work adopts the second strategy namely storing eli-tist solutions to an external secondary list Its advantage is thatit can preserve and dynamically adjust all the nondominatedsolutions set till the current generation The pseudocodes ofselecting elitist and updating elitist are respectively shown inAlgorithms 2 and 3

32 Selection Mechanism for the Swarm Based on UniformDesign This paper adopts the uniform design to select thebest 119879

1points (119879

1lt 119879) from the 119879 points and acquires their

objectives fit119881 The detailed steps are as follows

Algorithm B

Step 1 Calculate each of119872 objectives for each of the119879 pointsnormalize each of objectives 119891

119894(119909) as follows

ℎ119894119909 =

119891119894 (119909)

max119910isin120595

1003816100381610038161003816119891119894 (119910)1003816100381610038161003816 (5)

where 120595 is a set of points in the current population and ℎ119894119909 is

the normalized objective

Step 2 Apply the uniform design to generate the 1198630weight

vectors 1199081 1199082 119908

1198630 each of them is used to compose

one fitness function by the following formula where 1198630is a

design parameter and it is prime

fit119894= 1199081198941ℎ1 (119909) + 1199081198942ℎ2 (119909) + sdot sdot sdot + 1199081198941198630ℎ1198630 (119909)

1 le 119894 le 1198630

(6)

Mathematical Problems in Engineering 5

population paretoupdate(offspring pareto)Input offspring indicates the offsprings after performing the crossover operator

and mutation operator pareto indicates the non-dominated solutionsOutput pareto indicates the non-dominated solutionsoffspringlarr call paretocreate(offspring)for forall chr1 isin offspring

nondominatedlarr truefor forall chr2 isin paretoif chr1 dominate chr2paretolarr pareto minus chr2

else if chr2 dominate chr1nondominatedlarr false

elsecontinue

end ifend forif nondominatedparetolarr pareto cup chr1

end ifend forreturn pareto

Algorithm 3 Pseudocode of updating elitist

Step 3 Based on each of fitness functions evaluate the qualityof the 119879 points Assume the remainder 119879

11198630is 1198770 For the

first 1198770fitness functions select the best lceil119879

11198630rceil points for

the rest of fitness functions select the best lfloor11987911198630rfloor points

Overall a total of 1198791points are selected The objectives of

these selected points are correspondingly stored into fit119881

33 Selection Mechanism for Gbest Based on PAM InMOPSO 119866119887119890119904119905 plays an important role in guiding the entireswarm toward the global Pareto front [24] In contrast tothe single objective PSO having only one global best 119866119887119890119904119905MOPSO has multiple Pareto optimality solutions which arenondominated each other How to select a suitable 119866119887119890119904119905 foreach of particles fromPareto optimal solutions is one very keyissue

The paper presents the selection mechanism for 119866119887119890119904119905based on PAM as follows

Algorithm C Assume the population and the numbers ofparticles are denoted as pop and119873pop and the Pareto opti-mality and the numbers of Pareto optimality are denoted aspareto and119873119875

Step 1 Acquire the number of cluster according to the fol-lowing formula

119870 = 119870min + (119870max minus 119870min) sdot (119905

max119866) (7)

where 119870min and 119870max respectively denote the minimal andmaximal value of 119870 namely 119870 isin [119870min 119870max] 119905 and max119866indicate the 119905th iteration and the maximal iteration numberThe formula can acquire the linearly increasing number ofcluster so as to accord with the process from the coarse searchto the elaborate search

Step 2 If 119873119875 le 119870 then for each particle in pop find thenearest Pareto optimal solution as its 119866119887119890119904119905 Otherwise turnto Step 3

Step 3 Perform PAM described in Section 23 to partitionPareto and pop into 119870 clusters respectively the clustercentroids of which are denoted as119862

1= 11988811 11988812 119888

1119870 and

1198622= 11988821 11988822 119888

2119870

Step 4 For each 1198882119895isin 1198622 find the nearest 119888

1119894isin 1198621 For

all the particles in the cluster represented by 1198882119895 their 119866119887119890119904119905

randomly takes one of Pareto optimal solutions in the clusterrepresented by 119888

1119894

34 Adjustment of Pareto Optimal Solutions Based on theUniform Design and PAM The number of Pareto optimalsolutions in the external archivewill increasingly enlargewithevolution namely the size of the external archive will becomevery large with the iteration number This will increasethe computation complexity and the execution time of analgorithmTherefore the size of the external archive must becontrolled In the meanwhile Pareto optimal solutions in theexternal archive are possibly close to each other in the objec-tive space If the solutions are close to each other they arenearly the same choice It is desirable to find the Pareto opti-mal solutions scattered uniformly over the Pareto front sothat the decision maker can have a variety of choices There-fore how to control the size of the external archive and selectrepresentative Pareto optimal solutions scattered uniformlyover the Pareto front is a key issue

The paper presents the adjustment algorithm of Paretooptimal solutions based on the uniform design and PAMThe algorithm firstly implements PAM to partition thePareto front into 119870 clusters in the objective space and then

6 Mathematical Problems in Engineering

implements the uniform crossover operator on the minimalcluster so as to generate more Pareto optimality in theminimal cluster finally it keeps all the points in the lesserclusters and discards some points in the larger clusters Thedetailed steps of the algorithm are shown as follows

Algorithm DAssume the Pareto optimality their values of119872functions and the numbers of Pareto optimality are denotedas pareto 119865pareto and119873119875 The size of the external archive isassumed as119873119875 lim The number of clusters is 119870

Step 1 If the numbers of data points in the maximal andthe minimal clusters have too many differences or are bothvery small select all the data points from the minimal clusterand the same number of points from the cluster which isthe nearest to the minimal cluster For each pair of datapoints perform the uniform crossover operator described inSection 243 Update pareto 119865pareto and 119873119875 according toAlgorithm 3

Step 2 If 119873119875 gt 119873119875 lim implement PAM to partition the119865pareto into119870 clusters in the objective space and let119873 div =119873119875 lim 119870 it indicates the average points to select from eachof119870 clustersWe can classify three situations according to thenumber of points in each cluster and119873 div as follows

(i) Situation 1 the numbers of points in all clusters arelarger than or equal to119873 div

(ii) Situation 2 the number of data points is larger thanor equal to119873 div only in one cluster

(iii) Situation 3 the number of the clusters in which thenumber of data points is larger than 119873 div is largerthan 1 and less than119870

Step 3 For Situation 1 sort all clusters according to their con-taining points in ascending order Select 119873 div points fromthe previous 119870 minus 1 clusters and select the remainder points119873119875 limminus119873 div lowast(119870 minus 1) from the last cluster Turn to Step7

Step 4 For Situation 2 keep the points of all the clusters inwhich the number of the points is less than or equal to119873 divand the remainder points are selected from that only clusterTurn to Step 7

Step 5 For Situation 3 keep all the points of all the clustersin which the number of points is less than or equal to119873 divTurn to Step 6 to select the remainder point

Step 6 Recalculate 119873 div and let 119873 div = rem1198751198701015840 whererem119875 and 1198701015840 indicate the number of the remainder pointsand clusters According to the new 119873 div turn to Step 3 orStep 4

Step 7 Save the selected 119873119875 lim points and terminate thealgorithm

35 Thoughts on the Proposed Algorithm In MOP for 119872objectives 119891

1 1198912 119891

119872and any two points 119909 and 119910 in the

feasible solution space Ω if each objective is satisfied with119891119894(119909) le 119891

119894(119910) and at least one objective is satisfied with

119891119894(119909) lt 119891

119894(119910) namely 119909 is at least as good as 119910with respect to

all the objectives and 119909 is strictly better than 119910 with respectto at least one objective then we say that 119909 dominates 119910 Ifno other solution is strictly better than 119909 then 119909 is called anondominated solution or Pareto optimal solution

In single objective problems there is only one objective tobe optimized Therefore the global best (119866119887119890119904119905) of the wholeswarm is only one In multiobjective problems there aremultiple consistent or conflicting objectives to be optimizedTherefore there exist very large or infinite numbers of solu-tions which cannot dominate each other These solutions arenondominated solutions or Pareto optimal solutions There-fore 119866119887119890119904119905 of the whole swarm is more than one How toselect suitable 119866119887119890119904119905 is a very key issue

When it is not possible to find all these solutions it maybe desirable to find as many solutions as possible in order toprovide more choices to the decision maker However if thesolutions are close to each other they are nearly of the samechoice It is desirable to find the Pareto optimal solutionsscattered uniformly over the Pareto front so that the decisionmaker can have a variety of choices The paper introducesuniform design to ensure that the acquired solutions scatteruniformly over the Pareto front in the objective space

For the MOPSO the diversity of the population is a veryimportant factor It has a key impact on the convergence ofan algorithm and uniformly distribution of Pareto optimalsolution It can effectively get rid of the premature of thealgorithmsThe paper introduces PAM clustering algorithmsto maintain the diversity of the population The particles inthe same cluster have similar features whereas ones in differ-ent clusters have dissimilar features Thus choosing particlesfrom different clusters may necessarily increase the diversityof the population

36 Steps of the Proposed Algorithm

Step 1 According to the population size 119873pop determinethe number of subintervals 119878 and the population size 119876

0in

each subinterval such that 1198760lowast 119878 is more than 119873pop 119876

0is

a prime and must exist in Table 1 Execute Algorithm A inSection 242 to generate a temporary population containing1198760lowast 119878 ge 119873pop points

Step 2 PerformAlgorithm B described in Section 32 to selectthe best 119873pop points from the 119876

0lowast 119878 points as the initial

population pop and acquire their objectives fit119881

Step 3 Initialize the speed and119875119887119890119904119905 of the particle 119894 by119881[119894] =pop[119894] and 119875119887119890119904119905[119894] = pop[119894] 119891119875119887119890119904119905[119894] = fit119881[119894] where 119894 =1 sdot sdot sdot 119873pop 119891119875119887119890119904119905[119894] denotes objectives of 119875119887119890119904119905[119894]

Step 4 According to Algorithm 2 select elitist or Paretooptimal solutions from pop and store them to the externalsecondary list pareto the size of which is assumed as119873119875

Step 5 Choose the suitable119866119887119890119904119905 for each of the particles frompareto in terms of Algorithm C described in Section 33

Mathematical Problems in Engineering 7

Step 6 Update the position pop and velocity 119881 for each ofthe particles respectively using formulas (2) and (3) whereformula (2) is modified as follows

V119894119895 (119896 + 1) = 119908 sdot V119894119895 (119896)

+ 1198881sdot 1199031sdot (119875119887119890119904119905

119894119895 (119896) minus 119909119894119895 (119896))

+ 1198882times 1199032times (119866119887119890119904119905

119894119895 (119896) minus 119909119894119895 (119896))

(8)

Step 7 For the 119895th dimensional variable of the 119894th particle ifit goes beyond its lower boundary or upper boundary then ittakes the 119895th dimensional value of its corresponding bound-ary and the 119895th dimensional value of its velocity takes theopposite value

Step 8 Calculate each of119872 objectives for each of the particlesand update them into fit119881

Step 9 Update 119875119887119890119904119905 of each particle as followsFor the 119894th particle if fit119881[119894] dominates 119891119875119887119890119904119905[119894] then

let 119891119875119887119890119904119905[119894] = fit119881[119894] and 119875119887119890119904119905[119894] = pop[119894] otherwise119875119887119890119904119905[119894] and119891119875119887119890119904119905[119894] are kept unchanged If neither of themis dominated by the other then randomly select one of themas 119891119875119887119890119904119905[119894] and update its corresponding 119875119887119890119904119905[119894]

Step 10 Update the external archive pareto storing Paretooptimality and its size119873119875 according to Algorithm 3

Step 11 Implement Algorithm D described in Section 34 toadjust the Pareto optimal solutions such that the number ofPareto optimal solutions is less than or equal to the size of theexternal archive119873119875 lim

Step 12 If the stop criterion is satisfied terminate algorithmotherwise turn to Step 5 and continue

4 Numerical Results

In order to evaluate the performances of the proposedalgorithm we compare it with two outstanding algorithmsUMOGA [23] and NSGA-II [25]

41 Test Problems Several well-known multiobjective testfunctions are used to test the performance of the proposedalgorithm

Biobjective problem FON [25 26] KUR [25 27]ZDT1 ZDT2 ZDT3 and ZDT6 [25 28ndash31]Three-objective problems DTLZ1 and DTLZ [2 32]

The definitions of them are as followsFON is defined as

1198911 (119909) = 1 minus exp(minus

3

sum119894=1

(119909119894minus1

radic3)2

)

1198912 (119909) = 1 minus exp(minus

3

sum119894=1

(119909119894+1

radic3)2

)

(9)

where 119909119894isin [minus4 4]

This test function has a nonconvex Pareto optimal frontKUR is defined as

1198911 (119909) =

119899minus1

sum119894=1

(minus10 exp (minus02 sdot radic1199092119894+ 1199092119894+1))

1198912 (119909) =

119899

sum119894=1

(1003816100381610038161003816119909119894100381610038161003816100381608+ 5 sdot sin1199093

119894)

(10)

where 119899 = 3 and 119909119894isin [minus5 5]

This test function has a nonconvex Pareto optimal frontin which there are three discontinuous regions

ZDT1 is defined as1198911(1199091) = 1199091

119892 (1199092 119909

119899) = 1 +

9sum119899119894=2119909119894

(119899 minus 1)

ℎ (1198911 119892) = 1 minus radic

1198911

119892

(11)

where 119899 = 30 and 119909119894isin [0 1]

This test function has a convex Pareto optimal frontZDT2 is defined as

1198911(1199091) = 1199091

119892 (1199092 119909

119899) = 1 +

9sum119899119894=2119909119894

(119899 minus 1)

ℎ (1198911 119892) = 1 minus (

1198911

119892)2

(12)

where 119899 = 30 and 119909119894isin [0 1]

This test function has a nonconvex Pareto optimal frontZDT3 is defined as

1198911(1199091) = 1199091

119892 (1199092 119909

119899) = 1 +

9sum119899119894=2119909119894

(119899 minus 1)

ℎ (1198911 119892) = 1 minus radic

1198911

119892minus (

1198911

119892) sin (10120587119891

1)

(13)

where 119899 = 30 and 119909119894isin [0 1]

This test function represents the discreteness feature ItsPareto optimal front consists of several noncontiguous con-vex parts The introduction of the sine function in ℎ causesdiscontinuity in the Pareto optimal front

ZDT6 is defined as

1198911(1199091) = 1 minus exp (minus4119909

1) sin6 (6120587119909

1)

119892 (1199092 119909

119899) = 1 + 9(

sum119899119894=2119909119894

(119899 minus 1))025

ℎ (1198911 119892) = 1 minus (

1198911

119892)2

(14)

where 119899 = 10 and 119909119894isin [0 1]

8 Mathematical Problems in Engineering

This test function has a nonconvex Pareto optimal frontIt includes two difficulties caused by the nonuniformityof the search space Firstly Pareto optimal solutions arenonuniformly distributed along the global Pareto optimalfront (the front is biased for solutions in which 119891

1(1199091) is near

one) Secondly the density of the solutions is the lowest nearthe Pareto optimal front and the highest away from the front

DTLZ1 is defined as

1198911 (119909) = 0511990911199092 (1 + 119892 (119909))

1198912 (119909) = 051199091 (1 minus 1199092) (1 + 119892 (119909))

1198913 (119909) = 05 (1 minus 1199091) (1 + 119892 (119909))

119892 (119909) = 100

sdot [5 +119899

sum119894=3

((119909119894minus 05)

2minus cos (20120587 (119909

119894minus 05)))]

(15)

where 119899 = 7 and 119909119894isin [0 1]

This test function has difficulty in the convergence to thePareto optimal hyperplane and contains (115minus1) local Paretooptimal fronts

DTLZ2 is defined as

1198911 (119909) = (1 + 119892 (119909)) cos (051205871199091) cos (051205871199092)

1198912 (119909) = (1 + 119892 (119909)) cos (051205871199091) sin (051205871199092)

1198913 (119909) = (1 + 119892 (119909)) sin (051205871199091)

119892 (119909) =119899

sum119894=3

(119909119894minus 05)

2

(16)

where 119899 = 12 and 119909119894isin [0 1]

The Pareto optimal solutions of this test function must lieinside the first octant of the unit sphere in a three-objectiveplot It is more difficult than DTLZ1

42 Parameter Values The parameter values of the proposedalgorithm UKMOPSO are adopted as follows

(i) Parameters for PSO the linearly descending inertiaweight [33 34] is within the interval [01 1] and theacceleration coefficients 119888

1and 1198882are both taken as 2

(ii) Parameter for PAM the minimal and maximal num-bers of clusters are 119870min = 2 and119870max = 10

(iii) Population size the population size119873pop is 200(iv) Parameters for the uniform design the number of

subintervals 119878 is 64 the number of the sample pointsor the population size of each subinterval 119876

0is 31 set

1198630= 31 and 119876

1= 5

(v) Stopping condition the algorithm terminates if thenumber of iterations is larger than the given maximalgenerations of 20

All the parameter values in UMOGA [23] are set equalto the original values in [23] the different and additional

parameter values between UMOGA and UKMOPSO are asfollows the number of subintervals 119878 is 16 119865 = 119873 (thenumber of variables)119863

1= 7 119901

119898= 002

NSGA-II in [25] adopted a population size of 100 acrossover probability of 08 a mutation probability of 1119899(where 119899 is the number of variables) and maximal genera-tions of 250 In order to make the comparisons fair we haveused population size of 100 and maximal generations of 40 inNSGA-II so that the total number of function evaluations inNSGA-II UKMOPSO and UMOGA is the same

43 Performance Metric Based on the assumption that thetrue Pareto front of a test problem is known many kinds ofperformance metrics have been proposed and used by manyresearchers such as [2 9 24 25 29 35ndash40] Three of themare adopted in the paper to compare the performance of theproposed algorithm with them The first metric is 119862 metric[29 38] It is taken as the quantitative metric of the solutionquality and often used to show that the outcomes of one algo-rithm dominating the outcomes of another algorithm It isdefined as

119862 (119860 119861) =|119887 isin 119861 exist119886 isin 119860 119886 ≺ 119887|

|119861| (17)

where 119886 ≺ 119887 represents 119887 being dominated by 119886 119860 representsthe Pareto front obtained by Algorithm 119860 and 119861 representsthe Pareto front obtained by Algorithm 119861 in a typical runFurthermore |119861| represents the number of elements of 119861

The metric value 119862(119860 119861) = 1 means that all points in 119861are dominated by or equal to points in 119860 In contrast 119862(119860119861) = 0means that none of the points in 119861 are covered by theones in 119860

The second metric is the IGD metric (namely InvertedGenerational Distance) [39ndash41] It is the mean value of thedistances from the set of solutions uniformly distributed inthe true Pareto front to the set of solutions obtained by analgorithm in objective space Let 119875lowast be a set of uniformly dis-tributed points along the true PF (Pareto Front) Let 119860 be anapproximate set to the PF the average distance from119875lowast to119860 isdefined as

IGD (119860 119875lowast) =sumVisin119875lowast 119889 (V 119860)

|119875lowast| (18)

where 119889(V 119860) is the minimum Euclidean distance between Vand the points in 119860 If |119875lowast| is large enough to represent thePF very well IGD(119860 119875lowast) could measure both the diversityand convergence of 119860 in a sense To have a low value ofIGD(119860 119875lowast) the set119860must be very close to the PF and cannotmiss any part of the whole PF

The smaller the IGDvalue for the set of obtained solutionsis the better the performance of the algorithm will be

The third measure is the measure of maximum spread(MS) [2 24 35] which is proposed in [42 43] It canmeasure how well the true Pareto front (PFtrue) is coveredby the discovered Pareto front (PFknown) through hyperboxes

Mathematical Problems in Engineering 9

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1100

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

152

True Pareto frontUMOGA

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

140

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

f2(x)

f1(x)

f2(x)

Figure 1 Pareto fronts obtained by different algorithms for test problem FON

formed by the extreme function values observed in the PFtrueand PFknown It is defined as

MS = radic 1

119872

119872

sum119894=1

120575119894

120575119894= (

min (119891max119894 119865max119894) minusmax (119891min

119894 119865min119894)

119865max119894

minus 119865min119894

)

2

(19)

where 119872 is the number of objectives 119891max119894

and 119891min119894

arethe maximum and minimum values of the 119894th objective inPFknown respectively and 119865

max119894

and 119865min119894

are the maximumand minimum values of the 119894th objective in PFtrue respec-tively

Note that if 119891min119894

ge 119865max119894

then 120575119894= 0 Algorithms with

largerMS values are desirable andMS = 1means that the truePareto front is totally covered by the obtained Pareto front

44 Results For each test problem we perform the proposedalgorithm (called UKMOPSO) for 30 independent runs andcompare its performance with UMOGA [23] and NSGA-II[25] The values of several metrics the 119862metric IGDmetricand MS metric are shown in Tables 2 3 4 and 5 The Paretofronts obtained by several algorithms implemented on severaltest functions are illustrated in Figures 1ndash6

For brevity 119862(UKMOPSOUMOGA) 119862(UMOGAUKMOPSO) 119862(UKMOPSONSGA-II) and 119862(NSGA-IIUKMOPSO) are respectively marked as 119862

12 11986221 11986213 and

11986231

10 Mathematical Problems in Engineering

0100

UKMOPSOTrue Pareto front

14

UMOGATrue Pareto front

40

NSGA-IITrue Pareto front

f1(x)

f2(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14

f1(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14 minus13 minus12minus12

minus10

minus8

minus6

minus4

minus2

0

f2(x)

f1(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14minus12

minus10

minus8

minus6

minus4

minus2

0

f2(x)

minus12

minus10

minus8

minus6

minus4

minus2

Figure 2 Pareto fronts obtained by different algorithms for test problem KUR

Table 2 Comparison of C metric between UKMOPSO andUMOGA

C (UKMOPSO UMOGA) C (UMOGA UKMOPSO)FON 0769 001KUR 0929 001ZDT1 012 013ZDT2 0063 003ZDT3 0139 0098ZDT6 0063 001DTLZ1 1 0DTLZ2 1 0

Table 3 Comparison of Cmetric between UKMOPSO and NSGA-II

C (UKMOPSO NSGA-II) C (NSGA-II UKMOPSO)FON 0025 001KUR 0075 011ZDT1 1 0ZDT2 1 0ZDT3 1 0ZDT6 1 0DTLZ1 0 013DTLZ2 0025 006

Mathematical Problems in Engineering 11

0

02

04

06

08

1

12

14100 76

0

05

1

15

2

25

325

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

f1(x)

f2(x)

f2(x)

Figure 3 Pareto fronts obtained by different algorithms for test problem ZDT1

Table 4 Comparison of IGD metric

UKMOPSO UMOGA NSGA-IIFON 00072 00432 03707KUR 00668 11576 12323ZDT1 00067 00265 13933ZDT2 00067 00727 21409ZDT3 01964 02151 02816ZDT6 00033 00139 59223DTLZ1 74827 22965 54182DTLZ2 01123 04319 02724

Table 5 Comparison of maximum spread (MS) metric

UKMOPSO UMOGA NSGA-IIFON 09769 09754 02368KUR 09789 09580 03414ZDT1 09429 09984 05495ZDT2 09999 09986 00019ZDT3 09276 09275 07963ZDT6 1 09974 03738DTLZ1 1 1 06588DTLZ2 1 1 07125

12 Mathematical Problems in Engineering

0

02

04

06

08

1

12

14100 9

0

05

1

15

2

25

38

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x) f1(x)

f2(x)

f2(x)

f2(x)

f1(x)

Figure 4 Pareto fronts obtained by different algorithms for test problem ZDT2

As shown in Table 2 for the test functions Fon and KUR11986212= 0769 and 119862

12= 0929 mean that 768 and 929 of

the solutions obtained by UMOGA are dominated by thoseobtained by UKMOPSO Similarly 119862

21= 001means that 1

of the solutions obtained by UKMOPSO are dominated bythose obtained by UMOGA For DTLZ1 and DTLZ2119862

12= 1

and 11986221= 0 mean that all solutions obtained by UMOGA

are dominated by those obtained by UKMOPSO but noneof solutions from UKMOPSO is dominated by those fromUMOGA It means that the solution quality via UKMOPSOis much better than that via UMOGA for the above testfunctions For ZDT1 ZDT2 ZDT3 and ZDT6 the valuesof the 119862 metrics do not have too many differences betweenUKMOPSO and UMOGA It means the solution qualities viaboth are almost identical

FromTable 3 we can see that for ZDT1 ZDT2 ZDT3 andZDT6 119862

13= 1 and 119862

31= 0mean that all solutions obtained

byNSGA-II are dominated by those obtained byUKMOPSObut none of solutions from UKMOPSO is dominated bythose from NSGA-II It means that the solution quality viaUKMOPSO is much better than that via NSGA-II for theabove test functions For the rest of the test functions thesolution qualities via both are almost identical

In Table 4 for all test problems the IGD values ofUKMOPSO are the smallest among UKMOPSO UMOGAand NSGA-II This means the PF found by UKMOPSO isthe nearest to the true PF compared with the PF obtainedby the other two algorithms namely the performance ofUKMOPSO is the best in the three algorithms The IGDvalues of UMOGA are all larger than those of NSGA-II for

Mathematical Problems in Engineering 13

0

02

04

06

08

1 92 86

0

05

1

15

2

25

3

35

4

45 40

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

f2(x)

minus08

minus06

minus04

minus02

0

02

04

06

08

1

f2(x)

f2(x)

minus08

minus05

minus06

minus04

minus02

UKMOPSO UMOGA

NSGA-II

Figure 5 Pareto fronts obtained by different algorithms for test problem ZDT3

all test functions It means that the PF found by UMOGA iscloser to the true PF than the PF obtained by NSGA-II

From Table 5 we can see that all the MS values ofUKMOPSO are almost close to 1 for each test function Itmeans that almost all the true PF is totally covered by the PFobtained by UKMOPSO UMOGA is similar to UKMOPSOThe MS values of NSGA-II are much lesser than UKMOPSOand UMOGA especially MS = 00019 for ZDT2

Figures 1ndash6 all demonstrate that the PF found byUKMOPSO is the nearest to the true PF and scatters mostuniformly in those obtained by three algorithms Most of thepoints in the PF found by UKMOPSO overlap the pointsin the true PF It means the solution quality obtained byUKMOPSO is very high

45 Influence of the Uniform Crossover and PAM In orderto find out the influence of the uniform crossover on the

proposed algorithm we compare the distribution and num-ber of Pareto optimal solutions before and after performingthe uniform crossover One of the simulating results on thetest function ZDT1 is shown in Figure 7

From Figure 7 it can be seen that before and afterperforming the uniform crossover the number of data pointsin the 1st cluster and the 2nd cluster varies from 7 to 15and from 19 to 26 respectively namely many new Paretooptimal solutions having not been found before performingthe uniform crossover are generated and the differences ofthe data points between two clusters have been decreasedThis will directly influence the uniformity of the PF andacquire more uniform Pareto solutions

PAM is used to determine which Pareto solutions are tobe removed fromor be inserted into the external archiveThisis to maintain the diversity of Pareto optimal solutions The

14 Mathematical Problems in Engineering

100 16

0

05

1

15

2

25

3

35

4

45

540

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUKMOPSO

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

f2(x)

f2(x)

f1(x)

Figure 6 Pareto fronts obtained by different algorithms for test problem ZDT6

diversity can be computed by the ldquodistance-to-average-pointrdquomeasure [44] defined as

diversity (119878) = 1

|119878|

|119878|

sum119894=1

radic119873

sum119895=1

(119901119894119895minus 119901119895)2

(20)

where 119878 is the population |119878| is the swarm size 119873 is thedimensionality of the problem 119901

119894119895is the 1198951015840th value of the 1198941015840th

particle and 119901119895is the 1198951015840th value of the average point 119901

If the number of Pareto optimality is less than or equalto the size of the external archive PAM has no influenceon the proposed algorithm Otherwise it is used to selectthe different type of Pareto optimality from several clustersTherefore The diversity of Pareto optimality will certainlyincreaseWemonitor the diversity of Pareto optimality before

and after performing PAMonZDT1 at a certain timeThe val-ues are respectively 8762 and 11752 This fully demonstratesthat PAM can improve the diversity of Pareto optimality

5 Conclusion and Future Work

In this paper a multiobjective particle swarm optimizationbased on PAM and uniform design is presented It firstlyimplements PAM to partition the Pareto front into119870 clustersin the objective space and then it implements the uniformcrossover operator on the minimal cluster so as to generatemore Pareto optimality in the minimal cluster When thesize of the Pareto solution is larger than that of the externalarchive PAM is used to determine which Pareto solutions areto be removed from or be inserted into the external archive

Mathematical Problems in Engineering 15

0 01 02 03 04 05 06 07 08 09

19

10

01

02

03

04

05

06

077

08

09

1

The 1st clusterThe 2nd cluster

26

(a)

41

0 01 02 03 04 05 06

26

07 08 09 10

01

02

03

04

051506

07

08

09

1

The 1st clusterThe 2nd cluster

(b)

Figure 7 Pareto optimal solutions before (a) and after (b) performing the uniform crossover

Finally it keeps all the points in the lesser clusters anddiscards some points in the larger clusters This can ensurethat each of the clusters will contain approximately the samedata points Therefore the diversity of the Pareto solutionswill increase and they can scatter uniformly over the Paretofront The results of the experimental simulation performedon several well-known test problems indicate that theproposed algorithm obviously outperforms the other twoalgorithms

This algorithm is going on for further enhancement andimprovement One attempt is to use a more efficient orapproximate clustering algorithm to speed up the executiontime of this algorithm Another attempt is to extend its appli-cation scopes

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This research was supported by Guangxi Natural ScienceFoundation (no 2013GXNSFAA019337) Guangxi Universi-ties Key Project of Science and Technology Research (noKY2015ZD099) Key project of Guangxi Education Depart-ment (no 2013ZD055) Scientific Research Staring Founda-tion for the PHD Scholars of Yulin Normal University (noG2014005) Special Project of Yulin Normal University (no2012YJZX04) and Key Project of Yulin Normal University(no 2014YJZD05)The authors are grateful to the anonymousreferee for a careful checking of the details and for helpfulcomments that improved this paper

References

[1] J D Schaffer ldquoMultiple objective optimization with vector eval-uated genetic algorithmsrdquo in Proceedings of the 1st InternationalConference on Genetic Algorithms pp 93ndash100 1985

[2] L Tang and X Wang ldquoA hybrid multiobjective evolutionaryalgorithm for multiobjective optimization problemsrdquo IEEETransactions on Evolutionary Computation vol 17 no 1 pp 20ndash45 2013

[3] K Deb S Agrawal A Pratap and T Meyarivan ldquoA fast elitistnon-dominated sorting genetic algorithm for multi-objectiveoptimizationNSGA-IIrdquo inParallel ProblemSolving fromNaturePPSN VI vol 1917 of Lecture Notes in Computer Science pp849ndash858 Springer Berlin Germany 2000

[4] J Zhang Y Wang and J Feng ldquoAttribute index and uniformdesign based multiobjective association rule mining with evo-lutionary algorithmrdquo The Scientific World Journal vol 2013Article ID 259347 16 pages 2013

[5] S Jiang and Z Cai ldquoFaster convergence and higher hypervol-ume for multi-objective evolutionary algorithms by orthogonaland uniform designrdquo in Advances in Computation and Intelli-gence vol 6382 of Lecture Notes in Computer Science pp 312ndash328 Springer Berlin Germany 2010

[6] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995

[7] R Eberhart and J Kennedy ldquoA new optimizer using particleswarm theoryrdquo in Proceedings of the 6th International Sympo-sium on Micro Machine and Human Science pp 39ndash43 IEEENagoya Japan October 1995

[8] L Yang ldquoA fast and elitist multi-objective particle swarm algo-rithm NSPSOrdquo in Proceedings of the IEEE International Con-ference on Granular Computing (GRC rsquo08) pp 470ndash475 August2008

[9] C A C Coello G T Pulido and M S Lechuga ldquoHandlingmultiple objectives with particle swarm optimizationrdquo IEEE

16 Mathematical Problems in Engineering

Transactions on Evolutionary Computation vol 8 no 3 pp256ndash279 2004

[10] M Clerc and J Kennedy ldquoThe particle swarm-explosion sta-bility and convergence in a multidimensional complex spacerdquoIEEE Transactions on Evolutionary Computation vol 6 no 1pp 58ndash73 2002

[11] H Xiaohui and R Eberhart ldquoMultiobjective optimization usingdynamic neighborhood particle swarm optimizationrdquo in Pro-ceedings of the Congress on Evolutionary Computation (CEC rsquo02)vol 2 pp 1677ndash1681 May 2002

[12] S Mostaghim and J Teich ldquoStrategies for finding goodlocal guides in multi-objective particle swarm optimization(MOPSO)rdquo in Proceedings of the IEEE Swarm IntelligenceSymposium (SIS rsquo03) pp 26ndash33 Indianapolis Ind USA April2003

[13] T Bartz-Beielstein P Limbourg J Mehnen K Schmitt K EParsopoulos and M N Vrahatis ldquoParticle swarm optimizersfor Pareto optimizationwith enhanced archiving techniquesrdquo inProceedings of the Congress on Evolutionary Computation (CECrsquo03) vol 3 pp 1780ndash1787 December 2003

[14] K E Parsopoulos D K Tasoulis and M N Vrahatis ldquoMulti-objective optimization using parallel vector evaluated particleswarm optimizationrdquo in Proceedings of the IASTED Interna-tional Conference on Artificial Intelligence and Applications (AIArsquo04) vol 2 pp 823ndash828 February 2004

[15] S-J Tsai T-Y Sun C-C Liu S-T Hsieh W-C Wu and S-YChiu ldquoAn improved multi-objective particle swarm optimizerformulti-objective problemsrdquo Expert Systems with Applicationsvol 37 no 8 pp 5872ndash5886 2010

[16] A A Mousa M A El-Shorbagy and W F Abd-El-WahedldquoLocal search based hybrid particle swarm optimization algo-rithm for multiobjective optimizationrdquo Swarm and Evolution-ary Computation vol 3 pp 1ndash14 2012

[17] J Wang W Liu W Zhang and B Yang ldquoMulti-objective parti-cle swarm optimization based on self-update and grid strategyrdquoin Proceedings of the 2012 International Conference on Informa-tion Technology and Software Engineering vol 211 of LectureNotes in Electrical Engineering pp 869ndash876 Springer BerlinGermany 2013

[18] K Khalili-Damghani A-R Abtahi and M Tavana ldquoA newmulti-objective particle swarmoptimizationmethod for solvingreliability redundancy allocation problemsrdquo Reliability Engi-neering amp System Safety vol 111 pp 58ndash75 2013

[19] J Zhang Y Wang and J Feng ldquoParallel particle swarm opti-mization based on PAMrdquo in Proceedings of the 2nd InternationalConference on Information Engineering and Computer Science(ICIECS rsquo10) Wuhan China December 2010

[20] J Han and M Kamber Data Mining Concepts and TechniquesMorgan Kaufmann 2006

[21] L F Ibrahim ldquoUsing of clustering algorithm CWSP-PAM forrural network planningrdquo in Proceedings of the 3rd InternationalConference on Information Technology and Applications (ICITArsquo05) vol 1 pp 280ndash283 IEEE July 2005

[22] P Chopra J Kang and J Lee ldquoUsing gene pair combinationsto improve the accuracy of the PAM classifierrdquo in Proceedingsof the IEEE International Conference on Bioinformatics andBiomedicine (BIBM rsquo09) pp 174ndash177 November 2009

[23] Y-W Leung and Y Wang ldquoMultiobjective programming usinguniform design and genetic algorithmrdquo IEEE Transactions onSystems Man and Cybernetics Part C Applications and Reviewsvol 30 no 3 pp 293ndash304 2000

[24] D Liu K C Tan C K Goh and W K Ho ldquoA multiobjectivememetic algorithm based on particle swarm optimizationrdquoIEEE Transactions on Systems Man and Cybernetics Part BCybernetics vol 37 no 1 pp 42ndash50 2007

[25] K Deb A Pratap S Agarwal and T Meyarivan ldquoA fast andelitist multiobjective genetic algorithm NSGA-IIrdquo IEEE Trans-actions on Evolutionary Computation vol 6 no 2 pp 182ndash1972002

[26] C M Fonseca and P J Fleming ldquoMultiobjective optimizationandmultiple constraint handling with evolutionary algorithmsII Application examplerdquo IEEE Transactions on Systems Manand Cybernetics Part A Systems and Humans vol 28 no 1 pp38ndash47 1998

[27] F Kursawe ldquoA variant of evolution strategies for vector opti-mizationrdquo in Parallel Problem Solving from Nature vol 496 ofLecture Notes in Computer Science pp 193ndash197 Springer BerlinGermany 1991

[28] E Zitzler K Deb and LThiele ldquoComparison of multiobjectiveevolutionary algorithms empirical resultsrdquo Evolutionary Com-putation vol 8 no 2 pp 173ndash195 2000

[29] H-L Liu Y Wang and Y-M Cheung ldquoA Multi-objective evo-lutionary algorithm usingmin-max strategy and sphere coordi-nate transformationrdquo Intelligent Automation amp Soft Computingvol 15 no 3 pp 361ndash384 2009

[30] N Chakraborti B S Kumar V S Babu S Moitra andA Mukhopadhyay ldquoA new multi-objective genetic algorithmapplied to hot-rolling processrdquoAppliedMathematicalModellingvol 32 no 9 pp 1781ndash1789 2008

[31] K Deb ldquoMulti-objective genetic algorithms problem difficul-ties and construction of test problemsrdquo Evolutionary Computa-tion vol 7 no 3 pp 205ndash230 1999

[32] K Deb L Thiele M Laumanns and E Zitzler ldquoScalable testproblems for evolutionarymultiobjective optimizationrdquo in Evo-lutionary Multiobjective Optimization Advanced Informationand Knowledge Processing pp 105ndash145 Springer London UK2005

[33] Y Shi and R Eberhart ldquoModified particle swarm optimizerrdquoin Proceedings of the IEEE World Congress on ComputationalIntelligence pp 69ndash73 May 1998

[34] C S Feng S Cong and X Y Feng ldquoA new adaptive inertiaweight strategy in particle swarm optimizationrdquo in Procedingsof the IEEE Congress on Evolutionary Computation (CEC rsquo07)pp 4186ndash4190 IEEE Singapore September 2007

[35] C K Goh K C Tan D S Liu and S C Chiam ldquoA competitiveand cooperative co-evolutionary approach to multi-objectiveparticle swarm optimization algorithm designrdquo European Jour-nal of Operational Research vol 202 no 1 pp 42ndash54 2010

[36] P Cheng J-S Pan L Li Y Tang and C Huang ldquoA surveyof performance assessment for multiobjective optimizersrdquo inProceedings of the 4th International Conference on Genetic andEvolutionary Computing (ICGEC rsquo10) pp 341ndash345 December2010

[37] E Zitzler L Thiele M Laumanns C M Fonseca and V GDa Fonseca ldquoPerformance assessment of multiobjective opti-mizers an analysis and reviewrdquo IEEE Transactions on Evolution-ary Computation vol 7 no 2 pp 117ndash132 2003

[38] E Zitzler and L Thiele ldquoMultiobjective evolutionary algo-rithms a comparative case study and the strength Paretoapproachrdquo IEEETransactions onEvolutionaryComputation vol3 no 4 pp 257ndash271 1999

[39] H Li and Q Zhang ldquoMultiobjective optimization problemswith complicated pareto sets MOEAD and NSGA-IIrdquo IEEE

Mathematical Problems in Engineering 17

Transactions on Evolutionary Computation vol 13 no 2 pp284ndash302 2009

[40] Q Zhang A Zhou S Zhao P N Suganthan W Liu andS Tiwari ldquoMultiobjective optimization test instances for theCEC 2009 special session and competitionrdquo Working ReportDepartment of Computing and Electronic Systems Universityof Essex 2008

[41] Q Zhang A Zhou and Y Jin ldquoRM-MEDA a regularity model-based multiobjective estimation of distribution algorithmrdquoIEEE Transactions on Evolutionary Computation vol 12 no 1pp 41ndash63 2008

[42] C-S Tsou H-H Fang H-H Chang and C-H Kao ldquoAnimproved particle swarmPareto optimizerwith local search andclusteringrdquo in Simulated Evolution and Learning vol 4247 ofLecture Notes in Computer Science pp 400ndash407 2006

[43] U Wickramasinghe and X Li ldquoChoosing leaders for multiob-jective PSO algorithms using differential evolutionrdquo in Proceed-ing of the 7th International Conference Simulated Evolution andLearning vol 5361 of Lecture Notes in Computer Science pp249ndash258 Springer Berlin Germany 2008

[44] T Krink J S Vesterstrom and J Riget ldquoParticle swarm opti-misation with spatial particle extensionrdquo in Proceedings of theCongress on Evolutionary Computation (CEC rsquo02) pp 1474ndash1479 May 2002

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 5: Research Article Multiobjective Particle Swarm

Mathematical Problems in Engineering 5

population paretoupdate(offspring pareto)Input offspring indicates the offsprings after performing the crossover operator

and mutation operator pareto indicates the non-dominated solutionsOutput pareto indicates the non-dominated solutionsoffspringlarr call paretocreate(offspring)for forall chr1 isin offspring

nondominatedlarr truefor forall chr2 isin paretoif chr1 dominate chr2paretolarr pareto minus chr2

else if chr2 dominate chr1nondominatedlarr false

elsecontinue

end ifend forif nondominatedparetolarr pareto cup chr1

end ifend forreturn pareto

Algorithm 3 Pseudocode of updating elitist

Step 3 Based on each of fitness functions evaluate the qualityof the 119879 points Assume the remainder 119879

11198630is 1198770 For the

first 1198770fitness functions select the best lceil119879

11198630rceil points for

the rest of fitness functions select the best lfloor11987911198630rfloor points

Overall a total of 1198791points are selected The objectives of

these selected points are correspondingly stored into fit119881

33 Selection Mechanism for Gbest Based on PAM InMOPSO 119866119887119890119904119905 plays an important role in guiding the entireswarm toward the global Pareto front [24] In contrast tothe single objective PSO having only one global best 119866119887119890119904119905MOPSO has multiple Pareto optimality solutions which arenondominated each other How to select a suitable 119866119887119890119904119905 foreach of particles fromPareto optimal solutions is one very keyissue

The paper presents the selection mechanism for 119866119887119890119904119905based on PAM as follows

Algorithm C Assume the population and the numbers ofparticles are denoted as pop and119873pop and the Pareto opti-mality and the numbers of Pareto optimality are denoted aspareto and119873119875

Step 1 Acquire the number of cluster according to the fol-lowing formula

119870 = 119870min + (119870max minus 119870min) sdot (119905

max119866) (7)

where 119870min and 119870max respectively denote the minimal andmaximal value of 119870 namely 119870 isin [119870min 119870max] 119905 and max119866indicate the 119905th iteration and the maximal iteration numberThe formula can acquire the linearly increasing number ofcluster so as to accord with the process from the coarse searchto the elaborate search

Step 2 If 119873119875 le 119870 then for each particle in pop find thenearest Pareto optimal solution as its 119866119887119890119904119905 Otherwise turnto Step 3

Step 3 Perform PAM described in Section 23 to partitionPareto and pop into 119870 clusters respectively the clustercentroids of which are denoted as119862

1= 11988811 11988812 119888

1119870 and

1198622= 11988821 11988822 119888

2119870

Step 4 For each 1198882119895isin 1198622 find the nearest 119888

1119894isin 1198621 For

all the particles in the cluster represented by 1198882119895 their 119866119887119890119904119905

randomly takes one of Pareto optimal solutions in the clusterrepresented by 119888

1119894

34 Adjustment of Pareto Optimal Solutions Based on theUniform Design and PAM The number of Pareto optimalsolutions in the external archivewill increasingly enlargewithevolution namely the size of the external archive will becomevery large with the iteration number This will increasethe computation complexity and the execution time of analgorithmTherefore the size of the external archive must becontrolled In the meanwhile Pareto optimal solutions in theexternal archive are possibly close to each other in the objec-tive space If the solutions are close to each other they arenearly the same choice It is desirable to find the Pareto opti-mal solutions scattered uniformly over the Pareto front sothat the decision maker can have a variety of choices There-fore how to control the size of the external archive and selectrepresentative Pareto optimal solutions scattered uniformlyover the Pareto front is a key issue

The paper presents the adjustment algorithm of Paretooptimal solutions based on the uniform design and PAMThe algorithm firstly implements PAM to partition thePareto front into 119870 clusters in the objective space and then

6 Mathematical Problems in Engineering

implements the uniform crossover operator on the minimalcluster so as to generate more Pareto optimality in theminimal cluster finally it keeps all the points in the lesserclusters and discards some points in the larger clusters Thedetailed steps of the algorithm are shown as follows

Algorithm DAssume the Pareto optimality their values of119872functions and the numbers of Pareto optimality are denotedas pareto 119865pareto and119873119875 The size of the external archive isassumed as119873119875 lim The number of clusters is 119870

Step 1 If the numbers of data points in the maximal andthe minimal clusters have too many differences or are bothvery small select all the data points from the minimal clusterand the same number of points from the cluster which isthe nearest to the minimal cluster For each pair of datapoints perform the uniform crossover operator described inSection 243 Update pareto 119865pareto and 119873119875 according toAlgorithm 3

Step 2 If 119873119875 gt 119873119875 lim implement PAM to partition the119865pareto into119870 clusters in the objective space and let119873 div =119873119875 lim 119870 it indicates the average points to select from eachof119870 clustersWe can classify three situations according to thenumber of points in each cluster and119873 div as follows

(i) Situation 1 the numbers of points in all clusters arelarger than or equal to119873 div

(ii) Situation 2 the number of data points is larger thanor equal to119873 div only in one cluster

(iii) Situation 3 the number of the clusters in which thenumber of data points is larger than 119873 div is largerthan 1 and less than119870

Step 3 For Situation 1 sort all clusters according to their con-taining points in ascending order Select 119873 div points fromthe previous 119870 minus 1 clusters and select the remainder points119873119875 limminus119873 div lowast(119870 minus 1) from the last cluster Turn to Step7

Step 4 For Situation 2 keep the points of all the clusters inwhich the number of the points is less than or equal to119873 divand the remainder points are selected from that only clusterTurn to Step 7

Step 5 For Situation 3 keep all the points of all the clustersin which the number of points is less than or equal to119873 divTurn to Step 6 to select the remainder point

Step 6 Recalculate 119873 div and let 119873 div = rem1198751198701015840 whererem119875 and 1198701015840 indicate the number of the remainder pointsand clusters According to the new 119873 div turn to Step 3 orStep 4

Step 7 Save the selected 119873119875 lim points and terminate thealgorithm

35 Thoughts on the Proposed Algorithm In MOP for 119872objectives 119891

1 1198912 119891

119872and any two points 119909 and 119910 in the

feasible solution space Ω if each objective is satisfied with119891119894(119909) le 119891

119894(119910) and at least one objective is satisfied with

119891119894(119909) lt 119891

119894(119910) namely 119909 is at least as good as 119910with respect to

all the objectives and 119909 is strictly better than 119910 with respectto at least one objective then we say that 119909 dominates 119910 Ifno other solution is strictly better than 119909 then 119909 is called anondominated solution or Pareto optimal solution

In single objective problems there is only one objective tobe optimized Therefore the global best (119866119887119890119904119905) of the wholeswarm is only one In multiobjective problems there aremultiple consistent or conflicting objectives to be optimizedTherefore there exist very large or infinite numbers of solu-tions which cannot dominate each other These solutions arenondominated solutions or Pareto optimal solutions There-fore 119866119887119890119904119905 of the whole swarm is more than one How toselect suitable 119866119887119890119904119905 is a very key issue

When it is not possible to find all these solutions it maybe desirable to find as many solutions as possible in order toprovide more choices to the decision maker However if thesolutions are close to each other they are nearly of the samechoice It is desirable to find the Pareto optimal solutionsscattered uniformly over the Pareto front so that the decisionmaker can have a variety of choices The paper introducesuniform design to ensure that the acquired solutions scatteruniformly over the Pareto front in the objective space

For the MOPSO the diversity of the population is a veryimportant factor It has a key impact on the convergence ofan algorithm and uniformly distribution of Pareto optimalsolution It can effectively get rid of the premature of thealgorithmsThe paper introduces PAM clustering algorithmsto maintain the diversity of the population The particles inthe same cluster have similar features whereas ones in differ-ent clusters have dissimilar features Thus choosing particlesfrom different clusters may necessarily increase the diversityof the population

36 Steps of the Proposed Algorithm

Step 1 According to the population size 119873pop determinethe number of subintervals 119878 and the population size 119876

0in

each subinterval such that 1198760lowast 119878 is more than 119873pop 119876

0is

a prime and must exist in Table 1 Execute Algorithm A inSection 242 to generate a temporary population containing1198760lowast 119878 ge 119873pop points

Step 2 PerformAlgorithm B described in Section 32 to selectthe best 119873pop points from the 119876

0lowast 119878 points as the initial

population pop and acquire their objectives fit119881

Step 3 Initialize the speed and119875119887119890119904119905 of the particle 119894 by119881[119894] =pop[119894] and 119875119887119890119904119905[119894] = pop[119894] 119891119875119887119890119904119905[119894] = fit119881[119894] where 119894 =1 sdot sdot sdot 119873pop 119891119875119887119890119904119905[119894] denotes objectives of 119875119887119890119904119905[119894]

Step 4 According to Algorithm 2 select elitist or Paretooptimal solutions from pop and store them to the externalsecondary list pareto the size of which is assumed as119873119875

Step 5 Choose the suitable119866119887119890119904119905 for each of the particles frompareto in terms of Algorithm C described in Section 33

Mathematical Problems in Engineering 7

Step 6 Update the position pop and velocity 119881 for each ofthe particles respectively using formulas (2) and (3) whereformula (2) is modified as follows

V119894119895 (119896 + 1) = 119908 sdot V119894119895 (119896)

+ 1198881sdot 1199031sdot (119875119887119890119904119905

119894119895 (119896) minus 119909119894119895 (119896))

+ 1198882times 1199032times (119866119887119890119904119905

119894119895 (119896) minus 119909119894119895 (119896))

(8)

Step 7 For the 119895th dimensional variable of the 119894th particle ifit goes beyond its lower boundary or upper boundary then ittakes the 119895th dimensional value of its corresponding bound-ary and the 119895th dimensional value of its velocity takes theopposite value

Step 8 Calculate each of119872 objectives for each of the particlesand update them into fit119881

Step 9 Update 119875119887119890119904119905 of each particle as followsFor the 119894th particle if fit119881[119894] dominates 119891119875119887119890119904119905[119894] then

let 119891119875119887119890119904119905[119894] = fit119881[119894] and 119875119887119890119904119905[119894] = pop[119894] otherwise119875119887119890119904119905[119894] and119891119875119887119890119904119905[119894] are kept unchanged If neither of themis dominated by the other then randomly select one of themas 119891119875119887119890119904119905[119894] and update its corresponding 119875119887119890119904119905[119894]

Step 10 Update the external archive pareto storing Paretooptimality and its size119873119875 according to Algorithm 3

Step 11 Implement Algorithm D described in Section 34 toadjust the Pareto optimal solutions such that the number ofPareto optimal solutions is less than or equal to the size of theexternal archive119873119875 lim

Step 12 If the stop criterion is satisfied terminate algorithmotherwise turn to Step 5 and continue

4 Numerical Results

In order to evaluate the performances of the proposedalgorithm we compare it with two outstanding algorithmsUMOGA [23] and NSGA-II [25]

41 Test Problems Several well-known multiobjective testfunctions are used to test the performance of the proposedalgorithm

Biobjective problem FON [25 26] KUR [25 27]ZDT1 ZDT2 ZDT3 and ZDT6 [25 28ndash31]Three-objective problems DTLZ1 and DTLZ [2 32]

The definitions of them are as followsFON is defined as

1198911 (119909) = 1 minus exp(minus

3

sum119894=1

(119909119894minus1

radic3)2

)

1198912 (119909) = 1 minus exp(minus

3

sum119894=1

(119909119894+1

radic3)2

)

(9)

where 119909119894isin [minus4 4]

This test function has a nonconvex Pareto optimal frontKUR is defined as

1198911 (119909) =

119899minus1

sum119894=1

(minus10 exp (minus02 sdot radic1199092119894+ 1199092119894+1))

1198912 (119909) =

119899

sum119894=1

(1003816100381610038161003816119909119894100381610038161003816100381608+ 5 sdot sin1199093

119894)

(10)

where 119899 = 3 and 119909119894isin [minus5 5]

This test function has a nonconvex Pareto optimal frontin which there are three discontinuous regions

ZDT1 is defined as1198911(1199091) = 1199091

119892 (1199092 119909

119899) = 1 +

9sum119899119894=2119909119894

(119899 minus 1)

ℎ (1198911 119892) = 1 minus radic

1198911

119892

(11)

where 119899 = 30 and 119909119894isin [0 1]

This test function has a convex Pareto optimal frontZDT2 is defined as

1198911(1199091) = 1199091

119892 (1199092 119909

119899) = 1 +

9sum119899119894=2119909119894

(119899 minus 1)

ℎ (1198911 119892) = 1 minus (

1198911

119892)2

(12)

where 119899 = 30 and 119909119894isin [0 1]

This test function has a nonconvex Pareto optimal frontZDT3 is defined as

1198911(1199091) = 1199091

119892 (1199092 119909

119899) = 1 +

9sum119899119894=2119909119894

(119899 minus 1)

ℎ (1198911 119892) = 1 minus radic

1198911

119892minus (

1198911

119892) sin (10120587119891

1)

(13)

where 119899 = 30 and 119909119894isin [0 1]

This test function represents the discreteness feature ItsPareto optimal front consists of several noncontiguous con-vex parts The introduction of the sine function in ℎ causesdiscontinuity in the Pareto optimal front

ZDT6 is defined as

1198911(1199091) = 1 minus exp (minus4119909

1) sin6 (6120587119909

1)

119892 (1199092 119909

119899) = 1 + 9(

sum119899119894=2119909119894

(119899 minus 1))025

ℎ (1198911 119892) = 1 minus (

1198911

119892)2

(14)

where 119899 = 10 and 119909119894isin [0 1]

8 Mathematical Problems in Engineering

This test function has a nonconvex Pareto optimal frontIt includes two difficulties caused by the nonuniformityof the search space Firstly Pareto optimal solutions arenonuniformly distributed along the global Pareto optimalfront (the front is biased for solutions in which 119891

1(1199091) is near

one) Secondly the density of the solutions is the lowest nearthe Pareto optimal front and the highest away from the front

DTLZ1 is defined as

1198911 (119909) = 0511990911199092 (1 + 119892 (119909))

1198912 (119909) = 051199091 (1 minus 1199092) (1 + 119892 (119909))

1198913 (119909) = 05 (1 minus 1199091) (1 + 119892 (119909))

119892 (119909) = 100

sdot [5 +119899

sum119894=3

((119909119894minus 05)

2minus cos (20120587 (119909

119894minus 05)))]

(15)

where 119899 = 7 and 119909119894isin [0 1]

This test function has difficulty in the convergence to thePareto optimal hyperplane and contains (115minus1) local Paretooptimal fronts

DTLZ2 is defined as

1198911 (119909) = (1 + 119892 (119909)) cos (051205871199091) cos (051205871199092)

1198912 (119909) = (1 + 119892 (119909)) cos (051205871199091) sin (051205871199092)

1198913 (119909) = (1 + 119892 (119909)) sin (051205871199091)

119892 (119909) =119899

sum119894=3

(119909119894minus 05)

2

(16)

where 119899 = 12 and 119909119894isin [0 1]

The Pareto optimal solutions of this test function must lieinside the first octant of the unit sphere in a three-objectiveplot It is more difficult than DTLZ1

42 Parameter Values The parameter values of the proposedalgorithm UKMOPSO are adopted as follows

(i) Parameters for PSO the linearly descending inertiaweight [33 34] is within the interval [01 1] and theacceleration coefficients 119888

1and 1198882are both taken as 2

(ii) Parameter for PAM the minimal and maximal num-bers of clusters are 119870min = 2 and119870max = 10

(iii) Population size the population size119873pop is 200(iv) Parameters for the uniform design the number of

subintervals 119878 is 64 the number of the sample pointsor the population size of each subinterval 119876

0is 31 set

1198630= 31 and 119876

1= 5

(v) Stopping condition the algorithm terminates if thenumber of iterations is larger than the given maximalgenerations of 20

All the parameter values in UMOGA [23] are set equalto the original values in [23] the different and additional

parameter values between UMOGA and UKMOPSO are asfollows the number of subintervals 119878 is 16 119865 = 119873 (thenumber of variables)119863

1= 7 119901

119898= 002

NSGA-II in [25] adopted a population size of 100 acrossover probability of 08 a mutation probability of 1119899(where 119899 is the number of variables) and maximal genera-tions of 250 In order to make the comparisons fair we haveused population size of 100 and maximal generations of 40 inNSGA-II so that the total number of function evaluations inNSGA-II UKMOPSO and UMOGA is the same

43 Performance Metric Based on the assumption that thetrue Pareto front of a test problem is known many kinds ofperformance metrics have been proposed and used by manyresearchers such as [2 9 24 25 29 35ndash40] Three of themare adopted in the paper to compare the performance of theproposed algorithm with them The first metric is 119862 metric[29 38] It is taken as the quantitative metric of the solutionquality and often used to show that the outcomes of one algo-rithm dominating the outcomes of another algorithm It isdefined as

119862 (119860 119861) =|119887 isin 119861 exist119886 isin 119860 119886 ≺ 119887|

|119861| (17)

where 119886 ≺ 119887 represents 119887 being dominated by 119886 119860 representsthe Pareto front obtained by Algorithm 119860 and 119861 representsthe Pareto front obtained by Algorithm 119861 in a typical runFurthermore |119861| represents the number of elements of 119861

The metric value 119862(119860 119861) = 1 means that all points in 119861are dominated by or equal to points in 119860 In contrast 119862(119860119861) = 0means that none of the points in 119861 are covered by theones in 119860

The second metric is the IGD metric (namely InvertedGenerational Distance) [39ndash41] It is the mean value of thedistances from the set of solutions uniformly distributed inthe true Pareto front to the set of solutions obtained by analgorithm in objective space Let 119875lowast be a set of uniformly dis-tributed points along the true PF (Pareto Front) Let 119860 be anapproximate set to the PF the average distance from119875lowast to119860 isdefined as

IGD (119860 119875lowast) =sumVisin119875lowast 119889 (V 119860)

|119875lowast| (18)

where 119889(V 119860) is the minimum Euclidean distance between Vand the points in 119860 If |119875lowast| is large enough to represent thePF very well IGD(119860 119875lowast) could measure both the diversityand convergence of 119860 in a sense To have a low value ofIGD(119860 119875lowast) the set119860must be very close to the PF and cannotmiss any part of the whole PF

The smaller the IGDvalue for the set of obtained solutionsis the better the performance of the algorithm will be

The third measure is the measure of maximum spread(MS) [2 24 35] which is proposed in [42 43] It canmeasure how well the true Pareto front (PFtrue) is coveredby the discovered Pareto front (PFknown) through hyperboxes

Mathematical Problems in Engineering 9

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1100

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

152

True Pareto frontUMOGA

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

140

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

f2(x)

f1(x)

f2(x)

Figure 1 Pareto fronts obtained by different algorithms for test problem FON

formed by the extreme function values observed in the PFtrueand PFknown It is defined as

MS = radic 1

119872

119872

sum119894=1

120575119894

120575119894= (

min (119891max119894 119865max119894) minusmax (119891min

119894 119865min119894)

119865max119894

minus 119865min119894

)

2

(19)

where 119872 is the number of objectives 119891max119894

and 119891min119894

arethe maximum and minimum values of the 119894th objective inPFknown respectively and 119865

max119894

and 119865min119894

are the maximumand minimum values of the 119894th objective in PFtrue respec-tively

Note that if 119891min119894

ge 119865max119894

then 120575119894= 0 Algorithms with

largerMS values are desirable andMS = 1means that the truePareto front is totally covered by the obtained Pareto front

44 Results For each test problem we perform the proposedalgorithm (called UKMOPSO) for 30 independent runs andcompare its performance with UMOGA [23] and NSGA-II[25] The values of several metrics the 119862metric IGDmetricand MS metric are shown in Tables 2 3 4 and 5 The Paretofronts obtained by several algorithms implemented on severaltest functions are illustrated in Figures 1ndash6

For brevity 119862(UKMOPSOUMOGA) 119862(UMOGAUKMOPSO) 119862(UKMOPSONSGA-II) and 119862(NSGA-IIUKMOPSO) are respectively marked as 119862

12 11986221 11986213 and

11986231

10 Mathematical Problems in Engineering

0100

UKMOPSOTrue Pareto front

14

UMOGATrue Pareto front

40

NSGA-IITrue Pareto front

f1(x)

f2(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14

f1(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14 minus13 minus12minus12

minus10

minus8

minus6

minus4

minus2

0

f2(x)

f1(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14minus12

minus10

minus8

minus6

minus4

minus2

0

f2(x)

minus12

minus10

minus8

minus6

minus4

minus2

Figure 2 Pareto fronts obtained by different algorithms for test problem KUR

Table 2 Comparison of C metric between UKMOPSO andUMOGA

C (UKMOPSO UMOGA) C (UMOGA UKMOPSO)FON 0769 001KUR 0929 001ZDT1 012 013ZDT2 0063 003ZDT3 0139 0098ZDT6 0063 001DTLZ1 1 0DTLZ2 1 0

Table 3 Comparison of Cmetric between UKMOPSO and NSGA-II

C (UKMOPSO NSGA-II) C (NSGA-II UKMOPSO)FON 0025 001KUR 0075 011ZDT1 1 0ZDT2 1 0ZDT3 1 0ZDT6 1 0DTLZ1 0 013DTLZ2 0025 006

Mathematical Problems in Engineering 11

0

02

04

06

08

1

12

14100 76

0

05

1

15

2

25

325

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

f1(x)

f2(x)

f2(x)

Figure 3 Pareto fronts obtained by different algorithms for test problem ZDT1

Table 4 Comparison of IGD metric

UKMOPSO UMOGA NSGA-IIFON 00072 00432 03707KUR 00668 11576 12323ZDT1 00067 00265 13933ZDT2 00067 00727 21409ZDT3 01964 02151 02816ZDT6 00033 00139 59223DTLZ1 74827 22965 54182DTLZ2 01123 04319 02724

Table 5 Comparison of maximum spread (MS) metric

UKMOPSO UMOGA NSGA-IIFON 09769 09754 02368KUR 09789 09580 03414ZDT1 09429 09984 05495ZDT2 09999 09986 00019ZDT3 09276 09275 07963ZDT6 1 09974 03738DTLZ1 1 1 06588DTLZ2 1 1 07125

12 Mathematical Problems in Engineering

0

02

04

06

08

1

12

14100 9

0

05

1

15

2

25

38

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x) f1(x)

f2(x)

f2(x)

f2(x)

f1(x)

Figure 4 Pareto fronts obtained by different algorithms for test problem ZDT2

As shown in Table 2 for the test functions Fon and KUR11986212= 0769 and 119862

12= 0929 mean that 768 and 929 of

the solutions obtained by UMOGA are dominated by thoseobtained by UKMOPSO Similarly 119862

21= 001means that 1

of the solutions obtained by UKMOPSO are dominated bythose obtained by UMOGA For DTLZ1 and DTLZ2119862

12= 1

and 11986221= 0 mean that all solutions obtained by UMOGA

are dominated by those obtained by UKMOPSO but noneof solutions from UKMOPSO is dominated by those fromUMOGA It means that the solution quality via UKMOPSOis much better than that via UMOGA for the above testfunctions For ZDT1 ZDT2 ZDT3 and ZDT6 the valuesof the 119862 metrics do not have too many differences betweenUKMOPSO and UMOGA It means the solution qualities viaboth are almost identical

FromTable 3 we can see that for ZDT1 ZDT2 ZDT3 andZDT6 119862

13= 1 and 119862

31= 0mean that all solutions obtained

byNSGA-II are dominated by those obtained byUKMOPSObut none of solutions from UKMOPSO is dominated bythose from NSGA-II It means that the solution quality viaUKMOPSO is much better than that via NSGA-II for theabove test functions For the rest of the test functions thesolution qualities via both are almost identical

In Table 4 for all test problems the IGD values ofUKMOPSO are the smallest among UKMOPSO UMOGAand NSGA-II This means the PF found by UKMOPSO isthe nearest to the true PF compared with the PF obtainedby the other two algorithms namely the performance ofUKMOPSO is the best in the three algorithms The IGDvalues of UMOGA are all larger than those of NSGA-II for

Mathematical Problems in Engineering 13

0

02

04

06

08

1 92 86

0

05

1

15

2

25

3

35

4

45 40

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

f2(x)

minus08

minus06

minus04

minus02

0

02

04

06

08

1

f2(x)

f2(x)

minus08

minus05

minus06

minus04

minus02

UKMOPSO UMOGA

NSGA-II

Figure 5 Pareto fronts obtained by different algorithms for test problem ZDT3

all test functions It means that the PF found by UMOGA iscloser to the true PF than the PF obtained by NSGA-II

From Table 5 we can see that all the MS values ofUKMOPSO are almost close to 1 for each test function Itmeans that almost all the true PF is totally covered by the PFobtained by UKMOPSO UMOGA is similar to UKMOPSOThe MS values of NSGA-II are much lesser than UKMOPSOand UMOGA especially MS = 00019 for ZDT2

Figures 1ndash6 all demonstrate that the PF found byUKMOPSO is the nearest to the true PF and scatters mostuniformly in those obtained by three algorithms Most of thepoints in the PF found by UKMOPSO overlap the pointsin the true PF It means the solution quality obtained byUKMOPSO is very high

45 Influence of the Uniform Crossover and PAM In orderto find out the influence of the uniform crossover on the

proposed algorithm we compare the distribution and num-ber of Pareto optimal solutions before and after performingthe uniform crossover One of the simulating results on thetest function ZDT1 is shown in Figure 7

From Figure 7 it can be seen that before and afterperforming the uniform crossover the number of data pointsin the 1st cluster and the 2nd cluster varies from 7 to 15and from 19 to 26 respectively namely many new Paretooptimal solutions having not been found before performingthe uniform crossover are generated and the differences ofthe data points between two clusters have been decreasedThis will directly influence the uniformity of the PF andacquire more uniform Pareto solutions

PAM is used to determine which Pareto solutions are tobe removed fromor be inserted into the external archiveThisis to maintain the diversity of Pareto optimal solutions The

14 Mathematical Problems in Engineering

100 16

0

05

1

15

2

25

3

35

4

45

540

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUKMOPSO

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

f2(x)

f2(x)

f1(x)

Figure 6 Pareto fronts obtained by different algorithms for test problem ZDT6

diversity can be computed by the ldquodistance-to-average-pointrdquomeasure [44] defined as

diversity (119878) = 1

|119878|

|119878|

sum119894=1

radic119873

sum119895=1

(119901119894119895minus 119901119895)2

(20)

where 119878 is the population |119878| is the swarm size 119873 is thedimensionality of the problem 119901

119894119895is the 1198951015840th value of the 1198941015840th

particle and 119901119895is the 1198951015840th value of the average point 119901

If the number of Pareto optimality is less than or equalto the size of the external archive PAM has no influenceon the proposed algorithm Otherwise it is used to selectthe different type of Pareto optimality from several clustersTherefore The diversity of Pareto optimality will certainlyincreaseWemonitor the diversity of Pareto optimality before

and after performing PAMonZDT1 at a certain timeThe val-ues are respectively 8762 and 11752 This fully demonstratesthat PAM can improve the diversity of Pareto optimality

5 Conclusion and Future Work

In this paper a multiobjective particle swarm optimizationbased on PAM and uniform design is presented It firstlyimplements PAM to partition the Pareto front into119870 clustersin the objective space and then it implements the uniformcrossover operator on the minimal cluster so as to generatemore Pareto optimality in the minimal cluster When thesize of the Pareto solution is larger than that of the externalarchive PAM is used to determine which Pareto solutions areto be removed from or be inserted into the external archive

Mathematical Problems in Engineering 15

0 01 02 03 04 05 06 07 08 09

19

10

01

02

03

04

05

06

077

08

09

1

The 1st clusterThe 2nd cluster

26

(a)

41

0 01 02 03 04 05 06

26

07 08 09 10

01

02

03

04

051506

07

08

09

1

The 1st clusterThe 2nd cluster

(b)

Figure 7 Pareto optimal solutions before (a) and after (b) performing the uniform crossover

Finally it keeps all the points in the lesser clusters anddiscards some points in the larger clusters This can ensurethat each of the clusters will contain approximately the samedata points Therefore the diversity of the Pareto solutionswill increase and they can scatter uniformly over the Paretofront The results of the experimental simulation performedon several well-known test problems indicate that theproposed algorithm obviously outperforms the other twoalgorithms

This algorithm is going on for further enhancement andimprovement One attempt is to use a more efficient orapproximate clustering algorithm to speed up the executiontime of this algorithm Another attempt is to extend its appli-cation scopes

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This research was supported by Guangxi Natural ScienceFoundation (no 2013GXNSFAA019337) Guangxi Universi-ties Key Project of Science and Technology Research (noKY2015ZD099) Key project of Guangxi Education Depart-ment (no 2013ZD055) Scientific Research Staring Founda-tion for the PHD Scholars of Yulin Normal University (noG2014005) Special Project of Yulin Normal University (no2012YJZX04) and Key Project of Yulin Normal University(no 2014YJZD05)The authors are grateful to the anonymousreferee for a careful checking of the details and for helpfulcomments that improved this paper

References

[1] J D Schaffer ldquoMultiple objective optimization with vector eval-uated genetic algorithmsrdquo in Proceedings of the 1st InternationalConference on Genetic Algorithms pp 93ndash100 1985

[2] L Tang and X Wang ldquoA hybrid multiobjective evolutionaryalgorithm for multiobjective optimization problemsrdquo IEEETransactions on Evolutionary Computation vol 17 no 1 pp 20ndash45 2013

[3] K Deb S Agrawal A Pratap and T Meyarivan ldquoA fast elitistnon-dominated sorting genetic algorithm for multi-objectiveoptimizationNSGA-IIrdquo inParallel ProblemSolving fromNaturePPSN VI vol 1917 of Lecture Notes in Computer Science pp849ndash858 Springer Berlin Germany 2000

[4] J Zhang Y Wang and J Feng ldquoAttribute index and uniformdesign based multiobjective association rule mining with evo-lutionary algorithmrdquo The Scientific World Journal vol 2013Article ID 259347 16 pages 2013

[5] S Jiang and Z Cai ldquoFaster convergence and higher hypervol-ume for multi-objective evolutionary algorithms by orthogonaland uniform designrdquo in Advances in Computation and Intelli-gence vol 6382 of Lecture Notes in Computer Science pp 312ndash328 Springer Berlin Germany 2010

[6] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995

[7] R Eberhart and J Kennedy ldquoA new optimizer using particleswarm theoryrdquo in Proceedings of the 6th International Sympo-sium on Micro Machine and Human Science pp 39ndash43 IEEENagoya Japan October 1995

[8] L Yang ldquoA fast and elitist multi-objective particle swarm algo-rithm NSPSOrdquo in Proceedings of the IEEE International Con-ference on Granular Computing (GRC rsquo08) pp 470ndash475 August2008

[9] C A C Coello G T Pulido and M S Lechuga ldquoHandlingmultiple objectives with particle swarm optimizationrdquo IEEE

16 Mathematical Problems in Engineering

Transactions on Evolutionary Computation vol 8 no 3 pp256ndash279 2004

[10] M Clerc and J Kennedy ldquoThe particle swarm-explosion sta-bility and convergence in a multidimensional complex spacerdquoIEEE Transactions on Evolutionary Computation vol 6 no 1pp 58ndash73 2002

[11] H Xiaohui and R Eberhart ldquoMultiobjective optimization usingdynamic neighborhood particle swarm optimizationrdquo in Pro-ceedings of the Congress on Evolutionary Computation (CEC rsquo02)vol 2 pp 1677ndash1681 May 2002

[12] S Mostaghim and J Teich ldquoStrategies for finding goodlocal guides in multi-objective particle swarm optimization(MOPSO)rdquo in Proceedings of the IEEE Swarm IntelligenceSymposium (SIS rsquo03) pp 26ndash33 Indianapolis Ind USA April2003

[13] T Bartz-Beielstein P Limbourg J Mehnen K Schmitt K EParsopoulos and M N Vrahatis ldquoParticle swarm optimizersfor Pareto optimizationwith enhanced archiving techniquesrdquo inProceedings of the Congress on Evolutionary Computation (CECrsquo03) vol 3 pp 1780ndash1787 December 2003

[14] K E Parsopoulos D K Tasoulis and M N Vrahatis ldquoMulti-objective optimization using parallel vector evaluated particleswarm optimizationrdquo in Proceedings of the IASTED Interna-tional Conference on Artificial Intelligence and Applications (AIArsquo04) vol 2 pp 823ndash828 February 2004

[15] S-J Tsai T-Y Sun C-C Liu S-T Hsieh W-C Wu and S-YChiu ldquoAn improved multi-objective particle swarm optimizerformulti-objective problemsrdquo Expert Systems with Applicationsvol 37 no 8 pp 5872ndash5886 2010

[16] A A Mousa M A El-Shorbagy and W F Abd-El-WahedldquoLocal search based hybrid particle swarm optimization algo-rithm for multiobjective optimizationrdquo Swarm and Evolution-ary Computation vol 3 pp 1ndash14 2012

[17] J Wang W Liu W Zhang and B Yang ldquoMulti-objective parti-cle swarm optimization based on self-update and grid strategyrdquoin Proceedings of the 2012 International Conference on Informa-tion Technology and Software Engineering vol 211 of LectureNotes in Electrical Engineering pp 869ndash876 Springer BerlinGermany 2013

[18] K Khalili-Damghani A-R Abtahi and M Tavana ldquoA newmulti-objective particle swarmoptimizationmethod for solvingreliability redundancy allocation problemsrdquo Reliability Engi-neering amp System Safety vol 111 pp 58ndash75 2013

[19] J Zhang Y Wang and J Feng ldquoParallel particle swarm opti-mization based on PAMrdquo in Proceedings of the 2nd InternationalConference on Information Engineering and Computer Science(ICIECS rsquo10) Wuhan China December 2010

[20] J Han and M Kamber Data Mining Concepts and TechniquesMorgan Kaufmann 2006

[21] L F Ibrahim ldquoUsing of clustering algorithm CWSP-PAM forrural network planningrdquo in Proceedings of the 3rd InternationalConference on Information Technology and Applications (ICITArsquo05) vol 1 pp 280ndash283 IEEE July 2005

[22] P Chopra J Kang and J Lee ldquoUsing gene pair combinationsto improve the accuracy of the PAM classifierrdquo in Proceedingsof the IEEE International Conference on Bioinformatics andBiomedicine (BIBM rsquo09) pp 174ndash177 November 2009

[23] Y-W Leung and Y Wang ldquoMultiobjective programming usinguniform design and genetic algorithmrdquo IEEE Transactions onSystems Man and Cybernetics Part C Applications and Reviewsvol 30 no 3 pp 293ndash304 2000

[24] D Liu K C Tan C K Goh and W K Ho ldquoA multiobjectivememetic algorithm based on particle swarm optimizationrdquoIEEE Transactions on Systems Man and Cybernetics Part BCybernetics vol 37 no 1 pp 42ndash50 2007

[25] K Deb A Pratap S Agarwal and T Meyarivan ldquoA fast andelitist multiobjective genetic algorithm NSGA-IIrdquo IEEE Trans-actions on Evolutionary Computation vol 6 no 2 pp 182ndash1972002

[26] C M Fonseca and P J Fleming ldquoMultiobjective optimizationandmultiple constraint handling with evolutionary algorithmsII Application examplerdquo IEEE Transactions on Systems Manand Cybernetics Part A Systems and Humans vol 28 no 1 pp38ndash47 1998

[27] F Kursawe ldquoA variant of evolution strategies for vector opti-mizationrdquo in Parallel Problem Solving from Nature vol 496 ofLecture Notes in Computer Science pp 193ndash197 Springer BerlinGermany 1991

[28] E Zitzler K Deb and LThiele ldquoComparison of multiobjectiveevolutionary algorithms empirical resultsrdquo Evolutionary Com-putation vol 8 no 2 pp 173ndash195 2000

[29] H-L Liu Y Wang and Y-M Cheung ldquoA Multi-objective evo-lutionary algorithm usingmin-max strategy and sphere coordi-nate transformationrdquo Intelligent Automation amp Soft Computingvol 15 no 3 pp 361ndash384 2009

[30] N Chakraborti B S Kumar V S Babu S Moitra andA Mukhopadhyay ldquoA new multi-objective genetic algorithmapplied to hot-rolling processrdquoAppliedMathematicalModellingvol 32 no 9 pp 1781ndash1789 2008

[31] K Deb ldquoMulti-objective genetic algorithms problem difficul-ties and construction of test problemsrdquo Evolutionary Computa-tion vol 7 no 3 pp 205ndash230 1999

[32] K Deb L Thiele M Laumanns and E Zitzler ldquoScalable testproblems for evolutionarymultiobjective optimizationrdquo in Evo-lutionary Multiobjective Optimization Advanced Informationand Knowledge Processing pp 105ndash145 Springer London UK2005

[33] Y Shi and R Eberhart ldquoModified particle swarm optimizerrdquoin Proceedings of the IEEE World Congress on ComputationalIntelligence pp 69ndash73 May 1998

[34] C S Feng S Cong and X Y Feng ldquoA new adaptive inertiaweight strategy in particle swarm optimizationrdquo in Procedingsof the IEEE Congress on Evolutionary Computation (CEC rsquo07)pp 4186ndash4190 IEEE Singapore September 2007

[35] C K Goh K C Tan D S Liu and S C Chiam ldquoA competitiveand cooperative co-evolutionary approach to multi-objectiveparticle swarm optimization algorithm designrdquo European Jour-nal of Operational Research vol 202 no 1 pp 42ndash54 2010

[36] P Cheng J-S Pan L Li Y Tang and C Huang ldquoA surveyof performance assessment for multiobjective optimizersrdquo inProceedings of the 4th International Conference on Genetic andEvolutionary Computing (ICGEC rsquo10) pp 341ndash345 December2010

[37] E Zitzler L Thiele M Laumanns C M Fonseca and V GDa Fonseca ldquoPerformance assessment of multiobjective opti-mizers an analysis and reviewrdquo IEEE Transactions on Evolution-ary Computation vol 7 no 2 pp 117ndash132 2003

[38] E Zitzler and L Thiele ldquoMultiobjective evolutionary algo-rithms a comparative case study and the strength Paretoapproachrdquo IEEETransactions onEvolutionaryComputation vol3 no 4 pp 257ndash271 1999

[39] H Li and Q Zhang ldquoMultiobjective optimization problemswith complicated pareto sets MOEAD and NSGA-IIrdquo IEEE

Mathematical Problems in Engineering 17

Transactions on Evolutionary Computation vol 13 no 2 pp284ndash302 2009

[40] Q Zhang A Zhou S Zhao P N Suganthan W Liu andS Tiwari ldquoMultiobjective optimization test instances for theCEC 2009 special session and competitionrdquo Working ReportDepartment of Computing and Electronic Systems Universityof Essex 2008

[41] Q Zhang A Zhou and Y Jin ldquoRM-MEDA a regularity model-based multiobjective estimation of distribution algorithmrdquoIEEE Transactions on Evolutionary Computation vol 12 no 1pp 41ndash63 2008

[42] C-S Tsou H-H Fang H-H Chang and C-H Kao ldquoAnimproved particle swarmPareto optimizerwith local search andclusteringrdquo in Simulated Evolution and Learning vol 4247 ofLecture Notes in Computer Science pp 400ndash407 2006

[43] U Wickramasinghe and X Li ldquoChoosing leaders for multiob-jective PSO algorithms using differential evolutionrdquo in Proceed-ing of the 7th International Conference Simulated Evolution andLearning vol 5361 of Lecture Notes in Computer Science pp249ndash258 Springer Berlin Germany 2008

[44] T Krink J S Vesterstrom and J Riget ldquoParticle swarm opti-misation with spatial particle extensionrdquo in Proceedings of theCongress on Evolutionary Computation (CEC rsquo02) pp 1474ndash1479 May 2002

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 6: Research Article Multiobjective Particle Swarm

6 Mathematical Problems in Engineering

implements the uniform crossover operator on the minimalcluster so as to generate more Pareto optimality in theminimal cluster finally it keeps all the points in the lesserclusters and discards some points in the larger clusters Thedetailed steps of the algorithm are shown as follows

Algorithm DAssume the Pareto optimality their values of119872functions and the numbers of Pareto optimality are denotedas pareto 119865pareto and119873119875 The size of the external archive isassumed as119873119875 lim The number of clusters is 119870

Step 1 If the numbers of data points in the maximal andthe minimal clusters have too many differences or are bothvery small select all the data points from the minimal clusterand the same number of points from the cluster which isthe nearest to the minimal cluster For each pair of datapoints perform the uniform crossover operator described inSection 243 Update pareto 119865pareto and 119873119875 according toAlgorithm 3

Step 2 If 119873119875 gt 119873119875 lim implement PAM to partition the119865pareto into119870 clusters in the objective space and let119873 div =119873119875 lim 119870 it indicates the average points to select from eachof119870 clustersWe can classify three situations according to thenumber of points in each cluster and119873 div as follows

(i) Situation 1 the numbers of points in all clusters arelarger than or equal to119873 div

(ii) Situation 2 the number of data points is larger thanor equal to119873 div only in one cluster

(iii) Situation 3 the number of the clusters in which thenumber of data points is larger than 119873 div is largerthan 1 and less than119870

Step 3 For Situation 1 sort all clusters according to their con-taining points in ascending order Select 119873 div points fromthe previous 119870 minus 1 clusters and select the remainder points119873119875 limminus119873 div lowast(119870 minus 1) from the last cluster Turn to Step7

Step 4 For Situation 2 keep the points of all the clusters inwhich the number of the points is less than or equal to119873 divand the remainder points are selected from that only clusterTurn to Step 7

Step 5 For Situation 3 keep all the points of all the clustersin which the number of points is less than or equal to119873 divTurn to Step 6 to select the remainder point

Step 6 Recalculate 119873 div and let 119873 div = rem1198751198701015840 whererem119875 and 1198701015840 indicate the number of the remainder pointsand clusters According to the new 119873 div turn to Step 3 orStep 4

Step 7 Save the selected 119873119875 lim points and terminate thealgorithm

35 Thoughts on the Proposed Algorithm In MOP for 119872objectives 119891

1 1198912 119891

119872and any two points 119909 and 119910 in the

feasible solution space Ω if each objective is satisfied with119891119894(119909) le 119891

119894(119910) and at least one objective is satisfied with

119891119894(119909) lt 119891

119894(119910) namely 119909 is at least as good as 119910with respect to

all the objectives and 119909 is strictly better than 119910 with respectto at least one objective then we say that 119909 dominates 119910 Ifno other solution is strictly better than 119909 then 119909 is called anondominated solution or Pareto optimal solution

In single objective problems there is only one objective tobe optimized Therefore the global best (119866119887119890119904119905) of the wholeswarm is only one In multiobjective problems there aremultiple consistent or conflicting objectives to be optimizedTherefore there exist very large or infinite numbers of solu-tions which cannot dominate each other These solutions arenondominated solutions or Pareto optimal solutions There-fore 119866119887119890119904119905 of the whole swarm is more than one How toselect suitable 119866119887119890119904119905 is a very key issue

When it is not possible to find all these solutions it maybe desirable to find as many solutions as possible in order toprovide more choices to the decision maker However if thesolutions are close to each other they are nearly of the samechoice It is desirable to find the Pareto optimal solutionsscattered uniformly over the Pareto front so that the decisionmaker can have a variety of choices The paper introducesuniform design to ensure that the acquired solutions scatteruniformly over the Pareto front in the objective space

For the MOPSO the diversity of the population is a veryimportant factor It has a key impact on the convergence ofan algorithm and uniformly distribution of Pareto optimalsolution It can effectively get rid of the premature of thealgorithmsThe paper introduces PAM clustering algorithmsto maintain the diversity of the population The particles inthe same cluster have similar features whereas ones in differ-ent clusters have dissimilar features Thus choosing particlesfrom different clusters may necessarily increase the diversityof the population

36 Steps of the Proposed Algorithm

Step 1 According to the population size 119873pop determinethe number of subintervals 119878 and the population size 119876

0in

each subinterval such that 1198760lowast 119878 is more than 119873pop 119876

0is

a prime and must exist in Table 1 Execute Algorithm A inSection 242 to generate a temporary population containing1198760lowast 119878 ge 119873pop points

Step 2 PerformAlgorithm B described in Section 32 to selectthe best 119873pop points from the 119876

0lowast 119878 points as the initial

population pop and acquire their objectives fit119881

Step 3 Initialize the speed and119875119887119890119904119905 of the particle 119894 by119881[119894] =pop[119894] and 119875119887119890119904119905[119894] = pop[119894] 119891119875119887119890119904119905[119894] = fit119881[119894] where 119894 =1 sdot sdot sdot 119873pop 119891119875119887119890119904119905[119894] denotes objectives of 119875119887119890119904119905[119894]

Step 4 According to Algorithm 2 select elitist or Paretooptimal solutions from pop and store them to the externalsecondary list pareto the size of which is assumed as119873119875

Step 5 Choose the suitable119866119887119890119904119905 for each of the particles frompareto in terms of Algorithm C described in Section 33

Mathematical Problems in Engineering 7

Step 6 Update the position pop and velocity 119881 for each ofthe particles respectively using formulas (2) and (3) whereformula (2) is modified as follows

V119894119895 (119896 + 1) = 119908 sdot V119894119895 (119896)

+ 1198881sdot 1199031sdot (119875119887119890119904119905

119894119895 (119896) minus 119909119894119895 (119896))

+ 1198882times 1199032times (119866119887119890119904119905

119894119895 (119896) minus 119909119894119895 (119896))

(8)

Step 7 For the 119895th dimensional variable of the 119894th particle ifit goes beyond its lower boundary or upper boundary then ittakes the 119895th dimensional value of its corresponding bound-ary and the 119895th dimensional value of its velocity takes theopposite value

Step 8 Calculate each of119872 objectives for each of the particlesand update them into fit119881

Step 9 Update 119875119887119890119904119905 of each particle as followsFor the 119894th particle if fit119881[119894] dominates 119891119875119887119890119904119905[119894] then

let 119891119875119887119890119904119905[119894] = fit119881[119894] and 119875119887119890119904119905[119894] = pop[119894] otherwise119875119887119890119904119905[119894] and119891119875119887119890119904119905[119894] are kept unchanged If neither of themis dominated by the other then randomly select one of themas 119891119875119887119890119904119905[119894] and update its corresponding 119875119887119890119904119905[119894]

Step 10 Update the external archive pareto storing Paretooptimality and its size119873119875 according to Algorithm 3

Step 11 Implement Algorithm D described in Section 34 toadjust the Pareto optimal solutions such that the number ofPareto optimal solutions is less than or equal to the size of theexternal archive119873119875 lim

Step 12 If the stop criterion is satisfied terminate algorithmotherwise turn to Step 5 and continue

4 Numerical Results

In order to evaluate the performances of the proposedalgorithm we compare it with two outstanding algorithmsUMOGA [23] and NSGA-II [25]

41 Test Problems Several well-known multiobjective testfunctions are used to test the performance of the proposedalgorithm

Biobjective problem FON [25 26] KUR [25 27]ZDT1 ZDT2 ZDT3 and ZDT6 [25 28ndash31]Three-objective problems DTLZ1 and DTLZ [2 32]

The definitions of them are as followsFON is defined as

1198911 (119909) = 1 minus exp(minus

3

sum119894=1

(119909119894minus1

radic3)2

)

1198912 (119909) = 1 minus exp(minus

3

sum119894=1

(119909119894+1

radic3)2

)

(9)

where 119909119894isin [minus4 4]

This test function has a nonconvex Pareto optimal frontKUR is defined as

1198911 (119909) =

119899minus1

sum119894=1

(minus10 exp (minus02 sdot radic1199092119894+ 1199092119894+1))

1198912 (119909) =

119899

sum119894=1

(1003816100381610038161003816119909119894100381610038161003816100381608+ 5 sdot sin1199093

119894)

(10)

where 119899 = 3 and 119909119894isin [minus5 5]

This test function has a nonconvex Pareto optimal frontin which there are three discontinuous regions

ZDT1 is defined as1198911(1199091) = 1199091

119892 (1199092 119909

119899) = 1 +

9sum119899119894=2119909119894

(119899 minus 1)

ℎ (1198911 119892) = 1 minus radic

1198911

119892

(11)

where 119899 = 30 and 119909119894isin [0 1]

This test function has a convex Pareto optimal frontZDT2 is defined as

1198911(1199091) = 1199091

119892 (1199092 119909

119899) = 1 +

9sum119899119894=2119909119894

(119899 minus 1)

ℎ (1198911 119892) = 1 minus (

1198911

119892)2

(12)

where 119899 = 30 and 119909119894isin [0 1]

This test function has a nonconvex Pareto optimal frontZDT3 is defined as

1198911(1199091) = 1199091

119892 (1199092 119909

119899) = 1 +

9sum119899119894=2119909119894

(119899 minus 1)

ℎ (1198911 119892) = 1 minus radic

1198911

119892minus (

1198911

119892) sin (10120587119891

1)

(13)

where 119899 = 30 and 119909119894isin [0 1]

This test function represents the discreteness feature ItsPareto optimal front consists of several noncontiguous con-vex parts The introduction of the sine function in ℎ causesdiscontinuity in the Pareto optimal front

ZDT6 is defined as

1198911(1199091) = 1 minus exp (minus4119909

1) sin6 (6120587119909

1)

119892 (1199092 119909

119899) = 1 + 9(

sum119899119894=2119909119894

(119899 minus 1))025

ℎ (1198911 119892) = 1 minus (

1198911

119892)2

(14)

where 119899 = 10 and 119909119894isin [0 1]

8 Mathematical Problems in Engineering

This test function has a nonconvex Pareto optimal frontIt includes two difficulties caused by the nonuniformityof the search space Firstly Pareto optimal solutions arenonuniformly distributed along the global Pareto optimalfront (the front is biased for solutions in which 119891

1(1199091) is near

one) Secondly the density of the solutions is the lowest nearthe Pareto optimal front and the highest away from the front

DTLZ1 is defined as

1198911 (119909) = 0511990911199092 (1 + 119892 (119909))

1198912 (119909) = 051199091 (1 minus 1199092) (1 + 119892 (119909))

1198913 (119909) = 05 (1 minus 1199091) (1 + 119892 (119909))

119892 (119909) = 100

sdot [5 +119899

sum119894=3

((119909119894minus 05)

2minus cos (20120587 (119909

119894minus 05)))]

(15)

where 119899 = 7 and 119909119894isin [0 1]

This test function has difficulty in the convergence to thePareto optimal hyperplane and contains (115minus1) local Paretooptimal fronts

DTLZ2 is defined as

1198911 (119909) = (1 + 119892 (119909)) cos (051205871199091) cos (051205871199092)

1198912 (119909) = (1 + 119892 (119909)) cos (051205871199091) sin (051205871199092)

1198913 (119909) = (1 + 119892 (119909)) sin (051205871199091)

119892 (119909) =119899

sum119894=3

(119909119894minus 05)

2

(16)

where 119899 = 12 and 119909119894isin [0 1]

The Pareto optimal solutions of this test function must lieinside the first octant of the unit sphere in a three-objectiveplot It is more difficult than DTLZ1

42 Parameter Values The parameter values of the proposedalgorithm UKMOPSO are adopted as follows

(i) Parameters for PSO the linearly descending inertiaweight [33 34] is within the interval [01 1] and theacceleration coefficients 119888

1and 1198882are both taken as 2

(ii) Parameter for PAM the minimal and maximal num-bers of clusters are 119870min = 2 and119870max = 10

(iii) Population size the population size119873pop is 200(iv) Parameters for the uniform design the number of

subintervals 119878 is 64 the number of the sample pointsor the population size of each subinterval 119876

0is 31 set

1198630= 31 and 119876

1= 5

(v) Stopping condition the algorithm terminates if thenumber of iterations is larger than the given maximalgenerations of 20

All the parameter values in UMOGA [23] are set equalto the original values in [23] the different and additional

parameter values between UMOGA and UKMOPSO are asfollows the number of subintervals 119878 is 16 119865 = 119873 (thenumber of variables)119863

1= 7 119901

119898= 002

NSGA-II in [25] adopted a population size of 100 acrossover probability of 08 a mutation probability of 1119899(where 119899 is the number of variables) and maximal genera-tions of 250 In order to make the comparisons fair we haveused population size of 100 and maximal generations of 40 inNSGA-II so that the total number of function evaluations inNSGA-II UKMOPSO and UMOGA is the same

43 Performance Metric Based on the assumption that thetrue Pareto front of a test problem is known many kinds ofperformance metrics have been proposed and used by manyresearchers such as [2 9 24 25 29 35ndash40] Three of themare adopted in the paper to compare the performance of theproposed algorithm with them The first metric is 119862 metric[29 38] It is taken as the quantitative metric of the solutionquality and often used to show that the outcomes of one algo-rithm dominating the outcomes of another algorithm It isdefined as

119862 (119860 119861) =|119887 isin 119861 exist119886 isin 119860 119886 ≺ 119887|

|119861| (17)

where 119886 ≺ 119887 represents 119887 being dominated by 119886 119860 representsthe Pareto front obtained by Algorithm 119860 and 119861 representsthe Pareto front obtained by Algorithm 119861 in a typical runFurthermore |119861| represents the number of elements of 119861

The metric value 119862(119860 119861) = 1 means that all points in 119861are dominated by or equal to points in 119860 In contrast 119862(119860119861) = 0means that none of the points in 119861 are covered by theones in 119860

The second metric is the IGD metric (namely InvertedGenerational Distance) [39ndash41] It is the mean value of thedistances from the set of solutions uniformly distributed inthe true Pareto front to the set of solutions obtained by analgorithm in objective space Let 119875lowast be a set of uniformly dis-tributed points along the true PF (Pareto Front) Let 119860 be anapproximate set to the PF the average distance from119875lowast to119860 isdefined as

IGD (119860 119875lowast) =sumVisin119875lowast 119889 (V 119860)

|119875lowast| (18)

where 119889(V 119860) is the minimum Euclidean distance between Vand the points in 119860 If |119875lowast| is large enough to represent thePF very well IGD(119860 119875lowast) could measure both the diversityand convergence of 119860 in a sense To have a low value ofIGD(119860 119875lowast) the set119860must be very close to the PF and cannotmiss any part of the whole PF

The smaller the IGDvalue for the set of obtained solutionsis the better the performance of the algorithm will be

The third measure is the measure of maximum spread(MS) [2 24 35] which is proposed in [42 43] It canmeasure how well the true Pareto front (PFtrue) is coveredby the discovered Pareto front (PFknown) through hyperboxes

Mathematical Problems in Engineering 9

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1100

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

152

True Pareto frontUMOGA

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

140

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

f2(x)

f1(x)

f2(x)

Figure 1 Pareto fronts obtained by different algorithms for test problem FON

formed by the extreme function values observed in the PFtrueand PFknown It is defined as

MS = radic 1

119872

119872

sum119894=1

120575119894

120575119894= (

min (119891max119894 119865max119894) minusmax (119891min

119894 119865min119894)

119865max119894

minus 119865min119894

)

2

(19)

where 119872 is the number of objectives 119891max119894

and 119891min119894

arethe maximum and minimum values of the 119894th objective inPFknown respectively and 119865

max119894

and 119865min119894

are the maximumand minimum values of the 119894th objective in PFtrue respec-tively

Note that if 119891min119894

ge 119865max119894

then 120575119894= 0 Algorithms with

largerMS values are desirable andMS = 1means that the truePareto front is totally covered by the obtained Pareto front

44 Results For each test problem we perform the proposedalgorithm (called UKMOPSO) for 30 independent runs andcompare its performance with UMOGA [23] and NSGA-II[25] The values of several metrics the 119862metric IGDmetricand MS metric are shown in Tables 2 3 4 and 5 The Paretofronts obtained by several algorithms implemented on severaltest functions are illustrated in Figures 1ndash6

For brevity 119862(UKMOPSOUMOGA) 119862(UMOGAUKMOPSO) 119862(UKMOPSONSGA-II) and 119862(NSGA-IIUKMOPSO) are respectively marked as 119862

12 11986221 11986213 and

11986231

10 Mathematical Problems in Engineering

0100

UKMOPSOTrue Pareto front

14

UMOGATrue Pareto front

40

NSGA-IITrue Pareto front

f1(x)

f2(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14

f1(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14 minus13 minus12minus12

minus10

minus8

minus6

minus4

minus2

0

f2(x)

f1(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14minus12

minus10

minus8

minus6

minus4

minus2

0

f2(x)

minus12

minus10

minus8

minus6

minus4

minus2

Figure 2 Pareto fronts obtained by different algorithms for test problem KUR

Table 2 Comparison of C metric between UKMOPSO andUMOGA

C (UKMOPSO UMOGA) C (UMOGA UKMOPSO)FON 0769 001KUR 0929 001ZDT1 012 013ZDT2 0063 003ZDT3 0139 0098ZDT6 0063 001DTLZ1 1 0DTLZ2 1 0

Table 3 Comparison of Cmetric between UKMOPSO and NSGA-II

C (UKMOPSO NSGA-II) C (NSGA-II UKMOPSO)FON 0025 001KUR 0075 011ZDT1 1 0ZDT2 1 0ZDT3 1 0ZDT6 1 0DTLZ1 0 013DTLZ2 0025 006

Mathematical Problems in Engineering 11

0

02

04

06

08

1

12

14100 76

0

05

1

15

2

25

325

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

f1(x)

f2(x)

f2(x)

Figure 3 Pareto fronts obtained by different algorithms for test problem ZDT1

Table 4 Comparison of IGD metric

UKMOPSO UMOGA NSGA-IIFON 00072 00432 03707KUR 00668 11576 12323ZDT1 00067 00265 13933ZDT2 00067 00727 21409ZDT3 01964 02151 02816ZDT6 00033 00139 59223DTLZ1 74827 22965 54182DTLZ2 01123 04319 02724

Table 5 Comparison of maximum spread (MS) metric

UKMOPSO UMOGA NSGA-IIFON 09769 09754 02368KUR 09789 09580 03414ZDT1 09429 09984 05495ZDT2 09999 09986 00019ZDT3 09276 09275 07963ZDT6 1 09974 03738DTLZ1 1 1 06588DTLZ2 1 1 07125

12 Mathematical Problems in Engineering

0

02

04

06

08

1

12

14100 9

0

05

1

15

2

25

38

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x) f1(x)

f2(x)

f2(x)

f2(x)

f1(x)

Figure 4 Pareto fronts obtained by different algorithms for test problem ZDT2

As shown in Table 2 for the test functions Fon and KUR11986212= 0769 and 119862

12= 0929 mean that 768 and 929 of

the solutions obtained by UMOGA are dominated by thoseobtained by UKMOPSO Similarly 119862

21= 001means that 1

of the solutions obtained by UKMOPSO are dominated bythose obtained by UMOGA For DTLZ1 and DTLZ2119862

12= 1

and 11986221= 0 mean that all solutions obtained by UMOGA

are dominated by those obtained by UKMOPSO but noneof solutions from UKMOPSO is dominated by those fromUMOGA It means that the solution quality via UKMOPSOis much better than that via UMOGA for the above testfunctions For ZDT1 ZDT2 ZDT3 and ZDT6 the valuesof the 119862 metrics do not have too many differences betweenUKMOPSO and UMOGA It means the solution qualities viaboth are almost identical

FromTable 3 we can see that for ZDT1 ZDT2 ZDT3 andZDT6 119862

13= 1 and 119862

31= 0mean that all solutions obtained

byNSGA-II are dominated by those obtained byUKMOPSObut none of solutions from UKMOPSO is dominated bythose from NSGA-II It means that the solution quality viaUKMOPSO is much better than that via NSGA-II for theabove test functions For the rest of the test functions thesolution qualities via both are almost identical

In Table 4 for all test problems the IGD values ofUKMOPSO are the smallest among UKMOPSO UMOGAand NSGA-II This means the PF found by UKMOPSO isthe nearest to the true PF compared with the PF obtainedby the other two algorithms namely the performance ofUKMOPSO is the best in the three algorithms The IGDvalues of UMOGA are all larger than those of NSGA-II for

Mathematical Problems in Engineering 13

0

02

04

06

08

1 92 86

0

05

1

15

2

25

3

35

4

45 40

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

f2(x)

minus08

minus06

minus04

minus02

0

02

04

06

08

1

f2(x)

f2(x)

minus08

minus05

minus06

minus04

minus02

UKMOPSO UMOGA

NSGA-II

Figure 5 Pareto fronts obtained by different algorithms for test problem ZDT3

all test functions It means that the PF found by UMOGA iscloser to the true PF than the PF obtained by NSGA-II

From Table 5 we can see that all the MS values ofUKMOPSO are almost close to 1 for each test function Itmeans that almost all the true PF is totally covered by the PFobtained by UKMOPSO UMOGA is similar to UKMOPSOThe MS values of NSGA-II are much lesser than UKMOPSOand UMOGA especially MS = 00019 for ZDT2

Figures 1ndash6 all demonstrate that the PF found byUKMOPSO is the nearest to the true PF and scatters mostuniformly in those obtained by three algorithms Most of thepoints in the PF found by UKMOPSO overlap the pointsin the true PF It means the solution quality obtained byUKMOPSO is very high

45 Influence of the Uniform Crossover and PAM In orderto find out the influence of the uniform crossover on the

proposed algorithm we compare the distribution and num-ber of Pareto optimal solutions before and after performingthe uniform crossover One of the simulating results on thetest function ZDT1 is shown in Figure 7

From Figure 7 it can be seen that before and afterperforming the uniform crossover the number of data pointsin the 1st cluster and the 2nd cluster varies from 7 to 15and from 19 to 26 respectively namely many new Paretooptimal solutions having not been found before performingthe uniform crossover are generated and the differences ofthe data points between two clusters have been decreasedThis will directly influence the uniformity of the PF andacquire more uniform Pareto solutions

PAM is used to determine which Pareto solutions are tobe removed fromor be inserted into the external archiveThisis to maintain the diversity of Pareto optimal solutions The

14 Mathematical Problems in Engineering

100 16

0

05

1

15

2

25

3

35

4

45

540

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUKMOPSO

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

f2(x)

f2(x)

f1(x)

Figure 6 Pareto fronts obtained by different algorithms for test problem ZDT6

diversity can be computed by the ldquodistance-to-average-pointrdquomeasure [44] defined as

diversity (119878) = 1

|119878|

|119878|

sum119894=1

radic119873

sum119895=1

(119901119894119895minus 119901119895)2

(20)

where 119878 is the population |119878| is the swarm size 119873 is thedimensionality of the problem 119901

119894119895is the 1198951015840th value of the 1198941015840th

particle and 119901119895is the 1198951015840th value of the average point 119901

If the number of Pareto optimality is less than or equalto the size of the external archive PAM has no influenceon the proposed algorithm Otherwise it is used to selectthe different type of Pareto optimality from several clustersTherefore The diversity of Pareto optimality will certainlyincreaseWemonitor the diversity of Pareto optimality before

and after performing PAMonZDT1 at a certain timeThe val-ues are respectively 8762 and 11752 This fully demonstratesthat PAM can improve the diversity of Pareto optimality

5 Conclusion and Future Work

In this paper a multiobjective particle swarm optimizationbased on PAM and uniform design is presented It firstlyimplements PAM to partition the Pareto front into119870 clustersin the objective space and then it implements the uniformcrossover operator on the minimal cluster so as to generatemore Pareto optimality in the minimal cluster When thesize of the Pareto solution is larger than that of the externalarchive PAM is used to determine which Pareto solutions areto be removed from or be inserted into the external archive

Mathematical Problems in Engineering 15

0 01 02 03 04 05 06 07 08 09

19

10

01

02

03

04

05

06

077

08

09

1

The 1st clusterThe 2nd cluster

26

(a)

41

0 01 02 03 04 05 06

26

07 08 09 10

01

02

03

04

051506

07

08

09

1

The 1st clusterThe 2nd cluster

(b)

Figure 7 Pareto optimal solutions before (a) and after (b) performing the uniform crossover

Finally it keeps all the points in the lesser clusters anddiscards some points in the larger clusters This can ensurethat each of the clusters will contain approximately the samedata points Therefore the diversity of the Pareto solutionswill increase and they can scatter uniformly over the Paretofront The results of the experimental simulation performedon several well-known test problems indicate that theproposed algorithm obviously outperforms the other twoalgorithms

This algorithm is going on for further enhancement andimprovement One attempt is to use a more efficient orapproximate clustering algorithm to speed up the executiontime of this algorithm Another attempt is to extend its appli-cation scopes

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This research was supported by Guangxi Natural ScienceFoundation (no 2013GXNSFAA019337) Guangxi Universi-ties Key Project of Science and Technology Research (noKY2015ZD099) Key project of Guangxi Education Depart-ment (no 2013ZD055) Scientific Research Staring Founda-tion for the PHD Scholars of Yulin Normal University (noG2014005) Special Project of Yulin Normal University (no2012YJZX04) and Key Project of Yulin Normal University(no 2014YJZD05)The authors are grateful to the anonymousreferee for a careful checking of the details and for helpfulcomments that improved this paper

References

[1] J D Schaffer ldquoMultiple objective optimization with vector eval-uated genetic algorithmsrdquo in Proceedings of the 1st InternationalConference on Genetic Algorithms pp 93ndash100 1985

[2] L Tang and X Wang ldquoA hybrid multiobjective evolutionaryalgorithm for multiobjective optimization problemsrdquo IEEETransactions on Evolutionary Computation vol 17 no 1 pp 20ndash45 2013

[3] K Deb S Agrawal A Pratap and T Meyarivan ldquoA fast elitistnon-dominated sorting genetic algorithm for multi-objectiveoptimizationNSGA-IIrdquo inParallel ProblemSolving fromNaturePPSN VI vol 1917 of Lecture Notes in Computer Science pp849ndash858 Springer Berlin Germany 2000

[4] J Zhang Y Wang and J Feng ldquoAttribute index and uniformdesign based multiobjective association rule mining with evo-lutionary algorithmrdquo The Scientific World Journal vol 2013Article ID 259347 16 pages 2013

[5] S Jiang and Z Cai ldquoFaster convergence and higher hypervol-ume for multi-objective evolutionary algorithms by orthogonaland uniform designrdquo in Advances in Computation and Intelli-gence vol 6382 of Lecture Notes in Computer Science pp 312ndash328 Springer Berlin Germany 2010

[6] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995

[7] R Eberhart and J Kennedy ldquoA new optimizer using particleswarm theoryrdquo in Proceedings of the 6th International Sympo-sium on Micro Machine and Human Science pp 39ndash43 IEEENagoya Japan October 1995

[8] L Yang ldquoA fast and elitist multi-objective particle swarm algo-rithm NSPSOrdquo in Proceedings of the IEEE International Con-ference on Granular Computing (GRC rsquo08) pp 470ndash475 August2008

[9] C A C Coello G T Pulido and M S Lechuga ldquoHandlingmultiple objectives with particle swarm optimizationrdquo IEEE

16 Mathematical Problems in Engineering

Transactions on Evolutionary Computation vol 8 no 3 pp256ndash279 2004

[10] M Clerc and J Kennedy ldquoThe particle swarm-explosion sta-bility and convergence in a multidimensional complex spacerdquoIEEE Transactions on Evolutionary Computation vol 6 no 1pp 58ndash73 2002

[11] H Xiaohui and R Eberhart ldquoMultiobjective optimization usingdynamic neighborhood particle swarm optimizationrdquo in Pro-ceedings of the Congress on Evolutionary Computation (CEC rsquo02)vol 2 pp 1677ndash1681 May 2002

[12] S Mostaghim and J Teich ldquoStrategies for finding goodlocal guides in multi-objective particle swarm optimization(MOPSO)rdquo in Proceedings of the IEEE Swarm IntelligenceSymposium (SIS rsquo03) pp 26ndash33 Indianapolis Ind USA April2003

[13] T Bartz-Beielstein P Limbourg J Mehnen K Schmitt K EParsopoulos and M N Vrahatis ldquoParticle swarm optimizersfor Pareto optimizationwith enhanced archiving techniquesrdquo inProceedings of the Congress on Evolutionary Computation (CECrsquo03) vol 3 pp 1780ndash1787 December 2003

[14] K E Parsopoulos D K Tasoulis and M N Vrahatis ldquoMulti-objective optimization using parallel vector evaluated particleswarm optimizationrdquo in Proceedings of the IASTED Interna-tional Conference on Artificial Intelligence and Applications (AIArsquo04) vol 2 pp 823ndash828 February 2004

[15] S-J Tsai T-Y Sun C-C Liu S-T Hsieh W-C Wu and S-YChiu ldquoAn improved multi-objective particle swarm optimizerformulti-objective problemsrdquo Expert Systems with Applicationsvol 37 no 8 pp 5872ndash5886 2010

[16] A A Mousa M A El-Shorbagy and W F Abd-El-WahedldquoLocal search based hybrid particle swarm optimization algo-rithm for multiobjective optimizationrdquo Swarm and Evolution-ary Computation vol 3 pp 1ndash14 2012

[17] J Wang W Liu W Zhang and B Yang ldquoMulti-objective parti-cle swarm optimization based on self-update and grid strategyrdquoin Proceedings of the 2012 International Conference on Informa-tion Technology and Software Engineering vol 211 of LectureNotes in Electrical Engineering pp 869ndash876 Springer BerlinGermany 2013

[18] K Khalili-Damghani A-R Abtahi and M Tavana ldquoA newmulti-objective particle swarmoptimizationmethod for solvingreliability redundancy allocation problemsrdquo Reliability Engi-neering amp System Safety vol 111 pp 58ndash75 2013

[19] J Zhang Y Wang and J Feng ldquoParallel particle swarm opti-mization based on PAMrdquo in Proceedings of the 2nd InternationalConference on Information Engineering and Computer Science(ICIECS rsquo10) Wuhan China December 2010

[20] J Han and M Kamber Data Mining Concepts and TechniquesMorgan Kaufmann 2006

[21] L F Ibrahim ldquoUsing of clustering algorithm CWSP-PAM forrural network planningrdquo in Proceedings of the 3rd InternationalConference on Information Technology and Applications (ICITArsquo05) vol 1 pp 280ndash283 IEEE July 2005

[22] P Chopra J Kang and J Lee ldquoUsing gene pair combinationsto improve the accuracy of the PAM classifierrdquo in Proceedingsof the IEEE International Conference on Bioinformatics andBiomedicine (BIBM rsquo09) pp 174ndash177 November 2009

[23] Y-W Leung and Y Wang ldquoMultiobjective programming usinguniform design and genetic algorithmrdquo IEEE Transactions onSystems Man and Cybernetics Part C Applications and Reviewsvol 30 no 3 pp 293ndash304 2000

[24] D Liu K C Tan C K Goh and W K Ho ldquoA multiobjectivememetic algorithm based on particle swarm optimizationrdquoIEEE Transactions on Systems Man and Cybernetics Part BCybernetics vol 37 no 1 pp 42ndash50 2007

[25] K Deb A Pratap S Agarwal and T Meyarivan ldquoA fast andelitist multiobjective genetic algorithm NSGA-IIrdquo IEEE Trans-actions on Evolutionary Computation vol 6 no 2 pp 182ndash1972002

[26] C M Fonseca and P J Fleming ldquoMultiobjective optimizationandmultiple constraint handling with evolutionary algorithmsII Application examplerdquo IEEE Transactions on Systems Manand Cybernetics Part A Systems and Humans vol 28 no 1 pp38ndash47 1998

[27] F Kursawe ldquoA variant of evolution strategies for vector opti-mizationrdquo in Parallel Problem Solving from Nature vol 496 ofLecture Notes in Computer Science pp 193ndash197 Springer BerlinGermany 1991

[28] E Zitzler K Deb and LThiele ldquoComparison of multiobjectiveevolutionary algorithms empirical resultsrdquo Evolutionary Com-putation vol 8 no 2 pp 173ndash195 2000

[29] H-L Liu Y Wang and Y-M Cheung ldquoA Multi-objective evo-lutionary algorithm usingmin-max strategy and sphere coordi-nate transformationrdquo Intelligent Automation amp Soft Computingvol 15 no 3 pp 361ndash384 2009

[30] N Chakraborti B S Kumar V S Babu S Moitra andA Mukhopadhyay ldquoA new multi-objective genetic algorithmapplied to hot-rolling processrdquoAppliedMathematicalModellingvol 32 no 9 pp 1781ndash1789 2008

[31] K Deb ldquoMulti-objective genetic algorithms problem difficul-ties and construction of test problemsrdquo Evolutionary Computa-tion vol 7 no 3 pp 205ndash230 1999

[32] K Deb L Thiele M Laumanns and E Zitzler ldquoScalable testproblems for evolutionarymultiobjective optimizationrdquo in Evo-lutionary Multiobjective Optimization Advanced Informationand Knowledge Processing pp 105ndash145 Springer London UK2005

[33] Y Shi and R Eberhart ldquoModified particle swarm optimizerrdquoin Proceedings of the IEEE World Congress on ComputationalIntelligence pp 69ndash73 May 1998

[34] C S Feng S Cong and X Y Feng ldquoA new adaptive inertiaweight strategy in particle swarm optimizationrdquo in Procedingsof the IEEE Congress on Evolutionary Computation (CEC rsquo07)pp 4186ndash4190 IEEE Singapore September 2007

[35] C K Goh K C Tan D S Liu and S C Chiam ldquoA competitiveand cooperative co-evolutionary approach to multi-objectiveparticle swarm optimization algorithm designrdquo European Jour-nal of Operational Research vol 202 no 1 pp 42ndash54 2010

[36] P Cheng J-S Pan L Li Y Tang and C Huang ldquoA surveyof performance assessment for multiobjective optimizersrdquo inProceedings of the 4th International Conference on Genetic andEvolutionary Computing (ICGEC rsquo10) pp 341ndash345 December2010

[37] E Zitzler L Thiele M Laumanns C M Fonseca and V GDa Fonseca ldquoPerformance assessment of multiobjective opti-mizers an analysis and reviewrdquo IEEE Transactions on Evolution-ary Computation vol 7 no 2 pp 117ndash132 2003

[38] E Zitzler and L Thiele ldquoMultiobjective evolutionary algo-rithms a comparative case study and the strength Paretoapproachrdquo IEEETransactions onEvolutionaryComputation vol3 no 4 pp 257ndash271 1999

[39] H Li and Q Zhang ldquoMultiobjective optimization problemswith complicated pareto sets MOEAD and NSGA-IIrdquo IEEE

Mathematical Problems in Engineering 17

Transactions on Evolutionary Computation vol 13 no 2 pp284ndash302 2009

[40] Q Zhang A Zhou S Zhao P N Suganthan W Liu andS Tiwari ldquoMultiobjective optimization test instances for theCEC 2009 special session and competitionrdquo Working ReportDepartment of Computing and Electronic Systems Universityof Essex 2008

[41] Q Zhang A Zhou and Y Jin ldquoRM-MEDA a regularity model-based multiobjective estimation of distribution algorithmrdquoIEEE Transactions on Evolutionary Computation vol 12 no 1pp 41ndash63 2008

[42] C-S Tsou H-H Fang H-H Chang and C-H Kao ldquoAnimproved particle swarmPareto optimizerwith local search andclusteringrdquo in Simulated Evolution and Learning vol 4247 ofLecture Notes in Computer Science pp 400ndash407 2006

[43] U Wickramasinghe and X Li ldquoChoosing leaders for multiob-jective PSO algorithms using differential evolutionrdquo in Proceed-ing of the 7th International Conference Simulated Evolution andLearning vol 5361 of Lecture Notes in Computer Science pp249ndash258 Springer Berlin Germany 2008

[44] T Krink J S Vesterstrom and J Riget ldquoParticle swarm opti-misation with spatial particle extensionrdquo in Proceedings of theCongress on Evolutionary Computation (CEC rsquo02) pp 1474ndash1479 May 2002

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 7: Research Article Multiobjective Particle Swarm

Mathematical Problems in Engineering 7

Step 6 Update the position pop and velocity 119881 for each ofthe particles respectively using formulas (2) and (3) whereformula (2) is modified as follows

V119894119895 (119896 + 1) = 119908 sdot V119894119895 (119896)

+ 1198881sdot 1199031sdot (119875119887119890119904119905

119894119895 (119896) minus 119909119894119895 (119896))

+ 1198882times 1199032times (119866119887119890119904119905

119894119895 (119896) minus 119909119894119895 (119896))

(8)

Step 7 For the 119895th dimensional variable of the 119894th particle ifit goes beyond its lower boundary or upper boundary then ittakes the 119895th dimensional value of its corresponding bound-ary and the 119895th dimensional value of its velocity takes theopposite value

Step 8 Calculate each of119872 objectives for each of the particlesand update them into fit119881

Step 9 Update 119875119887119890119904119905 of each particle as followsFor the 119894th particle if fit119881[119894] dominates 119891119875119887119890119904119905[119894] then

let 119891119875119887119890119904119905[119894] = fit119881[119894] and 119875119887119890119904119905[119894] = pop[119894] otherwise119875119887119890119904119905[119894] and119891119875119887119890119904119905[119894] are kept unchanged If neither of themis dominated by the other then randomly select one of themas 119891119875119887119890119904119905[119894] and update its corresponding 119875119887119890119904119905[119894]

Step 10 Update the external archive pareto storing Paretooptimality and its size119873119875 according to Algorithm 3

Step 11 Implement Algorithm D described in Section 34 toadjust the Pareto optimal solutions such that the number ofPareto optimal solutions is less than or equal to the size of theexternal archive119873119875 lim

Step 12 If the stop criterion is satisfied terminate algorithmotherwise turn to Step 5 and continue

4 Numerical Results

In order to evaluate the performances of the proposedalgorithm we compare it with two outstanding algorithmsUMOGA [23] and NSGA-II [25]

41 Test Problems Several well-known multiobjective testfunctions are used to test the performance of the proposedalgorithm

Biobjective problem FON [25 26] KUR [25 27]ZDT1 ZDT2 ZDT3 and ZDT6 [25 28ndash31]Three-objective problems DTLZ1 and DTLZ [2 32]

The definitions of them are as followsFON is defined as

1198911 (119909) = 1 minus exp(minus

3

sum119894=1

(119909119894minus1

radic3)2

)

1198912 (119909) = 1 minus exp(minus

3

sum119894=1

(119909119894+1

radic3)2

)

(9)

where 119909119894isin [minus4 4]

This test function has a nonconvex Pareto optimal frontKUR is defined as

1198911 (119909) =

119899minus1

sum119894=1

(minus10 exp (minus02 sdot radic1199092119894+ 1199092119894+1))

1198912 (119909) =

119899

sum119894=1

(1003816100381610038161003816119909119894100381610038161003816100381608+ 5 sdot sin1199093

119894)

(10)

where 119899 = 3 and 119909119894isin [minus5 5]

This test function has a nonconvex Pareto optimal frontin which there are three discontinuous regions

ZDT1 is defined as1198911(1199091) = 1199091

119892 (1199092 119909

119899) = 1 +

9sum119899119894=2119909119894

(119899 minus 1)

ℎ (1198911 119892) = 1 minus radic

1198911

119892

(11)

where 119899 = 30 and 119909119894isin [0 1]

This test function has a convex Pareto optimal frontZDT2 is defined as

1198911(1199091) = 1199091

119892 (1199092 119909

119899) = 1 +

9sum119899119894=2119909119894

(119899 minus 1)

ℎ (1198911 119892) = 1 minus (

1198911

119892)2

(12)

where 119899 = 30 and 119909119894isin [0 1]

This test function has a nonconvex Pareto optimal frontZDT3 is defined as

1198911(1199091) = 1199091

119892 (1199092 119909

119899) = 1 +

9sum119899119894=2119909119894

(119899 minus 1)

ℎ (1198911 119892) = 1 minus radic

1198911

119892minus (

1198911

119892) sin (10120587119891

1)

(13)

where 119899 = 30 and 119909119894isin [0 1]

This test function represents the discreteness feature ItsPareto optimal front consists of several noncontiguous con-vex parts The introduction of the sine function in ℎ causesdiscontinuity in the Pareto optimal front

ZDT6 is defined as

1198911(1199091) = 1 minus exp (minus4119909

1) sin6 (6120587119909

1)

119892 (1199092 119909

119899) = 1 + 9(

sum119899119894=2119909119894

(119899 minus 1))025

ℎ (1198911 119892) = 1 minus (

1198911

119892)2

(14)

where 119899 = 10 and 119909119894isin [0 1]

8 Mathematical Problems in Engineering

This test function has a nonconvex Pareto optimal frontIt includes two difficulties caused by the nonuniformityof the search space Firstly Pareto optimal solutions arenonuniformly distributed along the global Pareto optimalfront (the front is biased for solutions in which 119891

1(1199091) is near

one) Secondly the density of the solutions is the lowest nearthe Pareto optimal front and the highest away from the front

DTLZ1 is defined as

1198911 (119909) = 0511990911199092 (1 + 119892 (119909))

1198912 (119909) = 051199091 (1 minus 1199092) (1 + 119892 (119909))

1198913 (119909) = 05 (1 minus 1199091) (1 + 119892 (119909))

119892 (119909) = 100

sdot [5 +119899

sum119894=3

((119909119894minus 05)

2minus cos (20120587 (119909

119894minus 05)))]

(15)

where 119899 = 7 and 119909119894isin [0 1]

This test function has difficulty in the convergence to thePareto optimal hyperplane and contains (115minus1) local Paretooptimal fronts

DTLZ2 is defined as

1198911 (119909) = (1 + 119892 (119909)) cos (051205871199091) cos (051205871199092)

1198912 (119909) = (1 + 119892 (119909)) cos (051205871199091) sin (051205871199092)

1198913 (119909) = (1 + 119892 (119909)) sin (051205871199091)

119892 (119909) =119899

sum119894=3

(119909119894minus 05)

2

(16)

where 119899 = 12 and 119909119894isin [0 1]

The Pareto optimal solutions of this test function must lieinside the first octant of the unit sphere in a three-objectiveplot It is more difficult than DTLZ1

42 Parameter Values The parameter values of the proposedalgorithm UKMOPSO are adopted as follows

(i) Parameters for PSO the linearly descending inertiaweight [33 34] is within the interval [01 1] and theacceleration coefficients 119888

1and 1198882are both taken as 2

(ii) Parameter for PAM the minimal and maximal num-bers of clusters are 119870min = 2 and119870max = 10

(iii) Population size the population size119873pop is 200(iv) Parameters for the uniform design the number of

subintervals 119878 is 64 the number of the sample pointsor the population size of each subinterval 119876

0is 31 set

1198630= 31 and 119876

1= 5

(v) Stopping condition the algorithm terminates if thenumber of iterations is larger than the given maximalgenerations of 20

All the parameter values in UMOGA [23] are set equalto the original values in [23] the different and additional

parameter values between UMOGA and UKMOPSO are asfollows the number of subintervals 119878 is 16 119865 = 119873 (thenumber of variables)119863

1= 7 119901

119898= 002

NSGA-II in [25] adopted a population size of 100 acrossover probability of 08 a mutation probability of 1119899(where 119899 is the number of variables) and maximal genera-tions of 250 In order to make the comparisons fair we haveused population size of 100 and maximal generations of 40 inNSGA-II so that the total number of function evaluations inNSGA-II UKMOPSO and UMOGA is the same

43 Performance Metric Based on the assumption that thetrue Pareto front of a test problem is known many kinds ofperformance metrics have been proposed and used by manyresearchers such as [2 9 24 25 29 35ndash40] Three of themare adopted in the paper to compare the performance of theproposed algorithm with them The first metric is 119862 metric[29 38] It is taken as the quantitative metric of the solutionquality and often used to show that the outcomes of one algo-rithm dominating the outcomes of another algorithm It isdefined as

119862 (119860 119861) =|119887 isin 119861 exist119886 isin 119860 119886 ≺ 119887|

|119861| (17)

where 119886 ≺ 119887 represents 119887 being dominated by 119886 119860 representsthe Pareto front obtained by Algorithm 119860 and 119861 representsthe Pareto front obtained by Algorithm 119861 in a typical runFurthermore |119861| represents the number of elements of 119861

The metric value 119862(119860 119861) = 1 means that all points in 119861are dominated by or equal to points in 119860 In contrast 119862(119860119861) = 0means that none of the points in 119861 are covered by theones in 119860

The second metric is the IGD metric (namely InvertedGenerational Distance) [39ndash41] It is the mean value of thedistances from the set of solutions uniformly distributed inthe true Pareto front to the set of solutions obtained by analgorithm in objective space Let 119875lowast be a set of uniformly dis-tributed points along the true PF (Pareto Front) Let 119860 be anapproximate set to the PF the average distance from119875lowast to119860 isdefined as

IGD (119860 119875lowast) =sumVisin119875lowast 119889 (V 119860)

|119875lowast| (18)

where 119889(V 119860) is the minimum Euclidean distance between Vand the points in 119860 If |119875lowast| is large enough to represent thePF very well IGD(119860 119875lowast) could measure both the diversityand convergence of 119860 in a sense To have a low value ofIGD(119860 119875lowast) the set119860must be very close to the PF and cannotmiss any part of the whole PF

The smaller the IGDvalue for the set of obtained solutionsis the better the performance of the algorithm will be

The third measure is the measure of maximum spread(MS) [2 24 35] which is proposed in [42 43] It canmeasure how well the true Pareto front (PFtrue) is coveredby the discovered Pareto front (PFknown) through hyperboxes

Mathematical Problems in Engineering 9

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1100

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

152

True Pareto frontUMOGA

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

140

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

f2(x)

f1(x)

f2(x)

Figure 1 Pareto fronts obtained by different algorithms for test problem FON

formed by the extreme function values observed in the PFtrueand PFknown It is defined as

MS = radic 1

119872

119872

sum119894=1

120575119894

120575119894= (

min (119891max119894 119865max119894) minusmax (119891min

119894 119865min119894)

119865max119894

minus 119865min119894

)

2

(19)

where 119872 is the number of objectives 119891max119894

and 119891min119894

arethe maximum and minimum values of the 119894th objective inPFknown respectively and 119865

max119894

and 119865min119894

are the maximumand minimum values of the 119894th objective in PFtrue respec-tively

Note that if 119891min119894

ge 119865max119894

then 120575119894= 0 Algorithms with

largerMS values are desirable andMS = 1means that the truePareto front is totally covered by the obtained Pareto front

44 Results For each test problem we perform the proposedalgorithm (called UKMOPSO) for 30 independent runs andcompare its performance with UMOGA [23] and NSGA-II[25] The values of several metrics the 119862metric IGDmetricand MS metric are shown in Tables 2 3 4 and 5 The Paretofronts obtained by several algorithms implemented on severaltest functions are illustrated in Figures 1ndash6

For brevity 119862(UKMOPSOUMOGA) 119862(UMOGAUKMOPSO) 119862(UKMOPSONSGA-II) and 119862(NSGA-IIUKMOPSO) are respectively marked as 119862

12 11986221 11986213 and

11986231

10 Mathematical Problems in Engineering

0100

UKMOPSOTrue Pareto front

14

UMOGATrue Pareto front

40

NSGA-IITrue Pareto front

f1(x)

f2(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14

f1(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14 minus13 minus12minus12

minus10

minus8

minus6

minus4

minus2

0

f2(x)

f1(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14minus12

minus10

minus8

minus6

minus4

minus2

0

f2(x)

minus12

minus10

minus8

minus6

minus4

minus2

Figure 2 Pareto fronts obtained by different algorithms for test problem KUR

Table 2 Comparison of C metric between UKMOPSO andUMOGA

C (UKMOPSO UMOGA) C (UMOGA UKMOPSO)FON 0769 001KUR 0929 001ZDT1 012 013ZDT2 0063 003ZDT3 0139 0098ZDT6 0063 001DTLZ1 1 0DTLZ2 1 0

Table 3 Comparison of Cmetric between UKMOPSO and NSGA-II

C (UKMOPSO NSGA-II) C (NSGA-II UKMOPSO)FON 0025 001KUR 0075 011ZDT1 1 0ZDT2 1 0ZDT3 1 0ZDT6 1 0DTLZ1 0 013DTLZ2 0025 006

Mathematical Problems in Engineering 11

0

02

04

06

08

1

12

14100 76

0

05

1

15

2

25

325

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

f1(x)

f2(x)

f2(x)

Figure 3 Pareto fronts obtained by different algorithms for test problem ZDT1

Table 4 Comparison of IGD metric

UKMOPSO UMOGA NSGA-IIFON 00072 00432 03707KUR 00668 11576 12323ZDT1 00067 00265 13933ZDT2 00067 00727 21409ZDT3 01964 02151 02816ZDT6 00033 00139 59223DTLZ1 74827 22965 54182DTLZ2 01123 04319 02724

Table 5 Comparison of maximum spread (MS) metric

UKMOPSO UMOGA NSGA-IIFON 09769 09754 02368KUR 09789 09580 03414ZDT1 09429 09984 05495ZDT2 09999 09986 00019ZDT3 09276 09275 07963ZDT6 1 09974 03738DTLZ1 1 1 06588DTLZ2 1 1 07125

12 Mathematical Problems in Engineering

0

02

04

06

08

1

12

14100 9

0

05

1

15

2

25

38

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x) f1(x)

f2(x)

f2(x)

f2(x)

f1(x)

Figure 4 Pareto fronts obtained by different algorithms for test problem ZDT2

As shown in Table 2 for the test functions Fon and KUR11986212= 0769 and 119862

12= 0929 mean that 768 and 929 of

the solutions obtained by UMOGA are dominated by thoseobtained by UKMOPSO Similarly 119862

21= 001means that 1

of the solutions obtained by UKMOPSO are dominated bythose obtained by UMOGA For DTLZ1 and DTLZ2119862

12= 1

and 11986221= 0 mean that all solutions obtained by UMOGA

are dominated by those obtained by UKMOPSO but noneof solutions from UKMOPSO is dominated by those fromUMOGA It means that the solution quality via UKMOPSOis much better than that via UMOGA for the above testfunctions For ZDT1 ZDT2 ZDT3 and ZDT6 the valuesof the 119862 metrics do not have too many differences betweenUKMOPSO and UMOGA It means the solution qualities viaboth are almost identical

FromTable 3 we can see that for ZDT1 ZDT2 ZDT3 andZDT6 119862

13= 1 and 119862

31= 0mean that all solutions obtained

byNSGA-II are dominated by those obtained byUKMOPSObut none of solutions from UKMOPSO is dominated bythose from NSGA-II It means that the solution quality viaUKMOPSO is much better than that via NSGA-II for theabove test functions For the rest of the test functions thesolution qualities via both are almost identical

In Table 4 for all test problems the IGD values ofUKMOPSO are the smallest among UKMOPSO UMOGAand NSGA-II This means the PF found by UKMOPSO isthe nearest to the true PF compared with the PF obtainedby the other two algorithms namely the performance ofUKMOPSO is the best in the three algorithms The IGDvalues of UMOGA are all larger than those of NSGA-II for

Mathematical Problems in Engineering 13

0

02

04

06

08

1 92 86

0

05

1

15

2

25

3

35

4

45 40

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

f2(x)

minus08

minus06

minus04

minus02

0

02

04

06

08

1

f2(x)

f2(x)

minus08

minus05

minus06

minus04

minus02

UKMOPSO UMOGA

NSGA-II

Figure 5 Pareto fronts obtained by different algorithms for test problem ZDT3

all test functions It means that the PF found by UMOGA iscloser to the true PF than the PF obtained by NSGA-II

From Table 5 we can see that all the MS values ofUKMOPSO are almost close to 1 for each test function Itmeans that almost all the true PF is totally covered by the PFobtained by UKMOPSO UMOGA is similar to UKMOPSOThe MS values of NSGA-II are much lesser than UKMOPSOand UMOGA especially MS = 00019 for ZDT2

Figures 1ndash6 all demonstrate that the PF found byUKMOPSO is the nearest to the true PF and scatters mostuniformly in those obtained by three algorithms Most of thepoints in the PF found by UKMOPSO overlap the pointsin the true PF It means the solution quality obtained byUKMOPSO is very high

45 Influence of the Uniform Crossover and PAM In orderto find out the influence of the uniform crossover on the

proposed algorithm we compare the distribution and num-ber of Pareto optimal solutions before and after performingthe uniform crossover One of the simulating results on thetest function ZDT1 is shown in Figure 7

From Figure 7 it can be seen that before and afterperforming the uniform crossover the number of data pointsin the 1st cluster and the 2nd cluster varies from 7 to 15and from 19 to 26 respectively namely many new Paretooptimal solutions having not been found before performingthe uniform crossover are generated and the differences ofthe data points between two clusters have been decreasedThis will directly influence the uniformity of the PF andacquire more uniform Pareto solutions

PAM is used to determine which Pareto solutions are tobe removed fromor be inserted into the external archiveThisis to maintain the diversity of Pareto optimal solutions The

14 Mathematical Problems in Engineering

100 16

0

05

1

15

2

25

3

35

4

45

540

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUKMOPSO

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

f2(x)

f2(x)

f1(x)

Figure 6 Pareto fronts obtained by different algorithms for test problem ZDT6

diversity can be computed by the ldquodistance-to-average-pointrdquomeasure [44] defined as

diversity (119878) = 1

|119878|

|119878|

sum119894=1

radic119873

sum119895=1

(119901119894119895minus 119901119895)2

(20)

where 119878 is the population |119878| is the swarm size 119873 is thedimensionality of the problem 119901

119894119895is the 1198951015840th value of the 1198941015840th

particle and 119901119895is the 1198951015840th value of the average point 119901

If the number of Pareto optimality is less than or equalto the size of the external archive PAM has no influenceon the proposed algorithm Otherwise it is used to selectthe different type of Pareto optimality from several clustersTherefore The diversity of Pareto optimality will certainlyincreaseWemonitor the diversity of Pareto optimality before

and after performing PAMonZDT1 at a certain timeThe val-ues are respectively 8762 and 11752 This fully demonstratesthat PAM can improve the diversity of Pareto optimality

5 Conclusion and Future Work

In this paper a multiobjective particle swarm optimizationbased on PAM and uniform design is presented It firstlyimplements PAM to partition the Pareto front into119870 clustersin the objective space and then it implements the uniformcrossover operator on the minimal cluster so as to generatemore Pareto optimality in the minimal cluster When thesize of the Pareto solution is larger than that of the externalarchive PAM is used to determine which Pareto solutions areto be removed from or be inserted into the external archive

Mathematical Problems in Engineering 15

0 01 02 03 04 05 06 07 08 09

19

10

01

02

03

04

05

06

077

08

09

1

The 1st clusterThe 2nd cluster

26

(a)

41

0 01 02 03 04 05 06

26

07 08 09 10

01

02

03

04

051506

07

08

09

1

The 1st clusterThe 2nd cluster

(b)

Figure 7 Pareto optimal solutions before (a) and after (b) performing the uniform crossover

Finally it keeps all the points in the lesser clusters anddiscards some points in the larger clusters This can ensurethat each of the clusters will contain approximately the samedata points Therefore the diversity of the Pareto solutionswill increase and they can scatter uniformly over the Paretofront The results of the experimental simulation performedon several well-known test problems indicate that theproposed algorithm obviously outperforms the other twoalgorithms

This algorithm is going on for further enhancement andimprovement One attempt is to use a more efficient orapproximate clustering algorithm to speed up the executiontime of this algorithm Another attempt is to extend its appli-cation scopes

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This research was supported by Guangxi Natural ScienceFoundation (no 2013GXNSFAA019337) Guangxi Universi-ties Key Project of Science and Technology Research (noKY2015ZD099) Key project of Guangxi Education Depart-ment (no 2013ZD055) Scientific Research Staring Founda-tion for the PHD Scholars of Yulin Normal University (noG2014005) Special Project of Yulin Normal University (no2012YJZX04) and Key Project of Yulin Normal University(no 2014YJZD05)The authors are grateful to the anonymousreferee for a careful checking of the details and for helpfulcomments that improved this paper

References

[1] J D Schaffer ldquoMultiple objective optimization with vector eval-uated genetic algorithmsrdquo in Proceedings of the 1st InternationalConference on Genetic Algorithms pp 93ndash100 1985

[2] L Tang and X Wang ldquoA hybrid multiobjective evolutionaryalgorithm for multiobjective optimization problemsrdquo IEEETransactions on Evolutionary Computation vol 17 no 1 pp 20ndash45 2013

[3] K Deb S Agrawal A Pratap and T Meyarivan ldquoA fast elitistnon-dominated sorting genetic algorithm for multi-objectiveoptimizationNSGA-IIrdquo inParallel ProblemSolving fromNaturePPSN VI vol 1917 of Lecture Notes in Computer Science pp849ndash858 Springer Berlin Germany 2000

[4] J Zhang Y Wang and J Feng ldquoAttribute index and uniformdesign based multiobjective association rule mining with evo-lutionary algorithmrdquo The Scientific World Journal vol 2013Article ID 259347 16 pages 2013

[5] S Jiang and Z Cai ldquoFaster convergence and higher hypervol-ume for multi-objective evolutionary algorithms by orthogonaland uniform designrdquo in Advances in Computation and Intelli-gence vol 6382 of Lecture Notes in Computer Science pp 312ndash328 Springer Berlin Germany 2010

[6] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995

[7] R Eberhart and J Kennedy ldquoA new optimizer using particleswarm theoryrdquo in Proceedings of the 6th International Sympo-sium on Micro Machine and Human Science pp 39ndash43 IEEENagoya Japan October 1995

[8] L Yang ldquoA fast and elitist multi-objective particle swarm algo-rithm NSPSOrdquo in Proceedings of the IEEE International Con-ference on Granular Computing (GRC rsquo08) pp 470ndash475 August2008

[9] C A C Coello G T Pulido and M S Lechuga ldquoHandlingmultiple objectives with particle swarm optimizationrdquo IEEE

16 Mathematical Problems in Engineering

Transactions on Evolutionary Computation vol 8 no 3 pp256ndash279 2004

[10] M Clerc and J Kennedy ldquoThe particle swarm-explosion sta-bility and convergence in a multidimensional complex spacerdquoIEEE Transactions on Evolutionary Computation vol 6 no 1pp 58ndash73 2002

[11] H Xiaohui and R Eberhart ldquoMultiobjective optimization usingdynamic neighborhood particle swarm optimizationrdquo in Pro-ceedings of the Congress on Evolutionary Computation (CEC rsquo02)vol 2 pp 1677ndash1681 May 2002

[12] S Mostaghim and J Teich ldquoStrategies for finding goodlocal guides in multi-objective particle swarm optimization(MOPSO)rdquo in Proceedings of the IEEE Swarm IntelligenceSymposium (SIS rsquo03) pp 26ndash33 Indianapolis Ind USA April2003

[13] T Bartz-Beielstein P Limbourg J Mehnen K Schmitt K EParsopoulos and M N Vrahatis ldquoParticle swarm optimizersfor Pareto optimizationwith enhanced archiving techniquesrdquo inProceedings of the Congress on Evolutionary Computation (CECrsquo03) vol 3 pp 1780ndash1787 December 2003

[14] K E Parsopoulos D K Tasoulis and M N Vrahatis ldquoMulti-objective optimization using parallel vector evaluated particleswarm optimizationrdquo in Proceedings of the IASTED Interna-tional Conference on Artificial Intelligence and Applications (AIArsquo04) vol 2 pp 823ndash828 February 2004

[15] S-J Tsai T-Y Sun C-C Liu S-T Hsieh W-C Wu and S-YChiu ldquoAn improved multi-objective particle swarm optimizerformulti-objective problemsrdquo Expert Systems with Applicationsvol 37 no 8 pp 5872ndash5886 2010

[16] A A Mousa M A El-Shorbagy and W F Abd-El-WahedldquoLocal search based hybrid particle swarm optimization algo-rithm for multiobjective optimizationrdquo Swarm and Evolution-ary Computation vol 3 pp 1ndash14 2012

[17] J Wang W Liu W Zhang and B Yang ldquoMulti-objective parti-cle swarm optimization based on self-update and grid strategyrdquoin Proceedings of the 2012 International Conference on Informa-tion Technology and Software Engineering vol 211 of LectureNotes in Electrical Engineering pp 869ndash876 Springer BerlinGermany 2013

[18] K Khalili-Damghani A-R Abtahi and M Tavana ldquoA newmulti-objective particle swarmoptimizationmethod for solvingreliability redundancy allocation problemsrdquo Reliability Engi-neering amp System Safety vol 111 pp 58ndash75 2013

[19] J Zhang Y Wang and J Feng ldquoParallel particle swarm opti-mization based on PAMrdquo in Proceedings of the 2nd InternationalConference on Information Engineering and Computer Science(ICIECS rsquo10) Wuhan China December 2010

[20] J Han and M Kamber Data Mining Concepts and TechniquesMorgan Kaufmann 2006

[21] L F Ibrahim ldquoUsing of clustering algorithm CWSP-PAM forrural network planningrdquo in Proceedings of the 3rd InternationalConference on Information Technology and Applications (ICITArsquo05) vol 1 pp 280ndash283 IEEE July 2005

[22] P Chopra J Kang and J Lee ldquoUsing gene pair combinationsto improve the accuracy of the PAM classifierrdquo in Proceedingsof the IEEE International Conference on Bioinformatics andBiomedicine (BIBM rsquo09) pp 174ndash177 November 2009

[23] Y-W Leung and Y Wang ldquoMultiobjective programming usinguniform design and genetic algorithmrdquo IEEE Transactions onSystems Man and Cybernetics Part C Applications and Reviewsvol 30 no 3 pp 293ndash304 2000

[24] D Liu K C Tan C K Goh and W K Ho ldquoA multiobjectivememetic algorithm based on particle swarm optimizationrdquoIEEE Transactions on Systems Man and Cybernetics Part BCybernetics vol 37 no 1 pp 42ndash50 2007

[25] K Deb A Pratap S Agarwal and T Meyarivan ldquoA fast andelitist multiobjective genetic algorithm NSGA-IIrdquo IEEE Trans-actions on Evolutionary Computation vol 6 no 2 pp 182ndash1972002

[26] C M Fonseca and P J Fleming ldquoMultiobjective optimizationandmultiple constraint handling with evolutionary algorithmsII Application examplerdquo IEEE Transactions on Systems Manand Cybernetics Part A Systems and Humans vol 28 no 1 pp38ndash47 1998

[27] F Kursawe ldquoA variant of evolution strategies for vector opti-mizationrdquo in Parallel Problem Solving from Nature vol 496 ofLecture Notes in Computer Science pp 193ndash197 Springer BerlinGermany 1991

[28] E Zitzler K Deb and LThiele ldquoComparison of multiobjectiveevolutionary algorithms empirical resultsrdquo Evolutionary Com-putation vol 8 no 2 pp 173ndash195 2000

[29] H-L Liu Y Wang and Y-M Cheung ldquoA Multi-objective evo-lutionary algorithm usingmin-max strategy and sphere coordi-nate transformationrdquo Intelligent Automation amp Soft Computingvol 15 no 3 pp 361ndash384 2009

[30] N Chakraborti B S Kumar V S Babu S Moitra andA Mukhopadhyay ldquoA new multi-objective genetic algorithmapplied to hot-rolling processrdquoAppliedMathematicalModellingvol 32 no 9 pp 1781ndash1789 2008

[31] K Deb ldquoMulti-objective genetic algorithms problem difficul-ties and construction of test problemsrdquo Evolutionary Computa-tion vol 7 no 3 pp 205ndash230 1999

[32] K Deb L Thiele M Laumanns and E Zitzler ldquoScalable testproblems for evolutionarymultiobjective optimizationrdquo in Evo-lutionary Multiobjective Optimization Advanced Informationand Knowledge Processing pp 105ndash145 Springer London UK2005

[33] Y Shi and R Eberhart ldquoModified particle swarm optimizerrdquoin Proceedings of the IEEE World Congress on ComputationalIntelligence pp 69ndash73 May 1998

[34] C S Feng S Cong and X Y Feng ldquoA new adaptive inertiaweight strategy in particle swarm optimizationrdquo in Procedingsof the IEEE Congress on Evolutionary Computation (CEC rsquo07)pp 4186ndash4190 IEEE Singapore September 2007

[35] C K Goh K C Tan D S Liu and S C Chiam ldquoA competitiveand cooperative co-evolutionary approach to multi-objectiveparticle swarm optimization algorithm designrdquo European Jour-nal of Operational Research vol 202 no 1 pp 42ndash54 2010

[36] P Cheng J-S Pan L Li Y Tang and C Huang ldquoA surveyof performance assessment for multiobjective optimizersrdquo inProceedings of the 4th International Conference on Genetic andEvolutionary Computing (ICGEC rsquo10) pp 341ndash345 December2010

[37] E Zitzler L Thiele M Laumanns C M Fonseca and V GDa Fonseca ldquoPerformance assessment of multiobjective opti-mizers an analysis and reviewrdquo IEEE Transactions on Evolution-ary Computation vol 7 no 2 pp 117ndash132 2003

[38] E Zitzler and L Thiele ldquoMultiobjective evolutionary algo-rithms a comparative case study and the strength Paretoapproachrdquo IEEETransactions onEvolutionaryComputation vol3 no 4 pp 257ndash271 1999

[39] H Li and Q Zhang ldquoMultiobjective optimization problemswith complicated pareto sets MOEAD and NSGA-IIrdquo IEEE

Mathematical Problems in Engineering 17

Transactions on Evolutionary Computation vol 13 no 2 pp284ndash302 2009

[40] Q Zhang A Zhou S Zhao P N Suganthan W Liu andS Tiwari ldquoMultiobjective optimization test instances for theCEC 2009 special session and competitionrdquo Working ReportDepartment of Computing and Electronic Systems Universityof Essex 2008

[41] Q Zhang A Zhou and Y Jin ldquoRM-MEDA a regularity model-based multiobjective estimation of distribution algorithmrdquoIEEE Transactions on Evolutionary Computation vol 12 no 1pp 41ndash63 2008

[42] C-S Tsou H-H Fang H-H Chang and C-H Kao ldquoAnimproved particle swarmPareto optimizerwith local search andclusteringrdquo in Simulated Evolution and Learning vol 4247 ofLecture Notes in Computer Science pp 400ndash407 2006

[43] U Wickramasinghe and X Li ldquoChoosing leaders for multiob-jective PSO algorithms using differential evolutionrdquo in Proceed-ing of the 7th International Conference Simulated Evolution andLearning vol 5361 of Lecture Notes in Computer Science pp249ndash258 Springer Berlin Germany 2008

[44] T Krink J S Vesterstrom and J Riget ldquoParticle swarm opti-misation with spatial particle extensionrdquo in Proceedings of theCongress on Evolutionary Computation (CEC rsquo02) pp 1474ndash1479 May 2002

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 8: Research Article Multiobjective Particle Swarm

8 Mathematical Problems in Engineering

This test function has a nonconvex Pareto optimal frontIt includes two difficulties caused by the nonuniformityof the search space Firstly Pareto optimal solutions arenonuniformly distributed along the global Pareto optimalfront (the front is biased for solutions in which 119891

1(1199091) is near

one) Secondly the density of the solutions is the lowest nearthe Pareto optimal front and the highest away from the front

DTLZ1 is defined as

1198911 (119909) = 0511990911199092 (1 + 119892 (119909))

1198912 (119909) = 051199091 (1 minus 1199092) (1 + 119892 (119909))

1198913 (119909) = 05 (1 minus 1199091) (1 + 119892 (119909))

119892 (119909) = 100

sdot [5 +119899

sum119894=3

((119909119894minus 05)

2minus cos (20120587 (119909

119894minus 05)))]

(15)

where 119899 = 7 and 119909119894isin [0 1]

This test function has difficulty in the convergence to thePareto optimal hyperplane and contains (115minus1) local Paretooptimal fronts

DTLZ2 is defined as

1198911 (119909) = (1 + 119892 (119909)) cos (051205871199091) cos (051205871199092)

1198912 (119909) = (1 + 119892 (119909)) cos (051205871199091) sin (051205871199092)

1198913 (119909) = (1 + 119892 (119909)) sin (051205871199091)

119892 (119909) =119899

sum119894=3

(119909119894minus 05)

2

(16)

where 119899 = 12 and 119909119894isin [0 1]

The Pareto optimal solutions of this test function must lieinside the first octant of the unit sphere in a three-objectiveplot It is more difficult than DTLZ1

42 Parameter Values The parameter values of the proposedalgorithm UKMOPSO are adopted as follows

(i) Parameters for PSO the linearly descending inertiaweight [33 34] is within the interval [01 1] and theacceleration coefficients 119888

1and 1198882are both taken as 2

(ii) Parameter for PAM the minimal and maximal num-bers of clusters are 119870min = 2 and119870max = 10

(iii) Population size the population size119873pop is 200(iv) Parameters for the uniform design the number of

subintervals 119878 is 64 the number of the sample pointsor the population size of each subinterval 119876

0is 31 set

1198630= 31 and 119876

1= 5

(v) Stopping condition the algorithm terminates if thenumber of iterations is larger than the given maximalgenerations of 20

All the parameter values in UMOGA [23] are set equalto the original values in [23] the different and additional

parameter values between UMOGA and UKMOPSO are asfollows the number of subintervals 119878 is 16 119865 = 119873 (thenumber of variables)119863

1= 7 119901

119898= 002

NSGA-II in [25] adopted a population size of 100 acrossover probability of 08 a mutation probability of 1119899(where 119899 is the number of variables) and maximal genera-tions of 250 In order to make the comparisons fair we haveused population size of 100 and maximal generations of 40 inNSGA-II so that the total number of function evaluations inNSGA-II UKMOPSO and UMOGA is the same

43 Performance Metric Based on the assumption that thetrue Pareto front of a test problem is known many kinds ofperformance metrics have been proposed and used by manyresearchers such as [2 9 24 25 29 35ndash40] Three of themare adopted in the paper to compare the performance of theproposed algorithm with them The first metric is 119862 metric[29 38] It is taken as the quantitative metric of the solutionquality and often used to show that the outcomes of one algo-rithm dominating the outcomes of another algorithm It isdefined as

119862 (119860 119861) =|119887 isin 119861 exist119886 isin 119860 119886 ≺ 119887|

|119861| (17)

where 119886 ≺ 119887 represents 119887 being dominated by 119886 119860 representsthe Pareto front obtained by Algorithm 119860 and 119861 representsthe Pareto front obtained by Algorithm 119861 in a typical runFurthermore |119861| represents the number of elements of 119861

The metric value 119862(119860 119861) = 1 means that all points in 119861are dominated by or equal to points in 119860 In contrast 119862(119860119861) = 0means that none of the points in 119861 are covered by theones in 119860

The second metric is the IGD metric (namely InvertedGenerational Distance) [39ndash41] It is the mean value of thedistances from the set of solutions uniformly distributed inthe true Pareto front to the set of solutions obtained by analgorithm in objective space Let 119875lowast be a set of uniformly dis-tributed points along the true PF (Pareto Front) Let 119860 be anapproximate set to the PF the average distance from119875lowast to119860 isdefined as

IGD (119860 119875lowast) =sumVisin119875lowast 119889 (V 119860)

|119875lowast| (18)

where 119889(V 119860) is the minimum Euclidean distance between Vand the points in 119860 If |119875lowast| is large enough to represent thePF very well IGD(119860 119875lowast) could measure both the diversityand convergence of 119860 in a sense To have a low value ofIGD(119860 119875lowast) the set119860must be very close to the PF and cannotmiss any part of the whole PF

The smaller the IGDvalue for the set of obtained solutionsis the better the performance of the algorithm will be

The third measure is the measure of maximum spread(MS) [2 24 35] which is proposed in [42 43] It canmeasure how well the true Pareto front (PFtrue) is coveredby the discovered Pareto front (PFknown) through hyperboxes

Mathematical Problems in Engineering 9

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1100

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

152

True Pareto frontUMOGA

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

140

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

f2(x)

f1(x)

f2(x)

Figure 1 Pareto fronts obtained by different algorithms for test problem FON

formed by the extreme function values observed in the PFtrueand PFknown It is defined as

MS = radic 1

119872

119872

sum119894=1

120575119894

120575119894= (

min (119891max119894 119865max119894) minusmax (119891min

119894 119865min119894)

119865max119894

minus 119865min119894

)

2

(19)

where 119872 is the number of objectives 119891max119894

and 119891min119894

arethe maximum and minimum values of the 119894th objective inPFknown respectively and 119865

max119894

and 119865min119894

are the maximumand minimum values of the 119894th objective in PFtrue respec-tively

Note that if 119891min119894

ge 119865max119894

then 120575119894= 0 Algorithms with

largerMS values are desirable andMS = 1means that the truePareto front is totally covered by the obtained Pareto front

44 Results For each test problem we perform the proposedalgorithm (called UKMOPSO) for 30 independent runs andcompare its performance with UMOGA [23] and NSGA-II[25] The values of several metrics the 119862metric IGDmetricand MS metric are shown in Tables 2 3 4 and 5 The Paretofronts obtained by several algorithms implemented on severaltest functions are illustrated in Figures 1ndash6

For brevity 119862(UKMOPSOUMOGA) 119862(UMOGAUKMOPSO) 119862(UKMOPSONSGA-II) and 119862(NSGA-IIUKMOPSO) are respectively marked as 119862

12 11986221 11986213 and

11986231

10 Mathematical Problems in Engineering

0100

UKMOPSOTrue Pareto front

14

UMOGATrue Pareto front

40

NSGA-IITrue Pareto front

f1(x)

f2(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14

f1(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14 minus13 minus12minus12

minus10

minus8

minus6

minus4

minus2

0

f2(x)

f1(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14minus12

minus10

minus8

minus6

minus4

minus2

0

f2(x)

minus12

minus10

minus8

minus6

minus4

minus2

Figure 2 Pareto fronts obtained by different algorithms for test problem KUR

Table 2 Comparison of C metric between UKMOPSO andUMOGA

C (UKMOPSO UMOGA) C (UMOGA UKMOPSO)FON 0769 001KUR 0929 001ZDT1 012 013ZDT2 0063 003ZDT3 0139 0098ZDT6 0063 001DTLZ1 1 0DTLZ2 1 0

Table 3 Comparison of Cmetric between UKMOPSO and NSGA-II

C (UKMOPSO NSGA-II) C (NSGA-II UKMOPSO)FON 0025 001KUR 0075 011ZDT1 1 0ZDT2 1 0ZDT3 1 0ZDT6 1 0DTLZ1 0 013DTLZ2 0025 006

Mathematical Problems in Engineering 11

0

02

04

06

08

1

12

14100 76

0

05

1

15

2

25

325

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

f1(x)

f2(x)

f2(x)

Figure 3 Pareto fronts obtained by different algorithms for test problem ZDT1

Table 4 Comparison of IGD metric

UKMOPSO UMOGA NSGA-IIFON 00072 00432 03707KUR 00668 11576 12323ZDT1 00067 00265 13933ZDT2 00067 00727 21409ZDT3 01964 02151 02816ZDT6 00033 00139 59223DTLZ1 74827 22965 54182DTLZ2 01123 04319 02724

Table 5 Comparison of maximum spread (MS) metric

UKMOPSO UMOGA NSGA-IIFON 09769 09754 02368KUR 09789 09580 03414ZDT1 09429 09984 05495ZDT2 09999 09986 00019ZDT3 09276 09275 07963ZDT6 1 09974 03738DTLZ1 1 1 06588DTLZ2 1 1 07125

12 Mathematical Problems in Engineering

0

02

04

06

08

1

12

14100 9

0

05

1

15

2

25

38

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x) f1(x)

f2(x)

f2(x)

f2(x)

f1(x)

Figure 4 Pareto fronts obtained by different algorithms for test problem ZDT2

As shown in Table 2 for the test functions Fon and KUR11986212= 0769 and 119862

12= 0929 mean that 768 and 929 of

the solutions obtained by UMOGA are dominated by thoseobtained by UKMOPSO Similarly 119862

21= 001means that 1

of the solutions obtained by UKMOPSO are dominated bythose obtained by UMOGA For DTLZ1 and DTLZ2119862

12= 1

and 11986221= 0 mean that all solutions obtained by UMOGA

are dominated by those obtained by UKMOPSO but noneof solutions from UKMOPSO is dominated by those fromUMOGA It means that the solution quality via UKMOPSOis much better than that via UMOGA for the above testfunctions For ZDT1 ZDT2 ZDT3 and ZDT6 the valuesof the 119862 metrics do not have too many differences betweenUKMOPSO and UMOGA It means the solution qualities viaboth are almost identical

FromTable 3 we can see that for ZDT1 ZDT2 ZDT3 andZDT6 119862

13= 1 and 119862

31= 0mean that all solutions obtained

byNSGA-II are dominated by those obtained byUKMOPSObut none of solutions from UKMOPSO is dominated bythose from NSGA-II It means that the solution quality viaUKMOPSO is much better than that via NSGA-II for theabove test functions For the rest of the test functions thesolution qualities via both are almost identical

In Table 4 for all test problems the IGD values ofUKMOPSO are the smallest among UKMOPSO UMOGAand NSGA-II This means the PF found by UKMOPSO isthe nearest to the true PF compared with the PF obtainedby the other two algorithms namely the performance ofUKMOPSO is the best in the three algorithms The IGDvalues of UMOGA are all larger than those of NSGA-II for

Mathematical Problems in Engineering 13

0

02

04

06

08

1 92 86

0

05

1

15

2

25

3

35

4

45 40

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

f2(x)

minus08

minus06

minus04

minus02

0

02

04

06

08

1

f2(x)

f2(x)

minus08

minus05

minus06

minus04

minus02

UKMOPSO UMOGA

NSGA-II

Figure 5 Pareto fronts obtained by different algorithms for test problem ZDT3

all test functions It means that the PF found by UMOGA iscloser to the true PF than the PF obtained by NSGA-II

From Table 5 we can see that all the MS values ofUKMOPSO are almost close to 1 for each test function Itmeans that almost all the true PF is totally covered by the PFobtained by UKMOPSO UMOGA is similar to UKMOPSOThe MS values of NSGA-II are much lesser than UKMOPSOand UMOGA especially MS = 00019 for ZDT2

Figures 1ndash6 all demonstrate that the PF found byUKMOPSO is the nearest to the true PF and scatters mostuniformly in those obtained by three algorithms Most of thepoints in the PF found by UKMOPSO overlap the pointsin the true PF It means the solution quality obtained byUKMOPSO is very high

45 Influence of the Uniform Crossover and PAM In orderto find out the influence of the uniform crossover on the

proposed algorithm we compare the distribution and num-ber of Pareto optimal solutions before and after performingthe uniform crossover One of the simulating results on thetest function ZDT1 is shown in Figure 7

From Figure 7 it can be seen that before and afterperforming the uniform crossover the number of data pointsin the 1st cluster and the 2nd cluster varies from 7 to 15and from 19 to 26 respectively namely many new Paretooptimal solutions having not been found before performingthe uniform crossover are generated and the differences ofthe data points between two clusters have been decreasedThis will directly influence the uniformity of the PF andacquire more uniform Pareto solutions

PAM is used to determine which Pareto solutions are tobe removed fromor be inserted into the external archiveThisis to maintain the diversity of Pareto optimal solutions The

14 Mathematical Problems in Engineering

100 16

0

05

1

15

2

25

3

35

4

45

540

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUKMOPSO

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

f2(x)

f2(x)

f1(x)

Figure 6 Pareto fronts obtained by different algorithms for test problem ZDT6

diversity can be computed by the ldquodistance-to-average-pointrdquomeasure [44] defined as

diversity (119878) = 1

|119878|

|119878|

sum119894=1

radic119873

sum119895=1

(119901119894119895minus 119901119895)2

(20)

where 119878 is the population |119878| is the swarm size 119873 is thedimensionality of the problem 119901

119894119895is the 1198951015840th value of the 1198941015840th

particle and 119901119895is the 1198951015840th value of the average point 119901

If the number of Pareto optimality is less than or equalto the size of the external archive PAM has no influenceon the proposed algorithm Otherwise it is used to selectthe different type of Pareto optimality from several clustersTherefore The diversity of Pareto optimality will certainlyincreaseWemonitor the diversity of Pareto optimality before

and after performing PAMonZDT1 at a certain timeThe val-ues are respectively 8762 and 11752 This fully demonstratesthat PAM can improve the diversity of Pareto optimality

5 Conclusion and Future Work

In this paper a multiobjective particle swarm optimizationbased on PAM and uniform design is presented It firstlyimplements PAM to partition the Pareto front into119870 clustersin the objective space and then it implements the uniformcrossover operator on the minimal cluster so as to generatemore Pareto optimality in the minimal cluster When thesize of the Pareto solution is larger than that of the externalarchive PAM is used to determine which Pareto solutions areto be removed from or be inserted into the external archive

Mathematical Problems in Engineering 15

0 01 02 03 04 05 06 07 08 09

19

10

01

02

03

04

05

06

077

08

09

1

The 1st clusterThe 2nd cluster

26

(a)

41

0 01 02 03 04 05 06

26

07 08 09 10

01

02

03

04

051506

07

08

09

1

The 1st clusterThe 2nd cluster

(b)

Figure 7 Pareto optimal solutions before (a) and after (b) performing the uniform crossover

Finally it keeps all the points in the lesser clusters anddiscards some points in the larger clusters This can ensurethat each of the clusters will contain approximately the samedata points Therefore the diversity of the Pareto solutionswill increase and they can scatter uniformly over the Paretofront The results of the experimental simulation performedon several well-known test problems indicate that theproposed algorithm obviously outperforms the other twoalgorithms

This algorithm is going on for further enhancement andimprovement One attempt is to use a more efficient orapproximate clustering algorithm to speed up the executiontime of this algorithm Another attempt is to extend its appli-cation scopes

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This research was supported by Guangxi Natural ScienceFoundation (no 2013GXNSFAA019337) Guangxi Universi-ties Key Project of Science and Technology Research (noKY2015ZD099) Key project of Guangxi Education Depart-ment (no 2013ZD055) Scientific Research Staring Founda-tion for the PHD Scholars of Yulin Normal University (noG2014005) Special Project of Yulin Normal University (no2012YJZX04) and Key Project of Yulin Normal University(no 2014YJZD05)The authors are grateful to the anonymousreferee for a careful checking of the details and for helpfulcomments that improved this paper

References

[1] J D Schaffer ldquoMultiple objective optimization with vector eval-uated genetic algorithmsrdquo in Proceedings of the 1st InternationalConference on Genetic Algorithms pp 93ndash100 1985

[2] L Tang and X Wang ldquoA hybrid multiobjective evolutionaryalgorithm for multiobjective optimization problemsrdquo IEEETransactions on Evolutionary Computation vol 17 no 1 pp 20ndash45 2013

[3] K Deb S Agrawal A Pratap and T Meyarivan ldquoA fast elitistnon-dominated sorting genetic algorithm for multi-objectiveoptimizationNSGA-IIrdquo inParallel ProblemSolving fromNaturePPSN VI vol 1917 of Lecture Notes in Computer Science pp849ndash858 Springer Berlin Germany 2000

[4] J Zhang Y Wang and J Feng ldquoAttribute index and uniformdesign based multiobjective association rule mining with evo-lutionary algorithmrdquo The Scientific World Journal vol 2013Article ID 259347 16 pages 2013

[5] S Jiang and Z Cai ldquoFaster convergence and higher hypervol-ume for multi-objective evolutionary algorithms by orthogonaland uniform designrdquo in Advances in Computation and Intelli-gence vol 6382 of Lecture Notes in Computer Science pp 312ndash328 Springer Berlin Germany 2010

[6] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995

[7] R Eberhart and J Kennedy ldquoA new optimizer using particleswarm theoryrdquo in Proceedings of the 6th International Sympo-sium on Micro Machine and Human Science pp 39ndash43 IEEENagoya Japan October 1995

[8] L Yang ldquoA fast and elitist multi-objective particle swarm algo-rithm NSPSOrdquo in Proceedings of the IEEE International Con-ference on Granular Computing (GRC rsquo08) pp 470ndash475 August2008

[9] C A C Coello G T Pulido and M S Lechuga ldquoHandlingmultiple objectives with particle swarm optimizationrdquo IEEE

16 Mathematical Problems in Engineering

Transactions on Evolutionary Computation vol 8 no 3 pp256ndash279 2004

[10] M Clerc and J Kennedy ldquoThe particle swarm-explosion sta-bility and convergence in a multidimensional complex spacerdquoIEEE Transactions on Evolutionary Computation vol 6 no 1pp 58ndash73 2002

[11] H Xiaohui and R Eberhart ldquoMultiobjective optimization usingdynamic neighborhood particle swarm optimizationrdquo in Pro-ceedings of the Congress on Evolutionary Computation (CEC rsquo02)vol 2 pp 1677ndash1681 May 2002

[12] S Mostaghim and J Teich ldquoStrategies for finding goodlocal guides in multi-objective particle swarm optimization(MOPSO)rdquo in Proceedings of the IEEE Swarm IntelligenceSymposium (SIS rsquo03) pp 26ndash33 Indianapolis Ind USA April2003

[13] T Bartz-Beielstein P Limbourg J Mehnen K Schmitt K EParsopoulos and M N Vrahatis ldquoParticle swarm optimizersfor Pareto optimizationwith enhanced archiving techniquesrdquo inProceedings of the Congress on Evolutionary Computation (CECrsquo03) vol 3 pp 1780ndash1787 December 2003

[14] K E Parsopoulos D K Tasoulis and M N Vrahatis ldquoMulti-objective optimization using parallel vector evaluated particleswarm optimizationrdquo in Proceedings of the IASTED Interna-tional Conference on Artificial Intelligence and Applications (AIArsquo04) vol 2 pp 823ndash828 February 2004

[15] S-J Tsai T-Y Sun C-C Liu S-T Hsieh W-C Wu and S-YChiu ldquoAn improved multi-objective particle swarm optimizerformulti-objective problemsrdquo Expert Systems with Applicationsvol 37 no 8 pp 5872ndash5886 2010

[16] A A Mousa M A El-Shorbagy and W F Abd-El-WahedldquoLocal search based hybrid particle swarm optimization algo-rithm for multiobjective optimizationrdquo Swarm and Evolution-ary Computation vol 3 pp 1ndash14 2012

[17] J Wang W Liu W Zhang and B Yang ldquoMulti-objective parti-cle swarm optimization based on self-update and grid strategyrdquoin Proceedings of the 2012 International Conference on Informa-tion Technology and Software Engineering vol 211 of LectureNotes in Electrical Engineering pp 869ndash876 Springer BerlinGermany 2013

[18] K Khalili-Damghani A-R Abtahi and M Tavana ldquoA newmulti-objective particle swarmoptimizationmethod for solvingreliability redundancy allocation problemsrdquo Reliability Engi-neering amp System Safety vol 111 pp 58ndash75 2013

[19] J Zhang Y Wang and J Feng ldquoParallel particle swarm opti-mization based on PAMrdquo in Proceedings of the 2nd InternationalConference on Information Engineering and Computer Science(ICIECS rsquo10) Wuhan China December 2010

[20] J Han and M Kamber Data Mining Concepts and TechniquesMorgan Kaufmann 2006

[21] L F Ibrahim ldquoUsing of clustering algorithm CWSP-PAM forrural network planningrdquo in Proceedings of the 3rd InternationalConference on Information Technology and Applications (ICITArsquo05) vol 1 pp 280ndash283 IEEE July 2005

[22] P Chopra J Kang and J Lee ldquoUsing gene pair combinationsto improve the accuracy of the PAM classifierrdquo in Proceedingsof the IEEE International Conference on Bioinformatics andBiomedicine (BIBM rsquo09) pp 174ndash177 November 2009

[23] Y-W Leung and Y Wang ldquoMultiobjective programming usinguniform design and genetic algorithmrdquo IEEE Transactions onSystems Man and Cybernetics Part C Applications and Reviewsvol 30 no 3 pp 293ndash304 2000

[24] D Liu K C Tan C K Goh and W K Ho ldquoA multiobjectivememetic algorithm based on particle swarm optimizationrdquoIEEE Transactions on Systems Man and Cybernetics Part BCybernetics vol 37 no 1 pp 42ndash50 2007

[25] K Deb A Pratap S Agarwal and T Meyarivan ldquoA fast andelitist multiobjective genetic algorithm NSGA-IIrdquo IEEE Trans-actions on Evolutionary Computation vol 6 no 2 pp 182ndash1972002

[26] C M Fonseca and P J Fleming ldquoMultiobjective optimizationandmultiple constraint handling with evolutionary algorithmsII Application examplerdquo IEEE Transactions on Systems Manand Cybernetics Part A Systems and Humans vol 28 no 1 pp38ndash47 1998

[27] F Kursawe ldquoA variant of evolution strategies for vector opti-mizationrdquo in Parallel Problem Solving from Nature vol 496 ofLecture Notes in Computer Science pp 193ndash197 Springer BerlinGermany 1991

[28] E Zitzler K Deb and LThiele ldquoComparison of multiobjectiveevolutionary algorithms empirical resultsrdquo Evolutionary Com-putation vol 8 no 2 pp 173ndash195 2000

[29] H-L Liu Y Wang and Y-M Cheung ldquoA Multi-objective evo-lutionary algorithm usingmin-max strategy and sphere coordi-nate transformationrdquo Intelligent Automation amp Soft Computingvol 15 no 3 pp 361ndash384 2009

[30] N Chakraborti B S Kumar V S Babu S Moitra andA Mukhopadhyay ldquoA new multi-objective genetic algorithmapplied to hot-rolling processrdquoAppliedMathematicalModellingvol 32 no 9 pp 1781ndash1789 2008

[31] K Deb ldquoMulti-objective genetic algorithms problem difficul-ties and construction of test problemsrdquo Evolutionary Computa-tion vol 7 no 3 pp 205ndash230 1999

[32] K Deb L Thiele M Laumanns and E Zitzler ldquoScalable testproblems for evolutionarymultiobjective optimizationrdquo in Evo-lutionary Multiobjective Optimization Advanced Informationand Knowledge Processing pp 105ndash145 Springer London UK2005

[33] Y Shi and R Eberhart ldquoModified particle swarm optimizerrdquoin Proceedings of the IEEE World Congress on ComputationalIntelligence pp 69ndash73 May 1998

[34] C S Feng S Cong and X Y Feng ldquoA new adaptive inertiaweight strategy in particle swarm optimizationrdquo in Procedingsof the IEEE Congress on Evolutionary Computation (CEC rsquo07)pp 4186ndash4190 IEEE Singapore September 2007

[35] C K Goh K C Tan D S Liu and S C Chiam ldquoA competitiveand cooperative co-evolutionary approach to multi-objectiveparticle swarm optimization algorithm designrdquo European Jour-nal of Operational Research vol 202 no 1 pp 42ndash54 2010

[36] P Cheng J-S Pan L Li Y Tang and C Huang ldquoA surveyof performance assessment for multiobjective optimizersrdquo inProceedings of the 4th International Conference on Genetic andEvolutionary Computing (ICGEC rsquo10) pp 341ndash345 December2010

[37] E Zitzler L Thiele M Laumanns C M Fonseca and V GDa Fonseca ldquoPerformance assessment of multiobjective opti-mizers an analysis and reviewrdquo IEEE Transactions on Evolution-ary Computation vol 7 no 2 pp 117ndash132 2003

[38] E Zitzler and L Thiele ldquoMultiobjective evolutionary algo-rithms a comparative case study and the strength Paretoapproachrdquo IEEETransactions onEvolutionaryComputation vol3 no 4 pp 257ndash271 1999

[39] H Li and Q Zhang ldquoMultiobjective optimization problemswith complicated pareto sets MOEAD and NSGA-IIrdquo IEEE

Mathematical Problems in Engineering 17

Transactions on Evolutionary Computation vol 13 no 2 pp284ndash302 2009

[40] Q Zhang A Zhou S Zhao P N Suganthan W Liu andS Tiwari ldquoMultiobjective optimization test instances for theCEC 2009 special session and competitionrdquo Working ReportDepartment of Computing and Electronic Systems Universityof Essex 2008

[41] Q Zhang A Zhou and Y Jin ldquoRM-MEDA a regularity model-based multiobjective estimation of distribution algorithmrdquoIEEE Transactions on Evolutionary Computation vol 12 no 1pp 41ndash63 2008

[42] C-S Tsou H-H Fang H-H Chang and C-H Kao ldquoAnimproved particle swarmPareto optimizerwith local search andclusteringrdquo in Simulated Evolution and Learning vol 4247 ofLecture Notes in Computer Science pp 400ndash407 2006

[43] U Wickramasinghe and X Li ldquoChoosing leaders for multiob-jective PSO algorithms using differential evolutionrdquo in Proceed-ing of the 7th International Conference Simulated Evolution andLearning vol 5361 of Lecture Notes in Computer Science pp249ndash258 Springer Berlin Germany 2008

[44] T Krink J S Vesterstrom and J Riget ldquoParticle swarm opti-misation with spatial particle extensionrdquo in Proceedings of theCongress on Evolutionary Computation (CEC rsquo02) pp 1474ndash1479 May 2002

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 9: Research Article Multiobjective Particle Swarm

Mathematical Problems in Engineering 9

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1100

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

152

True Pareto frontUMOGA

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

140

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

f2(x)

f1(x)

f2(x)

Figure 1 Pareto fronts obtained by different algorithms for test problem FON

formed by the extreme function values observed in the PFtrueand PFknown It is defined as

MS = radic 1

119872

119872

sum119894=1

120575119894

120575119894= (

min (119891max119894 119865max119894) minusmax (119891min

119894 119865min119894)

119865max119894

minus 119865min119894

)

2

(19)

where 119872 is the number of objectives 119891max119894

and 119891min119894

arethe maximum and minimum values of the 119894th objective inPFknown respectively and 119865

max119894

and 119865min119894

are the maximumand minimum values of the 119894th objective in PFtrue respec-tively

Note that if 119891min119894

ge 119865max119894

then 120575119894= 0 Algorithms with

largerMS values are desirable andMS = 1means that the truePareto front is totally covered by the obtained Pareto front

44 Results For each test problem we perform the proposedalgorithm (called UKMOPSO) for 30 independent runs andcompare its performance with UMOGA [23] and NSGA-II[25] The values of several metrics the 119862metric IGDmetricand MS metric are shown in Tables 2 3 4 and 5 The Paretofronts obtained by several algorithms implemented on severaltest functions are illustrated in Figures 1ndash6

For brevity 119862(UKMOPSOUMOGA) 119862(UMOGAUKMOPSO) 119862(UKMOPSONSGA-II) and 119862(NSGA-IIUKMOPSO) are respectively marked as 119862

12 11986221 11986213 and

11986231

10 Mathematical Problems in Engineering

0100

UKMOPSOTrue Pareto front

14

UMOGATrue Pareto front

40

NSGA-IITrue Pareto front

f1(x)

f2(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14

f1(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14 minus13 minus12minus12

minus10

minus8

minus6

minus4

minus2

0

f2(x)

f1(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14minus12

minus10

minus8

minus6

minus4

minus2

0

f2(x)

minus12

minus10

minus8

minus6

minus4

minus2

Figure 2 Pareto fronts obtained by different algorithms for test problem KUR

Table 2 Comparison of C metric between UKMOPSO andUMOGA

C (UKMOPSO UMOGA) C (UMOGA UKMOPSO)FON 0769 001KUR 0929 001ZDT1 012 013ZDT2 0063 003ZDT3 0139 0098ZDT6 0063 001DTLZ1 1 0DTLZ2 1 0

Table 3 Comparison of Cmetric between UKMOPSO and NSGA-II

C (UKMOPSO NSGA-II) C (NSGA-II UKMOPSO)FON 0025 001KUR 0075 011ZDT1 1 0ZDT2 1 0ZDT3 1 0ZDT6 1 0DTLZ1 0 013DTLZ2 0025 006

Mathematical Problems in Engineering 11

0

02

04

06

08

1

12

14100 76

0

05

1

15

2

25

325

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

f1(x)

f2(x)

f2(x)

Figure 3 Pareto fronts obtained by different algorithms for test problem ZDT1

Table 4 Comparison of IGD metric

UKMOPSO UMOGA NSGA-IIFON 00072 00432 03707KUR 00668 11576 12323ZDT1 00067 00265 13933ZDT2 00067 00727 21409ZDT3 01964 02151 02816ZDT6 00033 00139 59223DTLZ1 74827 22965 54182DTLZ2 01123 04319 02724

Table 5 Comparison of maximum spread (MS) metric

UKMOPSO UMOGA NSGA-IIFON 09769 09754 02368KUR 09789 09580 03414ZDT1 09429 09984 05495ZDT2 09999 09986 00019ZDT3 09276 09275 07963ZDT6 1 09974 03738DTLZ1 1 1 06588DTLZ2 1 1 07125

12 Mathematical Problems in Engineering

0

02

04

06

08

1

12

14100 9

0

05

1

15

2

25

38

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x) f1(x)

f2(x)

f2(x)

f2(x)

f1(x)

Figure 4 Pareto fronts obtained by different algorithms for test problem ZDT2

As shown in Table 2 for the test functions Fon and KUR11986212= 0769 and 119862

12= 0929 mean that 768 and 929 of

the solutions obtained by UMOGA are dominated by thoseobtained by UKMOPSO Similarly 119862

21= 001means that 1

of the solutions obtained by UKMOPSO are dominated bythose obtained by UMOGA For DTLZ1 and DTLZ2119862

12= 1

and 11986221= 0 mean that all solutions obtained by UMOGA

are dominated by those obtained by UKMOPSO but noneof solutions from UKMOPSO is dominated by those fromUMOGA It means that the solution quality via UKMOPSOis much better than that via UMOGA for the above testfunctions For ZDT1 ZDT2 ZDT3 and ZDT6 the valuesof the 119862 metrics do not have too many differences betweenUKMOPSO and UMOGA It means the solution qualities viaboth are almost identical

FromTable 3 we can see that for ZDT1 ZDT2 ZDT3 andZDT6 119862

13= 1 and 119862

31= 0mean that all solutions obtained

byNSGA-II are dominated by those obtained byUKMOPSObut none of solutions from UKMOPSO is dominated bythose from NSGA-II It means that the solution quality viaUKMOPSO is much better than that via NSGA-II for theabove test functions For the rest of the test functions thesolution qualities via both are almost identical

In Table 4 for all test problems the IGD values ofUKMOPSO are the smallest among UKMOPSO UMOGAand NSGA-II This means the PF found by UKMOPSO isthe nearest to the true PF compared with the PF obtainedby the other two algorithms namely the performance ofUKMOPSO is the best in the three algorithms The IGDvalues of UMOGA are all larger than those of NSGA-II for

Mathematical Problems in Engineering 13

0

02

04

06

08

1 92 86

0

05

1

15

2

25

3

35

4

45 40

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

f2(x)

minus08

minus06

minus04

minus02

0

02

04

06

08

1

f2(x)

f2(x)

minus08

minus05

minus06

minus04

minus02

UKMOPSO UMOGA

NSGA-II

Figure 5 Pareto fronts obtained by different algorithms for test problem ZDT3

all test functions It means that the PF found by UMOGA iscloser to the true PF than the PF obtained by NSGA-II

From Table 5 we can see that all the MS values ofUKMOPSO are almost close to 1 for each test function Itmeans that almost all the true PF is totally covered by the PFobtained by UKMOPSO UMOGA is similar to UKMOPSOThe MS values of NSGA-II are much lesser than UKMOPSOand UMOGA especially MS = 00019 for ZDT2

Figures 1ndash6 all demonstrate that the PF found byUKMOPSO is the nearest to the true PF and scatters mostuniformly in those obtained by three algorithms Most of thepoints in the PF found by UKMOPSO overlap the pointsin the true PF It means the solution quality obtained byUKMOPSO is very high

45 Influence of the Uniform Crossover and PAM In orderto find out the influence of the uniform crossover on the

proposed algorithm we compare the distribution and num-ber of Pareto optimal solutions before and after performingthe uniform crossover One of the simulating results on thetest function ZDT1 is shown in Figure 7

From Figure 7 it can be seen that before and afterperforming the uniform crossover the number of data pointsin the 1st cluster and the 2nd cluster varies from 7 to 15and from 19 to 26 respectively namely many new Paretooptimal solutions having not been found before performingthe uniform crossover are generated and the differences ofthe data points between two clusters have been decreasedThis will directly influence the uniformity of the PF andacquire more uniform Pareto solutions

PAM is used to determine which Pareto solutions are tobe removed fromor be inserted into the external archiveThisis to maintain the diversity of Pareto optimal solutions The

14 Mathematical Problems in Engineering

100 16

0

05

1

15

2

25

3

35

4

45

540

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUKMOPSO

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

f2(x)

f2(x)

f1(x)

Figure 6 Pareto fronts obtained by different algorithms for test problem ZDT6

diversity can be computed by the ldquodistance-to-average-pointrdquomeasure [44] defined as

diversity (119878) = 1

|119878|

|119878|

sum119894=1

radic119873

sum119895=1

(119901119894119895minus 119901119895)2

(20)

where 119878 is the population |119878| is the swarm size 119873 is thedimensionality of the problem 119901

119894119895is the 1198951015840th value of the 1198941015840th

particle and 119901119895is the 1198951015840th value of the average point 119901

If the number of Pareto optimality is less than or equalto the size of the external archive PAM has no influenceon the proposed algorithm Otherwise it is used to selectthe different type of Pareto optimality from several clustersTherefore The diversity of Pareto optimality will certainlyincreaseWemonitor the diversity of Pareto optimality before

and after performing PAMonZDT1 at a certain timeThe val-ues are respectively 8762 and 11752 This fully demonstratesthat PAM can improve the diversity of Pareto optimality

5 Conclusion and Future Work

In this paper a multiobjective particle swarm optimizationbased on PAM and uniform design is presented It firstlyimplements PAM to partition the Pareto front into119870 clustersin the objective space and then it implements the uniformcrossover operator on the minimal cluster so as to generatemore Pareto optimality in the minimal cluster When thesize of the Pareto solution is larger than that of the externalarchive PAM is used to determine which Pareto solutions areto be removed from or be inserted into the external archive

Mathematical Problems in Engineering 15

0 01 02 03 04 05 06 07 08 09

19

10

01

02

03

04

05

06

077

08

09

1

The 1st clusterThe 2nd cluster

26

(a)

41

0 01 02 03 04 05 06

26

07 08 09 10

01

02

03

04

051506

07

08

09

1

The 1st clusterThe 2nd cluster

(b)

Figure 7 Pareto optimal solutions before (a) and after (b) performing the uniform crossover

Finally it keeps all the points in the lesser clusters anddiscards some points in the larger clusters This can ensurethat each of the clusters will contain approximately the samedata points Therefore the diversity of the Pareto solutionswill increase and they can scatter uniformly over the Paretofront The results of the experimental simulation performedon several well-known test problems indicate that theproposed algorithm obviously outperforms the other twoalgorithms

This algorithm is going on for further enhancement andimprovement One attempt is to use a more efficient orapproximate clustering algorithm to speed up the executiontime of this algorithm Another attempt is to extend its appli-cation scopes

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This research was supported by Guangxi Natural ScienceFoundation (no 2013GXNSFAA019337) Guangxi Universi-ties Key Project of Science and Technology Research (noKY2015ZD099) Key project of Guangxi Education Depart-ment (no 2013ZD055) Scientific Research Staring Founda-tion for the PHD Scholars of Yulin Normal University (noG2014005) Special Project of Yulin Normal University (no2012YJZX04) and Key Project of Yulin Normal University(no 2014YJZD05)The authors are grateful to the anonymousreferee for a careful checking of the details and for helpfulcomments that improved this paper

References

[1] J D Schaffer ldquoMultiple objective optimization with vector eval-uated genetic algorithmsrdquo in Proceedings of the 1st InternationalConference on Genetic Algorithms pp 93ndash100 1985

[2] L Tang and X Wang ldquoA hybrid multiobjective evolutionaryalgorithm for multiobjective optimization problemsrdquo IEEETransactions on Evolutionary Computation vol 17 no 1 pp 20ndash45 2013

[3] K Deb S Agrawal A Pratap and T Meyarivan ldquoA fast elitistnon-dominated sorting genetic algorithm for multi-objectiveoptimizationNSGA-IIrdquo inParallel ProblemSolving fromNaturePPSN VI vol 1917 of Lecture Notes in Computer Science pp849ndash858 Springer Berlin Germany 2000

[4] J Zhang Y Wang and J Feng ldquoAttribute index and uniformdesign based multiobjective association rule mining with evo-lutionary algorithmrdquo The Scientific World Journal vol 2013Article ID 259347 16 pages 2013

[5] S Jiang and Z Cai ldquoFaster convergence and higher hypervol-ume for multi-objective evolutionary algorithms by orthogonaland uniform designrdquo in Advances in Computation and Intelli-gence vol 6382 of Lecture Notes in Computer Science pp 312ndash328 Springer Berlin Germany 2010

[6] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995

[7] R Eberhart and J Kennedy ldquoA new optimizer using particleswarm theoryrdquo in Proceedings of the 6th International Sympo-sium on Micro Machine and Human Science pp 39ndash43 IEEENagoya Japan October 1995

[8] L Yang ldquoA fast and elitist multi-objective particle swarm algo-rithm NSPSOrdquo in Proceedings of the IEEE International Con-ference on Granular Computing (GRC rsquo08) pp 470ndash475 August2008

[9] C A C Coello G T Pulido and M S Lechuga ldquoHandlingmultiple objectives with particle swarm optimizationrdquo IEEE

16 Mathematical Problems in Engineering

Transactions on Evolutionary Computation vol 8 no 3 pp256ndash279 2004

[10] M Clerc and J Kennedy ldquoThe particle swarm-explosion sta-bility and convergence in a multidimensional complex spacerdquoIEEE Transactions on Evolutionary Computation vol 6 no 1pp 58ndash73 2002

[11] H Xiaohui and R Eberhart ldquoMultiobjective optimization usingdynamic neighborhood particle swarm optimizationrdquo in Pro-ceedings of the Congress on Evolutionary Computation (CEC rsquo02)vol 2 pp 1677ndash1681 May 2002

[12] S Mostaghim and J Teich ldquoStrategies for finding goodlocal guides in multi-objective particle swarm optimization(MOPSO)rdquo in Proceedings of the IEEE Swarm IntelligenceSymposium (SIS rsquo03) pp 26ndash33 Indianapolis Ind USA April2003

[13] T Bartz-Beielstein P Limbourg J Mehnen K Schmitt K EParsopoulos and M N Vrahatis ldquoParticle swarm optimizersfor Pareto optimizationwith enhanced archiving techniquesrdquo inProceedings of the Congress on Evolutionary Computation (CECrsquo03) vol 3 pp 1780ndash1787 December 2003

[14] K E Parsopoulos D K Tasoulis and M N Vrahatis ldquoMulti-objective optimization using parallel vector evaluated particleswarm optimizationrdquo in Proceedings of the IASTED Interna-tional Conference on Artificial Intelligence and Applications (AIArsquo04) vol 2 pp 823ndash828 February 2004

[15] S-J Tsai T-Y Sun C-C Liu S-T Hsieh W-C Wu and S-YChiu ldquoAn improved multi-objective particle swarm optimizerformulti-objective problemsrdquo Expert Systems with Applicationsvol 37 no 8 pp 5872ndash5886 2010

[16] A A Mousa M A El-Shorbagy and W F Abd-El-WahedldquoLocal search based hybrid particle swarm optimization algo-rithm for multiobjective optimizationrdquo Swarm and Evolution-ary Computation vol 3 pp 1ndash14 2012

[17] J Wang W Liu W Zhang and B Yang ldquoMulti-objective parti-cle swarm optimization based on self-update and grid strategyrdquoin Proceedings of the 2012 International Conference on Informa-tion Technology and Software Engineering vol 211 of LectureNotes in Electrical Engineering pp 869ndash876 Springer BerlinGermany 2013

[18] K Khalili-Damghani A-R Abtahi and M Tavana ldquoA newmulti-objective particle swarmoptimizationmethod for solvingreliability redundancy allocation problemsrdquo Reliability Engi-neering amp System Safety vol 111 pp 58ndash75 2013

[19] J Zhang Y Wang and J Feng ldquoParallel particle swarm opti-mization based on PAMrdquo in Proceedings of the 2nd InternationalConference on Information Engineering and Computer Science(ICIECS rsquo10) Wuhan China December 2010

[20] J Han and M Kamber Data Mining Concepts and TechniquesMorgan Kaufmann 2006

[21] L F Ibrahim ldquoUsing of clustering algorithm CWSP-PAM forrural network planningrdquo in Proceedings of the 3rd InternationalConference on Information Technology and Applications (ICITArsquo05) vol 1 pp 280ndash283 IEEE July 2005

[22] P Chopra J Kang and J Lee ldquoUsing gene pair combinationsto improve the accuracy of the PAM classifierrdquo in Proceedingsof the IEEE International Conference on Bioinformatics andBiomedicine (BIBM rsquo09) pp 174ndash177 November 2009

[23] Y-W Leung and Y Wang ldquoMultiobjective programming usinguniform design and genetic algorithmrdquo IEEE Transactions onSystems Man and Cybernetics Part C Applications and Reviewsvol 30 no 3 pp 293ndash304 2000

[24] D Liu K C Tan C K Goh and W K Ho ldquoA multiobjectivememetic algorithm based on particle swarm optimizationrdquoIEEE Transactions on Systems Man and Cybernetics Part BCybernetics vol 37 no 1 pp 42ndash50 2007

[25] K Deb A Pratap S Agarwal and T Meyarivan ldquoA fast andelitist multiobjective genetic algorithm NSGA-IIrdquo IEEE Trans-actions on Evolutionary Computation vol 6 no 2 pp 182ndash1972002

[26] C M Fonseca and P J Fleming ldquoMultiobjective optimizationandmultiple constraint handling with evolutionary algorithmsII Application examplerdquo IEEE Transactions on Systems Manand Cybernetics Part A Systems and Humans vol 28 no 1 pp38ndash47 1998

[27] F Kursawe ldquoA variant of evolution strategies for vector opti-mizationrdquo in Parallel Problem Solving from Nature vol 496 ofLecture Notes in Computer Science pp 193ndash197 Springer BerlinGermany 1991

[28] E Zitzler K Deb and LThiele ldquoComparison of multiobjectiveevolutionary algorithms empirical resultsrdquo Evolutionary Com-putation vol 8 no 2 pp 173ndash195 2000

[29] H-L Liu Y Wang and Y-M Cheung ldquoA Multi-objective evo-lutionary algorithm usingmin-max strategy and sphere coordi-nate transformationrdquo Intelligent Automation amp Soft Computingvol 15 no 3 pp 361ndash384 2009

[30] N Chakraborti B S Kumar V S Babu S Moitra andA Mukhopadhyay ldquoA new multi-objective genetic algorithmapplied to hot-rolling processrdquoAppliedMathematicalModellingvol 32 no 9 pp 1781ndash1789 2008

[31] K Deb ldquoMulti-objective genetic algorithms problem difficul-ties and construction of test problemsrdquo Evolutionary Computa-tion vol 7 no 3 pp 205ndash230 1999

[32] K Deb L Thiele M Laumanns and E Zitzler ldquoScalable testproblems for evolutionarymultiobjective optimizationrdquo in Evo-lutionary Multiobjective Optimization Advanced Informationand Knowledge Processing pp 105ndash145 Springer London UK2005

[33] Y Shi and R Eberhart ldquoModified particle swarm optimizerrdquoin Proceedings of the IEEE World Congress on ComputationalIntelligence pp 69ndash73 May 1998

[34] C S Feng S Cong and X Y Feng ldquoA new adaptive inertiaweight strategy in particle swarm optimizationrdquo in Procedingsof the IEEE Congress on Evolutionary Computation (CEC rsquo07)pp 4186ndash4190 IEEE Singapore September 2007

[35] C K Goh K C Tan D S Liu and S C Chiam ldquoA competitiveand cooperative co-evolutionary approach to multi-objectiveparticle swarm optimization algorithm designrdquo European Jour-nal of Operational Research vol 202 no 1 pp 42ndash54 2010

[36] P Cheng J-S Pan L Li Y Tang and C Huang ldquoA surveyof performance assessment for multiobjective optimizersrdquo inProceedings of the 4th International Conference on Genetic andEvolutionary Computing (ICGEC rsquo10) pp 341ndash345 December2010

[37] E Zitzler L Thiele M Laumanns C M Fonseca and V GDa Fonseca ldquoPerformance assessment of multiobjective opti-mizers an analysis and reviewrdquo IEEE Transactions on Evolution-ary Computation vol 7 no 2 pp 117ndash132 2003

[38] E Zitzler and L Thiele ldquoMultiobjective evolutionary algo-rithms a comparative case study and the strength Paretoapproachrdquo IEEETransactions onEvolutionaryComputation vol3 no 4 pp 257ndash271 1999

[39] H Li and Q Zhang ldquoMultiobjective optimization problemswith complicated pareto sets MOEAD and NSGA-IIrdquo IEEE

Mathematical Problems in Engineering 17

Transactions on Evolutionary Computation vol 13 no 2 pp284ndash302 2009

[40] Q Zhang A Zhou S Zhao P N Suganthan W Liu andS Tiwari ldquoMultiobjective optimization test instances for theCEC 2009 special session and competitionrdquo Working ReportDepartment of Computing and Electronic Systems Universityof Essex 2008

[41] Q Zhang A Zhou and Y Jin ldquoRM-MEDA a regularity model-based multiobjective estimation of distribution algorithmrdquoIEEE Transactions on Evolutionary Computation vol 12 no 1pp 41ndash63 2008

[42] C-S Tsou H-H Fang H-H Chang and C-H Kao ldquoAnimproved particle swarmPareto optimizerwith local search andclusteringrdquo in Simulated Evolution and Learning vol 4247 ofLecture Notes in Computer Science pp 400ndash407 2006

[43] U Wickramasinghe and X Li ldquoChoosing leaders for multiob-jective PSO algorithms using differential evolutionrdquo in Proceed-ing of the 7th International Conference Simulated Evolution andLearning vol 5361 of Lecture Notes in Computer Science pp249ndash258 Springer Berlin Germany 2008

[44] T Krink J S Vesterstrom and J Riget ldquoParticle swarm opti-misation with spatial particle extensionrdquo in Proceedings of theCongress on Evolutionary Computation (CEC rsquo02) pp 1474ndash1479 May 2002

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 10: Research Article Multiobjective Particle Swarm

10 Mathematical Problems in Engineering

0100

UKMOPSOTrue Pareto front

14

UMOGATrue Pareto front

40

NSGA-IITrue Pareto front

f1(x)

f2(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14

f1(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14 minus13 minus12minus12

minus10

minus8

minus6

minus4

minus2

0

f2(x)

f1(x)

minus20 minus19 minus18 minus17 minus16 minus15 minus14minus12

minus10

minus8

minus6

minus4

minus2

0

f2(x)

minus12

minus10

minus8

minus6

minus4

minus2

Figure 2 Pareto fronts obtained by different algorithms for test problem KUR

Table 2 Comparison of C metric between UKMOPSO andUMOGA

C (UKMOPSO UMOGA) C (UMOGA UKMOPSO)FON 0769 001KUR 0929 001ZDT1 012 013ZDT2 0063 003ZDT3 0139 0098ZDT6 0063 001DTLZ1 1 0DTLZ2 1 0

Table 3 Comparison of Cmetric between UKMOPSO and NSGA-II

C (UKMOPSO NSGA-II) C (NSGA-II UKMOPSO)FON 0025 001KUR 0075 011ZDT1 1 0ZDT2 1 0ZDT3 1 0ZDT6 1 0DTLZ1 0 013DTLZ2 0025 006

Mathematical Problems in Engineering 11

0

02

04

06

08

1

12

14100 76

0

05

1

15

2

25

325

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

f1(x)

f2(x)

f2(x)

Figure 3 Pareto fronts obtained by different algorithms for test problem ZDT1

Table 4 Comparison of IGD metric

UKMOPSO UMOGA NSGA-IIFON 00072 00432 03707KUR 00668 11576 12323ZDT1 00067 00265 13933ZDT2 00067 00727 21409ZDT3 01964 02151 02816ZDT6 00033 00139 59223DTLZ1 74827 22965 54182DTLZ2 01123 04319 02724

Table 5 Comparison of maximum spread (MS) metric

UKMOPSO UMOGA NSGA-IIFON 09769 09754 02368KUR 09789 09580 03414ZDT1 09429 09984 05495ZDT2 09999 09986 00019ZDT3 09276 09275 07963ZDT6 1 09974 03738DTLZ1 1 1 06588DTLZ2 1 1 07125

12 Mathematical Problems in Engineering

0

02

04

06

08

1

12

14100 9

0

05

1

15

2

25

38

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x) f1(x)

f2(x)

f2(x)

f2(x)

f1(x)

Figure 4 Pareto fronts obtained by different algorithms for test problem ZDT2

As shown in Table 2 for the test functions Fon and KUR11986212= 0769 and 119862

12= 0929 mean that 768 and 929 of

the solutions obtained by UMOGA are dominated by thoseobtained by UKMOPSO Similarly 119862

21= 001means that 1

of the solutions obtained by UKMOPSO are dominated bythose obtained by UMOGA For DTLZ1 and DTLZ2119862

12= 1

and 11986221= 0 mean that all solutions obtained by UMOGA

are dominated by those obtained by UKMOPSO but noneof solutions from UKMOPSO is dominated by those fromUMOGA It means that the solution quality via UKMOPSOis much better than that via UMOGA for the above testfunctions For ZDT1 ZDT2 ZDT3 and ZDT6 the valuesof the 119862 metrics do not have too many differences betweenUKMOPSO and UMOGA It means the solution qualities viaboth are almost identical

FromTable 3 we can see that for ZDT1 ZDT2 ZDT3 andZDT6 119862

13= 1 and 119862

31= 0mean that all solutions obtained

byNSGA-II are dominated by those obtained byUKMOPSObut none of solutions from UKMOPSO is dominated bythose from NSGA-II It means that the solution quality viaUKMOPSO is much better than that via NSGA-II for theabove test functions For the rest of the test functions thesolution qualities via both are almost identical

In Table 4 for all test problems the IGD values ofUKMOPSO are the smallest among UKMOPSO UMOGAand NSGA-II This means the PF found by UKMOPSO isthe nearest to the true PF compared with the PF obtainedby the other two algorithms namely the performance ofUKMOPSO is the best in the three algorithms The IGDvalues of UMOGA are all larger than those of NSGA-II for

Mathematical Problems in Engineering 13

0

02

04

06

08

1 92 86

0

05

1

15

2

25

3

35

4

45 40

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

f2(x)

minus08

minus06

minus04

minus02

0

02

04

06

08

1

f2(x)

f2(x)

minus08

minus05

minus06

minus04

minus02

UKMOPSO UMOGA

NSGA-II

Figure 5 Pareto fronts obtained by different algorithms for test problem ZDT3

all test functions It means that the PF found by UMOGA iscloser to the true PF than the PF obtained by NSGA-II

From Table 5 we can see that all the MS values ofUKMOPSO are almost close to 1 for each test function Itmeans that almost all the true PF is totally covered by the PFobtained by UKMOPSO UMOGA is similar to UKMOPSOThe MS values of NSGA-II are much lesser than UKMOPSOand UMOGA especially MS = 00019 for ZDT2

Figures 1ndash6 all demonstrate that the PF found byUKMOPSO is the nearest to the true PF and scatters mostuniformly in those obtained by three algorithms Most of thepoints in the PF found by UKMOPSO overlap the pointsin the true PF It means the solution quality obtained byUKMOPSO is very high

45 Influence of the Uniform Crossover and PAM In orderto find out the influence of the uniform crossover on the

proposed algorithm we compare the distribution and num-ber of Pareto optimal solutions before and after performingthe uniform crossover One of the simulating results on thetest function ZDT1 is shown in Figure 7

From Figure 7 it can be seen that before and afterperforming the uniform crossover the number of data pointsin the 1st cluster and the 2nd cluster varies from 7 to 15and from 19 to 26 respectively namely many new Paretooptimal solutions having not been found before performingthe uniform crossover are generated and the differences ofthe data points between two clusters have been decreasedThis will directly influence the uniformity of the PF andacquire more uniform Pareto solutions

PAM is used to determine which Pareto solutions are tobe removed fromor be inserted into the external archiveThisis to maintain the diversity of Pareto optimal solutions The

14 Mathematical Problems in Engineering

100 16

0

05

1

15

2

25

3

35

4

45

540

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUKMOPSO

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

f2(x)

f2(x)

f1(x)

Figure 6 Pareto fronts obtained by different algorithms for test problem ZDT6

diversity can be computed by the ldquodistance-to-average-pointrdquomeasure [44] defined as

diversity (119878) = 1

|119878|

|119878|

sum119894=1

radic119873

sum119895=1

(119901119894119895minus 119901119895)2

(20)

where 119878 is the population |119878| is the swarm size 119873 is thedimensionality of the problem 119901

119894119895is the 1198951015840th value of the 1198941015840th

particle and 119901119895is the 1198951015840th value of the average point 119901

If the number of Pareto optimality is less than or equalto the size of the external archive PAM has no influenceon the proposed algorithm Otherwise it is used to selectthe different type of Pareto optimality from several clustersTherefore The diversity of Pareto optimality will certainlyincreaseWemonitor the diversity of Pareto optimality before

and after performing PAMonZDT1 at a certain timeThe val-ues are respectively 8762 and 11752 This fully demonstratesthat PAM can improve the diversity of Pareto optimality

5 Conclusion and Future Work

In this paper a multiobjective particle swarm optimizationbased on PAM and uniform design is presented It firstlyimplements PAM to partition the Pareto front into119870 clustersin the objective space and then it implements the uniformcrossover operator on the minimal cluster so as to generatemore Pareto optimality in the minimal cluster When thesize of the Pareto solution is larger than that of the externalarchive PAM is used to determine which Pareto solutions areto be removed from or be inserted into the external archive

Mathematical Problems in Engineering 15

0 01 02 03 04 05 06 07 08 09

19

10

01

02

03

04

05

06

077

08

09

1

The 1st clusterThe 2nd cluster

26

(a)

41

0 01 02 03 04 05 06

26

07 08 09 10

01

02

03

04

051506

07

08

09

1

The 1st clusterThe 2nd cluster

(b)

Figure 7 Pareto optimal solutions before (a) and after (b) performing the uniform crossover

Finally it keeps all the points in the lesser clusters anddiscards some points in the larger clusters This can ensurethat each of the clusters will contain approximately the samedata points Therefore the diversity of the Pareto solutionswill increase and they can scatter uniformly over the Paretofront The results of the experimental simulation performedon several well-known test problems indicate that theproposed algorithm obviously outperforms the other twoalgorithms

This algorithm is going on for further enhancement andimprovement One attempt is to use a more efficient orapproximate clustering algorithm to speed up the executiontime of this algorithm Another attempt is to extend its appli-cation scopes

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This research was supported by Guangxi Natural ScienceFoundation (no 2013GXNSFAA019337) Guangxi Universi-ties Key Project of Science and Technology Research (noKY2015ZD099) Key project of Guangxi Education Depart-ment (no 2013ZD055) Scientific Research Staring Founda-tion for the PHD Scholars of Yulin Normal University (noG2014005) Special Project of Yulin Normal University (no2012YJZX04) and Key Project of Yulin Normal University(no 2014YJZD05)The authors are grateful to the anonymousreferee for a careful checking of the details and for helpfulcomments that improved this paper

References

[1] J D Schaffer ldquoMultiple objective optimization with vector eval-uated genetic algorithmsrdquo in Proceedings of the 1st InternationalConference on Genetic Algorithms pp 93ndash100 1985

[2] L Tang and X Wang ldquoA hybrid multiobjective evolutionaryalgorithm for multiobjective optimization problemsrdquo IEEETransactions on Evolutionary Computation vol 17 no 1 pp 20ndash45 2013

[3] K Deb S Agrawal A Pratap and T Meyarivan ldquoA fast elitistnon-dominated sorting genetic algorithm for multi-objectiveoptimizationNSGA-IIrdquo inParallel ProblemSolving fromNaturePPSN VI vol 1917 of Lecture Notes in Computer Science pp849ndash858 Springer Berlin Germany 2000

[4] J Zhang Y Wang and J Feng ldquoAttribute index and uniformdesign based multiobjective association rule mining with evo-lutionary algorithmrdquo The Scientific World Journal vol 2013Article ID 259347 16 pages 2013

[5] S Jiang and Z Cai ldquoFaster convergence and higher hypervol-ume for multi-objective evolutionary algorithms by orthogonaland uniform designrdquo in Advances in Computation and Intelli-gence vol 6382 of Lecture Notes in Computer Science pp 312ndash328 Springer Berlin Germany 2010

[6] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995

[7] R Eberhart and J Kennedy ldquoA new optimizer using particleswarm theoryrdquo in Proceedings of the 6th International Sympo-sium on Micro Machine and Human Science pp 39ndash43 IEEENagoya Japan October 1995

[8] L Yang ldquoA fast and elitist multi-objective particle swarm algo-rithm NSPSOrdquo in Proceedings of the IEEE International Con-ference on Granular Computing (GRC rsquo08) pp 470ndash475 August2008

[9] C A C Coello G T Pulido and M S Lechuga ldquoHandlingmultiple objectives with particle swarm optimizationrdquo IEEE

16 Mathematical Problems in Engineering

Transactions on Evolutionary Computation vol 8 no 3 pp256ndash279 2004

[10] M Clerc and J Kennedy ldquoThe particle swarm-explosion sta-bility and convergence in a multidimensional complex spacerdquoIEEE Transactions on Evolutionary Computation vol 6 no 1pp 58ndash73 2002

[11] H Xiaohui and R Eberhart ldquoMultiobjective optimization usingdynamic neighborhood particle swarm optimizationrdquo in Pro-ceedings of the Congress on Evolutionary Computation (CEC rsquo02)vol 2 pp 1677ndash1681 May 2002

[12] S Mostaghim and J Teich ldquoStrategies for finding goodlocal guides in multi-objective particle swarm optimization(MOPSO)rdquo in Proceedings of the IEEE Swarm IntelligenceSymposium (SIS rsquo03) pp 26ndash33 Indianapolis Ind USA April2003

[13] T Bartz-Beielstein P Limbourg J Mehnen K Schmitt K EParsopoulos and M N Vrahatis ldquoParticle swarm optimizersfor Pareto optimizationwith enhanced archiving techniquesrdquo inProceedings of the Congress on Evolutionary Computation (CECrsquo03) vol 3 pp 1780ndash1787 December 2003

[14] K E Parsopoulos D K Tasoulis and M N Vrahatis ldquoMulti-objective optimization using parallel vector evaluated particleswarm optimizationrdquo in Proceedings of the IASTED Interna-tional Conference on Artificial Intelligence and Applications (AIArsquo04) vol 2 pp 823ndash828 February 2004

[15] S-J Tsai T-Y Sun C-C Liu S-T Hsieh W-C Wu and S-YChiu ldquoAn improved multi-objective particle swarm optimizerformulti-objective problemsrdquo Expert Systems with Applicationsvol 37 no 8 pp 5872ndash5886 2010

[16] A A Mousa M A El-Shorbagy and W F Abd-El-WahedldquoLocal search based hybrid particle swarm optimization algo-rithm for multiobjective optimizationrdquo Swarm and Evolution-ary Computation vol 3 pp 1ndash14 2012

[17] J Wang W Liu W Zhang and B Yang ldquoMulti-objective parti-cle swarm optimization based on self-update and grid strategyrdquoin Proceedings of the 2012 International Conference on Informa-tion Technology and Software Engineering vol 211 of LectureNotes in Electrical Engineering pp 869ndash876 Springer BerlinGermany 2013

[18] K Khalili-Damghani A-R Abtahi and M Tavana ldquoA newmulti-objective particle swarmoptimizationmethod for solvingreliability redundancy allocation problemsrdquo Reliability Engi-neering amp System Safety vol 111 pp 58ndash75 2013

[19] J Zhang Y Wang and J Feng ldquoParallel particle swarm opti-mization based on PAMrdquo in Proceedings of the 2nd InternationalConference on Information Engineering and Computer Science(ICIECS rsquo10) Wuhan China December 2010

[20] J Han and M Kamber Data Mining Concepts and TechniquesMorgan Kaufmann 2006

[21] L F Ibrahim ldquoUsing of clustering algorithm CWSP-PAM forrural network planningrdquo in Proceedings of the 3rd InternationalConference on Information Technology and Applications (ICITArsquo05) vol 1 pp 280ndash283 IEEE July 2005

[22] P Chopra J Kang and J Lee ldquoUsing gene pair combinationsto improve the accuracy of the PAM classifierrdquo in Proceedingsof the IEEE International Conference on Bioinformatics andBiomedicine (BIBM rsquo09) pp 174ndash177 November 2009

[23] Y-W Leung and Y Wang ldquoMultiobjective programming usinguniform design and genetic algorithmrdquo IEEE Transactions onSystems Man and Cybernetics Part C Applications and Reviewsvol 30 no 3 pp 293ndash304 2000

[24] D Liu K C Tan C K Goh and W K Ho ldquoA multiobjectivememetic algorithm based on particle swarm optimizationrdquoIEEE Transactions on Systems Man and Cybernetics Part BCybernetics vol 37 no 1 pp 42ndash50 2007

[25] K Deb A Pratap S Agarwal and T Meyarivan ldquoA fast andelitist multiobjective genetic algorithm NSGA-IIrdquo IEEE Trans-actions on Evolutionary Computation vol 6 no 2 pp 182ndash1972002

[26] C M Fonseca and P J Fleming ldquoMultiobjective optimizationandmultiple constraint handling with evolutionary algorithmsII Application examplerdquo IEEE Transactions on Systems Manand Cybernetics Part A Systems and Humans vol 28 no 1 pp38ndash47 1998

[27] F Kursawe ldquoA variant of evolution strategies for vector opti-mizationrdquo in Parallel Problem Solving from Nature vol 496 ofLecture Notes in Computer Science pp 193ndash197 Springer BerlinGermany 1991

[28] E Zitzler K Deb and LThiele ldquoComparison of multiobjectiveevolutionary algorithms empirical resultsrdquo Evolutionary Com-putation vol 8 no 2 pp 173ndash195 2000

[29] H-L Liu Y Wang and Y-M Cheung ldquoA Multi-objective evo-lutionary algorithm usingmin-max strategy and sphere coordi-nate transformationrdquo Intelligent Automation amp Soft Computingvol 15 no 3 pp 361ndash384 2009

[30] N Chakraborti B S Kumar V S Babu S Moitra andA Mukhopadhyay ldquoA new multi-objective genetic algorithmapplied to hot-rolling processrdquoAppliedMathematicalModellingvol 32 no 9 pp 1781ndash1789 2008

[31] K Deb ldquoMulti-objective genetic algorithms problem difficul-ties and construction of test problemsrdquo Evolutionary Computa-tion vol 7 no 3 pp 205ndash230 1999

[32] K Deb L Thiele M Laumanns and E Zitzler ldquoScalable testproblems for evolutionarymultiobjective optimizationrdquo in Evo-lutionary Multiobjective Optimization Advanced Informationand Knowledge Processing pp 105ndash145 Springer London UK2005

[33] Y Shi and R Eberhart ldquoModified particle swarm optimizerrdquoin Proceedings of the IEEE World Congress on ComputationalIntelligence pp 69ndash73 May 1998

[34] C S Feng S Cong and X Y Feng ldquoA new adaptive inertiaweight strategy in particle swarm optimizationrdquo in Procedingsof the IEEE Congress on Evolutionary Computation (CEC rsquo07)pp 4186ndash4190 IEEE Singapore September 2007

[35] C K Goh K C Tan D S Liu and S C Chiam ldquoA competitiveand cooperative co-evolutionary approach to multi-objectiveparticle swarm optimization algorithm designrdquo European Jour-nal of Operational Research vol 202 no 1 pp 42ndash54 2010

[36] P Cheng J-S Pan L Li Y Tang and C Huang ldquoA surveyof performance assessment for multiobjective optimizersrdquo inProceedings of the 4th International Conference on Genetic andEvolutionary Computing (ICGEC rsquo10) pp 341ndash345 December2010

[37] E Zitzler L Thiele M Laumanns C M Fonseca and V GDa Fonseca ldquoPerformance assessment of multiobjective opti-mizers an analysis and reviewrdquo IEEE Transactions on Evolution-ary Computation vol 7 no 2 pp 117ndash132 2003

[38] E Zitzler and L Thiele ldquoMultiobjective evolutionary algo-rithms a comparative case study and the strength Paretoapproachrdquo IEEETransactions onEvolutionaryComputation vol3 no 4 pp 257ndash271 1999

[39] H Li and Q Zhang ldquoMultiobjective optimization problemswith complicated pareto sets MOEAD and NSGA-IIrdquo IEEE

Mathematical Problems in Engineering 17

Transactions on Evolutionary Computation vol 13 no 2 pp284ndash302 2009

[40] Q Zhang A Zhou S Zhao P N Suganthan W Liu andS Tiwari ldquoMultiobjective optimization test instances for theCEC 2009 special session and competitionrdquo Working ReportDepartment of Computing and Electronic Systems Universityof Essex 2008

[41] Q Zhang A Zhou and Y Jin ldquoRM-MEDA a regularity model-based multiobjective estimation of distribution algorithmrdquoIEEE Transactions on Evolutionary Computation vol 12 no 1pp 41ndash63 2008

[42] C-S Tsou H-H Fang H-H Chang and C-H Kao ldquoAnimproved particle swarmPareto optimizerwith local search andclusteringrdquo in Simulated Evolution and Learning vol 4247 ofLecture Notes in Computer Science pp 400ndash407 2006

[43] U Wickramasinghe and X Li ldquoChoosing leaders for multiob-jective PSO algorithms using differential evolutionrdquo in Proceed-ing of the 7th International Conference Simulated Evolution andLearning vol 5361 of Lecture Notes in Computer Science pp249ndash258 Springer Berlin Germany 2008

[44] T Krink J S Vesterstrom and J Riget ldquoParticle swarm opti-misation with spatial particle extensionrdquo in Proceedings of theCongress on Evolutionary Computation (CEC rsquo02) pp 1474ndash1479 May 2002

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 11: Research Article Multiobjective Particle Swarm

Mathematical Problems in Engineering 11

0

02

04

06

08

1

12

14100 76

0

05

1

15

2

25

325

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

f1(x)

f2(x)

f2(x)

Figure 3 Pareto fronts obtained by different algorithms for test problem ZDT1

Table 4 Comparison of IGD metric

UKMOPSO UMOGA NSGA-IIFON 00072 00432 03707KUR 00668 11576 12323ZDT1 00067 00265 13933ZDT2 00067 00727 21409ZDT3 01964 02151 02816ZDT6 00033 00139 59223DTLZ1 74827 22965 54182DTLZ2 01123 04319 02724

Table 5 Comparison of maximum spread (MS) metric

UKMOPSO UMOGA NSGA-IIFON 09769 09754 02368KUR 09789 09580 03414ZDT1 09429 09984 05495ZDT2 09999 09986 00019ZDT3 09276 09275 07963ZDT6 1 09974 03738DTLZ1 1 1 06588DTLZ2 1 1 07125

12 Mathematical Problems in Engineering

0

02

04

06

08

1

12

14100 9

0

05

1

15

2

25

38

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x) f1(x)

f2(x)

f2(x)

f2(x)

f1(x)

Figure 4 Pareto fronts obtained by different algorithms for test problem ZDT2

As shown in Table 2 for the test functions Fon and KUR11986212= 0769 and 119862

12= 0929 mean that 768 and 929 of

the solutions obtained by UMOGA are dominated by thoseobtained by UKMOPSO Similarly 119862

21= 001means that 1

of the solutions obtained by UKMOPSO are dominated bythose obtained by UMOGA For DTLZ1 and DTLZ2119862

12= 1

and 11986221= 0 mean that all solutions obtained by UMOGA

are dominated by those obtained by UKMOPSO but noneof solutions from UKMOPSO is dominated by those fromUMOGA It means that the solution quality via UKMOPSOis much better than that via UMOGA for the above testfunctions For ZDT1 ZDT2 ZDT3 and ZDT6 the valuesof the 119862 metrics do not have too many differences betweenUKMOPSO and UMOGA It means the solution qualities viaboth are almost identical

FromTable 3 we can see that for ZDT1 ZDT2 ZDT3 andZDT6 119862

13= 1 and 119862

31= 0mean that all solutions obtained

byNSGA-II are dominated by those obtained byUKMOPSObut none of solutions from UKMOPSO is dominated bythose from NSGA-II It means that the solution quality viaUKMOPSO is much better than that via NSGA-II for theabove test functions For the rest of the test functions thesolution qualities via both are almost identical

In Table 4 for all test problems the IGD values ofUKMOPSO are the smallest among UKMOPSO UMOGAand NSGA-II This means the PF found by UKMOPSO isthe nearest to the true PF compared with the PF obtainedby the other two algorithms namely the performance ofUKMOPSO is the best in the three algorithms The IGDvalues of UMOGA are all larger than those of NSGA-II for

Mathematical Problems in Engineering 13

0

02

04

06

08

1 92 86

0

05

1

15

2

25

3

35

4

45 40

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

f2(x)

minus08

minus06

minus04

minus02

0

02

04

06

08

1

f2(x)

f2(x)

minus08

minus05

minus06

minus04

minus02

UKMOPSO UMOGA

NSGA-II

Figure 5 Pareto fronts obtained by different algorithms for test problem ZDT3

all test functions It means that the PF found by UMOGA iscloser to the true PF than the PF obtained by NSGA-II

From Table 5 we can see that all the MS values ofUKMOPSO are almost close to 1 for each test function Itmeans that almost all the true PF is totally covered by the PFobtained by UKMOPSO UMOGA is similar to UKMOPSOThe MS values of NSGA-II are much lesser than UKMOPSOand UMOGA especially MS = 00019 for ZDT2

Figures 1ndash6 all demonstrate that the PF found byUKMOPSO is the nearest to the true PF and scatters mostuniformly in those obtained by three algorithms Most of thepoints in the PF found by UKMOPSO overlap the pointsin the true PF It means the solution quality obtained byUKMOPSO is very high

45 Influence of the Uniform Crossover and PAM In orderto find out the influence of the uniform crossover on the

proposed algorithm we compare the distribution and num-ber of Pareto optimal solutions before and after performingthe uniform crossover One of the simulating results on thetest function ZDT1 is shown in Figure 7

From Figure 7 it can be seen that before and afterperforming the uniform crossover the number of data pointsin the 1st cluster and the 2nd cluster varies from 7 to 15and from 19 to 26 respectively namely many new Paretooptimal solutions having not been found before performingthe uniform crossover are generated and the differences ofthe data points between two clusters have been decreasedThis will directly influence the uniformity of the PF andacquire more uniform Pareto solutions

PAM is used to determine which Pareto solutions are tobe removed fromor be inserted into the external archiveThisis to maintain the diversity of Pareto optimal solutions The

14 Mathematical Problems in Engineering

100 16

0

05

1

15

2

25

3

35

4

45

540

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUKMOPSO

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

f2(x)

f2(x)

f1(x)

Figure 6 Pareto fronts obtained by different algorithms for test problem ZDT6

diversity can be computed by the ldquodistance-to-average-pointrdquomeasure [44] defined as

diversity (119878) = 1

|119878|

|119878|

sum119894=1

radic119873

sum119895=1

(119901119894119895minus 119901119895)2

(20)

where 119878 is the population |119878| is the swarm size 119873 is thedimensionality of the problem 119901

119894119895is the 1198951015840th value of the 1198941015840th

particle and 119901119895is the 1198951015840th value of the average point 119901

If the number of Pareto optimality is less than or equalto the size of the external archive PAM has no influenceon the proposed algorithm Otherwise it is used to selectthe different type of Pareto optimality from several clustersTherefore The diversity of Pareto optimality will certainlyincreaseWemonitor the diversity of Pareto optimality before

and after performing PAMonZDT1 at a certain timeThe val-ues are respectively 8762 and 11752 This fully demonstratesthat PAM can improve the diversity of Pareto optimality

5 Conclusion and Future Work

In this paper a multiobjective particle swarm optimizationbased on PAM and uniform design is presented It firstlyimplements PAM to partition the Pareto front into119870 clustersin the objective space and then it implements the uniformcrossover operator on the minimal cluster so as to generatemore Pareto optimality in the minimal cluster When thesize of the Pareto solution is larger than that of the externalarchive PAM is used to determine which Pareto solutions areto be removed from or be inserted into the external archive

Mathematical Problems in Engineering 15

0 01 02 03 04 05 06 07 08 09

19

10

01

02

03

04

05

06

077

08

09

1

The 1st clusterThe 2nd cluster

26

(a)

41

0 01 02 03 04 05 06

26

07 08 09 10

01

02

03

04

051506

07

08

09

1

The 1st clusterThe 2nd cluster

(b)

Figure 7 Pareto optimal solutions before (a) and after (b) performing the uniform crossover

Finally it keeps all the points in the lesser clusters anddiscards some points in the larger clusters This can ensurethat each of the clusters will contain approximately the samedata points Therefore the diversity of the Pareto solutionswill increase and they can scatter uniformly over the Paretofront The results of the experimental simulation performedon several well-known test problems indicate that theproposed algorithm obviously outperforms the other twoalgorithms

This algorithm is going on for further enhancement andimprovement One attempt is to use a more efficient orapproximate clustering algorithm to speed up the executiontime of this algorithm Another attempt is to extend its appli-cation scopes

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This research was supported by Guangxi Natural ScienceFoundation (no 2013GXNSFAA019337) Guangxi Universi-ties Key Project of Science and Technology Research (noKY2015ZD099) Key project of Guangxi Education Depart-ment (no 2013ZD055) Scientific Research Staring Founda-tion for the PHD Scholars of Yulin Normal University (noG2014005) Special Project of Yulin Normal University (no2012YJZX04) and Key Project of Yulin Normal University(no 2014YJZD05)The authors are grateful to the anonymousreferee for a careful checking of the details and for helpfulcomments that improved this paper

References

[1] J D Schaffer ldquoMultiple objective optimization with vector eval-uated genetic algorithmsrdquo in Proceedings of the 1st InternationalConference on Genetic Algorithms pp 93ndash100 1985

[2] L Tang and X Wang ldquoA hybrid multiobjective evolutionaryalgorithm for multiobjective optimization problemsrdquo IEEETransactions on Evolutionary Computation vol 17 no 1 pp 20ndash45 2013

[3] K Deb S Agrawal A Pratap and T Meyarivan ldquoA fast elitistnon-dominated sorting genetic algorithm for multi-objectiveoptimizationNSGA-IIrdquo inParallel ProblemSolving fromNaturePPSN VI vol 1917 of Lecture Notes in Computer Science pp849ndash858 Springer Berlin Germany 2000

[4] J Zhang Y Wang and J Feng ldquoAttribute index and uniformdesign based multiobjective association rule mining with evo-lutionary algorithmrdquo The Scientific World Journal vol 2013Article ID 259347 16 pages 2013

[5] S Jiang and Z Cai ldquoFaster convergence and higher hypervol-ume for multi-objective evolutionary algorithms by orthogonaland uniform designrdquo in Advances in Computation and Intelli-gence vol 6382 of Lecture Notes in Computer Science pp 312ndash328 Springer Berlin Germany 2010

[6] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995

[7] R Eberhart and J Kennedy ldquoA new optimizer using particleswarm theoryrdquo in Proceedings of the 6th International Sympo-sium on Micro Machine and Human Science pp 39ndash43 IEEENagoya Japan October 1995

[8] L Yang ldquoA fast and elitist multi-objective particle swarm algo-rithm NSPSOrdquo in Proceedings of the IEEE International Con-ference on Granular Computing (GRC rsquo08) pp 470ndash475 August2008

[9] C A C Coello G T Pulido and M S Lechuga ldquoHandlingmultiple objectives with particle swarm optimizationrdquo IEEE

16 Mathematical Problems in Engineering

Transactions on Evolutionary Computation vol 8 no 3 pp256ndash279 2004

[10] M Clerc and J Kennedy ldquoThe particle swarm-explosion sta-bility and convergence in a multidimensional complex spacerdquoIEEE Transactions on Evolutionary Computation vol 6 no 1pp 58ndash73 2002

[11] H Xiaohui and R Eberhart ldquoMultiobjective optimization usingdynamic neighborhood particle swarm optimizationrdquo in Pro-ceedings of the Congress on Evolutionary Computation (CEC rsquo02)vol 2 pp 1677ndash1681 May 2002

[12] S Mostaghim and J Teich ldquoStrategies for finding goodlocal guides in multi-objective particle swarm optimization(MOPSO)rdquo in Proceedings of the IEEE Swarm IntelligenceSymposium (SIS rsquo03) pp 26ndash33 Indianapolis Ind USA April2003

[13] T Bartz-Beielstein P Limbourg J Mehnen K Schmitt K EParsopoulos and M N Vrahatis ldquoParticle swarm optimizersfor Pareto optimizationwith enhanced archiving techniquesrdquo inProceedings of the Congress on Evolutionary Computation (CECrsquo03) vol 3 pp 1780ndash1787 December 2003

[14] K E Parsopoulos D K Tasoulis and M N Vrahatis ldquoMulti-objective optimization using parallel vector evaluated particleswarm optimizationrdquo in Proceedings of the IASTED Interna-tional Conference on Artificial Intelligence and Applications (AIArsquo04) vol 2 pp 823ndash828 February 2004

[15] S-J Tsai T-Y Sun C-C Liu S-T Hsieh W-C Wu and S-YChiu ldquoAn improved multi-objective particle swarm optimizerformulti-objective problemsrdquo Expert Systems with Applicationsvol 37 no 8 pp 5872ndash5886 2010

[16] A A Mousa M A El-Shorbagy and W F Abd-El-WahedldquoLocal search based hybrid particle swarm optimization algo-rithm for multiobjective optimizationrdquo Swarm and Evolution-ary Computation vol 3 pp 1ndash14 2012

[17] J Wang W Liu W Zhang and B Yang ldquoMulti-objective parti-cle swarm optimization based on self-update and grid strategyrdquoin Proceedings of the 2012 International Conference on Informa-tion Technology and Software Engineering vol 211 of LectureNotes in Electrical Engineering pp 869ndash876 Springer BerlinGermany 2013

[18] K Khalili-Damghani A-R Abtahi and M Tavana ldquoA newmulti-objective particle swarmoptimizationmethod for solvingreliability redundancy allocation problemsrdquo Reliability Engi-neering amp System Safety vol 111 pp 58ndash75 2013

[19] J Zhang Y Wang and J Feng ldquoParallel particle swarm opti-mization based on PAMrdquo in Proceedings of the 2nd InternationalConference on Information Engineering and Computer Science(ICIECS rsquo10) Wuhan China December 2010

[20] J Han and M Kamber Data Mining Concepts and TechniquesMorgan Kaufmann 2006

[21] L F Ibrahim ldquoUsing of clustering algorithm CWSP-PAM forrural network planningrdquo in Proceedings of the 3rd InternationalConference on Information Technology and Applications (ICITArsquo05) vol 1 pp 280ndash283 IEEE July 2005

[22] P Chopra J Kang and J Lee ldquoUsing gene pair combinationsto improve the accuracy of the PAM classifierrdquo in Proceedingsof the IEEE International Conference on Bioinformatics andBiomedicine (BIBM rsquo09) pp 174ndash177 November 2009

[23] Y-W Leung and Y Wang ldquoMultiobjective programming usinguniform design and genetic algorithmrdquo IEEE Transactions onSystems Man and Cybernetics Part C Applications and Reviewsvol 30 no 3 pp 293ndash304 2000

[24] D Liu K C Tan C K Goh and W K Ho ldquoA multiobjectivememetic algorithm based on particle swarm optimizationrdquoIEEE Transactions on Systems Man and Cybernetics Part BCybernetics vol 37 no 1 pp 42ndash50 2007

[25] K Deb A Pratap S Agarwal and T Meyarivan ldquoA fast andelitist multiobjective genetic algorithm NSGA-IIrdquo IEEE Trans-actions on Evolutionary Computation vol 6 no 2 pp 182ndash1972002

[26] C M Fonseca and P J Fleming ldquoMultiobjective optimizationandmultiple constraint handling with evolutionary algorithmsII Application examplerdquo IEEE Transactions on Systems Manand Cybernetics Part A Systems and Humans vol 28 no 1 pp38ndash47 1998

[27] F Kursawe ldquoA variant of evolution strategies for vector opti-mizationrdquo in Parallel Problem Solving from Nature vol 496 ofLecture Notes in Computer Science pp 193ndash197 Springer BerlinGermany 1991

[28] E Zitzler K Deb and LThiele ldquoComparison of multiobjectiveevolutionary algorithms empirical resultsrdquo Evolutionary Com-putation vol 8 no 2 pp 173ndash195 2000

[29] H-L Liu Y Wang and Y-M Cheung ldquoA Multi-objective evo-lutionary algorithm usingmin-max strategy and sphere coordi-nate transformationrdquo Intelligent Automation amp Soft Computingvol 15 no 3 pp 361ndash384 2009

[30] N Chakraborti B S Kumar V S Babu S Moitra andA Mukhopadhyay ldquoA new multi-objective genetic algorithmapplied to hot-rolling processrdquoAppliedMathematicalModellingvol 32 no 9 pp 1781ndash1789 2008

[31] K Deb ldquoMulti-objective genetic algorithms problem difficul-ties and construction of test problemsrdquo Evolutionary Computa-tion vol 7 no 3 pp 205ndash230 1999

[32] K Deb L Thiele M Laumanns and E Zitzler ldquoScalable testproblems for evolutionarymultiobjective optimizationrdquo in Evo-lutionary Multiobjective Optimization Advanced Informationand Knowledge Processing pp 105ndash145 Springer London UK2005

[33] Y Shi and R Eberhart ldquoModified particle swarm optimizerrdquoin Proceedings of the IEEE World Congress on ComputationalIntelligence pp 69ndash73 May 1998

[34] C S Feng S Cong and X Y Feng ldquoA new adaptive inertiaweight strategy in particle swarm optimizationrdquo in Procedingsof the IEEE Congress on Evolutionary Computation (CEC rsquo07)pp 4186ndash4190 IEEE Singapore September 2007

[35] C K Goh K C Tan D S Liu and S C Chiam ldquoA competitiveand cooperative co-evolutionary approach to multi-objectiveparticle swarm optimization algorithm designrdquo European Jour-nal of Operational Research vol 202 no 1 pp 42ndash54 2010

[36] P Cheng J-S Pan L Li Y Tang and C Huang ldquoA surveyof performance assessment for multiobjective optimizersrdquo inProceedings of the 4th International Conference on Genetic andEvolutionary Computing (ICGEC rsquo10) pp 341ndash345 December2010

[37] E Zitzler L Thiele M Laumanns C M Fonseca and V GDa Fonseca ldquoPerformance assessment of multiobjective opti-mizers an analysis and reviewrdquo IEEE Transactions on Evolution-ary Computation vol 7 no 2 pp 117ndash132 2003

[38] E Zitzler and L Thiele ldquoMultiobjective evolutionary algo-rithms a comparative case study and the strength Paretoapproachrdquo IEEETransactions onEvolutionaryComputation vol3 no 4 pp 257ndash271 1999

[39] H Li and Q Zhang ldquoMultiobjective optimization problemswith complicated pareto sets MOEAD and NSGA-IIrdquo IEEE

Mathematical Problems in Engineering 17

Transactions on Evolutionary Computation vol 13 no 2 pp284ndash302 2009

[40] Q Zhang A Zhou S Zhao P N Suganthan W Liu andS Tiwari ldquoMultiobjective optimization test instances for theCEC 2009 special session and competitionrdquo Working ReportDepartment of Computing and Electronic Systems Universityof Essex 2008

[41] Q Zhang A Zhou and Y Jin ldquoRM-MEDA a regularity model-based multiobjective estimation of distribution algorithmrdquoIEEE Transactions on Evolutionary Computation vol 12 no 1pp 41ndash63 2008

[42] C-S Tsou H-H Fang H-H Chang and C-H Kao ldquoAnimproved particle swarmPareto optimizerwith local search andclusteringrdquo in Simulated Evolution and Learning vol 4247 ofLecture Notes in Computer Science pp 400ndash407 2006

[43] U Wickramasinghe and X Li ldquoChoosing leaders for multiob-jective PSO algorithms using differential evolutionrdquo in Proceed-ing of the 7th International Conference Simulated Evolution andLearning vol 5361 of Lecture Notes in Computer Science pp249ndash258 Springer Berlin Germany 2008

[44] T Krink J S Vesterstrom and J Riget ldquoParticle swarm opti-misation with spatial particle extensionrdquo in Proceedings of theCongress on Evolutionary Computation (CEC rsquo02) pp 1474ndash1479 May 2002

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 12: Research Article Multiobjective Particle Swarm

12 Mathematical Problems in Engineering

0

02

04

06

08

1

12

14100 9

0

05

1

15

2

25

38

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontUKMOPSO

0 01 02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

0 01 02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x) f1(x)

f2(x)

f2(x)

f2(x)

f1(x)

Figure 4 Pareto fronts obtained by different algorithms for test problem ZDT2

As shown in Table 2 for the test functions Fon and KUR11986212= 0769 and 119862

12= 0929 mean that 768 and 929 of

the solutions obtained by UMOGA are dominated by thoseobtained by UKMOPSO Similarly 119862

21= 001means that 1

of the solutions obtained by UKMOPSO are dominated bythose obtained by UMOGA For DTLZ1 and DTLZ2119862

12= 1

and 11986221= 0 mean that all solutions obtained by UMOGA

are dominated by those obtained by UKMOPSO but noneof solutions from UKMOPSO is dominated by those fromUMOGA It means that the solution quality via UKMOPSOis much better than that via UMOGA for the above testfunctions For ZDT1 ZDT2 ZDT3 and ZDT6 the valuesof the 119862 metrics do not have too many differences betweenUKMOPSO and UMOGA It means the solution qualities viaboth are almost identical

FromTable 3 we can see that for ZDT1 ZDT2 ZDT3 andZDT6 119862

13= 1 and 119862

31= 0mean that all solutions obtained

byNSGA-II are dominated by those obtained byUKMOPSObut none of solutions from UKMOPSO is dominated bythose from NSGA-II It means that the solution quality viaUKMOPSO is much better than that via NSGA-II for theabove test functions For the rest of the test functions thesolution qualities via both are almost identical

In Table 4 for all test problems the IGD values ofUKMOPSO are the smallest among UKMOPSO UMOGAand NSGA-II This means the PF found by UKMOPSO isthe nearest to the true PF compared with the PF obtainedby the other two algorithms namely the performance ofUKMOPSO is the best in the three algorithms The IGDvalues of UMOGA are all larger than those of NSGA-II for

Mathematical Problems in Engineering 13

0

02

04

06

08

1 92 86

0

05

1

15

2

25

3

35

4

45 40

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

f2(x)

minus08

minus06

minus04

minus02

0

02

04

06

08

1

f2(x)

f2(x)

minus08

minus05

minus06

minus04

minus02

UKMOPSO UMOGA

NSGA-II

Figure 5 Pareto fronts obtained by different algorithms for test problem ZDT3

all test functions It means that the PF found by UMOGA iscloser to the true PF than the PF obtained by NSGA-II

From Table 5 we can see that all the MS values ofUKMOPSO are almost close to 1 for each test function Itmeans that almost all the true PF is totally covered by the PFobtained by UKMOPSO UMOGA is similar to UKMOPSOThe MS values of NSGA-II are much lesser than UKMOPSOand UMOGA especially MS = 00019 for ZDT2

Figures 1ndash6 all demonstrate that the PF found byUKMOPSO is the nearest to the true PF and scatters mostuniformly in those obtained by three algorithms Most of thepoints in the PF found by UKMOPSO overlap the pointsin the true PF It means the solution quality obtained byUKMOPSO is very high

45 Influence of the Uniform Crossover and PAM In orderto find out the influence of the uniform crossover on the

proposed algorithm we compare the distribution and num-ber of Pareto optimal solutions before and after performingthe uniform crossover One of the simulating results on thetest function ZDT1 is shown in Figure 7

From Figure 7 it can be seen that before and afterperforming the uniform crossover the number of data pointsin the 1st cluster and the 2nd cluster varies from 7 to 15and from 19 to 26 respectively namely many new Paretooptimal solutions having not been found before performingthe uniform crossover are generated and the differences ofthe data points between two clusters have been decreasedThis will directly influence the uniformity of the PF andacquire more uniform Pareto solutions

PAM is used to determine which Pareto solutions are tobe removed fromor be inserted into the external archiveThisis to maintain the diversity of Pareto optimal solutions The

14 Mathematical Problems in Engineering

100 16

0

05

1

15

2

25

3

35

4

45

540

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUKMOPSO

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

f2(x)

f2(x)

f1(x)

Figure 6 Pareto fronts obtained by different algorithms for test problem ZDT6

diversity can be computed by the ldquodistance-to-average-pointrdquomeasure [44] defined as

diversity (119878) = 1

|119878|

|119878|

sum119894=1

radic119873

sum119895=1

(119901119894119895minus 119901119895)2

(20)

where 119878 is the population |119878| is the swarm size 119873 is thedimensionality of the problem 119901

119894119895is the 1198951015840th value of the 1198941015840th

particle and 119901119895is the 1198951015840th value of the average point 119901

If the number of Pareto optimality is less than or equalto the size of the external archive PAM has no influenceon the proposed algorithm Otherwise it is used to selectthe different type of Pareto optimality from several clustersTherefore The diversity of Pareto optimality will certainlyincreaseWemonitor the diversity of Pareto optimality before

and after performing PAMonZDT1 at a certain timeThe val-ues are respectively 8762 and 11752 This fully demonstratesthat PAM can improve the diversity of Pareto optimality

5 Conclusion and Future Work

In this paper a multiobjective particle swarm optimizationbased on PAM and uniform design is presented It firstlyimplements PAM to partition the Pareto front into119870 clustersin the objective space and then it implements the uniformcrossover operator on the minimal cluster so as to generatemore Pareto optimality in the minimal cluster When thesize of the Pareto solution is larger than that of the externalarchive PAM is used to determine which Pareto solutions areto be removed from or be inserted into the external archive

Mathematical Problems in Engineering 15

0 01 02 03 04 05 06 07 08 09

19

10

01

02

03

04

05

06

077

08

09

1

The 1st clusterThe 2nd cluster

26

(a)

41

0 01 02 03 04 05 06

26

07 08 09 10

01

02

03

04

051506

07

08

09

1

The 1st clusterThe 2nd cluster

(b)

Figure 7 Pareto optimal solutions before (a) and after (b) performing the uniform crossover

Finally it keeps all the points in the lesser clusters anddiscards some points in the larger clusters This can ensurethat each of the clusters will contain approximately the samedata points Therefore the diversity of the Pareto solutionswill increase and they can scatter uniformly over the Paretofront The results of the experimental simulation performedon several well-known test problems indicate that theproposed algorithm obviously outperforms the other twoalgorithms

This algorithm is going on for further enhancement andimprovement One attempt is to use a more efficient orapproximate clustering algorithm to speed up the executiontime of this algorithm Another attempt is to extend its appli-cation scopes

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This research was supported by Guangxi Natural ScienceFoundation (no 2013GXNSFAA019337) Guangxi Universi-ties Key Project of Science and Technology Research (noKY2015ZD099) Key project of Guangxi Education Depart-ment (no 2013ZD055) Scientific Research Staring Founda-tion for the PHD Scholars of Yulin Normal University (noG2014005) Special Project of Yulin Normal University (no2012YJZX04) and Key Project of Yulin Normal University(no 2014YJZD05)The authors are grateful to the anonymousreferee for a careful checking of the details and for helpfulcomments that improved this paper

References

[1] J D Schaffer ldquoMultiple objective optimization with vector eval-uated genetic algorithmsrdquo in Proceedings of the 1st InternationalConference on Genetic Algorithms pp 93ndash100 1985

[2] L Tang and X Wang ldquoA hybrid multiobjective evolutionaryalgorithm for multiobjective optimization problemsrdquo IEEETransactions on Evolutionary Computation vol 17 no 1 pp 20ndash45 2013

[3] K Deb S Agrawal A Pratap and T Meyarivan ldquoA fast elitistnon-dominated sorting genetic algorithm for multi-objectiveoptimizationNSGA-IIrdquo inParallel ProblemSolving fromNaturePPSN VI vol 1917 of Lecture Notes in Computer Science pp849ndash858 Springer Berlin Germany 2000

[4] J Zhang Y Wang and J Feng ldquoAttribute index and uniformdesign based multiobjective association rule mining with evo-lutionary algorithmrdquo The Scientific World Journal vol 2013Article ID 259347 16 pages 2013

[5] S Jiang and Z Cai ldquoFaster convergence and higher hypervol-ume for multi-objective evolutionary algorithms by orthogonaland uniform designrdquo in Advances in Computation and Intelli-gence vol 6382 of Lecture Notes in Computer Science pp 312ndash328 Springer Berlin Germany 2010

[6] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995

[7] R Eberhart and J Kennedy ldquoA new optimizer using particleswarm theoryrdquo in Proceedings of the 6th International Sympo-sium on Micro Machine and Human Science pp 39ndash43 IEEENagoya Japan October 1995

[8] L Yang ldquoA fast and elitist multi-objective particle swarm algo-rithm NSPSOrdquo in Proceedings of the IEEE International Con-ference on Granular Computing (GRC rsquo08) pp 470ndash475 August2008

[9] C A C Coello G T Pulido and M S Lechuga ldquoHandlingmultiple objectives with particle swarm optimizationrdquo IEEE

16 Mathematical Problems in Engineering

Transactions on Evolutionary Computation vol 8 no 3 pp256ndash279 2004

[10] M Clerc and J Kennedy ldquoThe particle swarm-explosion sta-bility and convergence in a multidimensional complex spacerdquoIEEE Transactions on Evolutionary Computation vol 6 no 1pp 58ndash73 2002

[11] H Xiaohui and R Eberhart ldquoMultiobjective optimization usingdynamic neighborhood particle swarm optimizationrdquo in Pro-ceedings of the Congress on Evolutionary Computation (CEC rsquo02)vol 2 pp 1677ndash1681 May 2002

[12] S Mostaghim and J Teich ldquoStrategies for finding goodlocal guides in multi-objective particle swarm optimization(MOPSO)rdquo in Proceedings of the IEEE Swarm IntelligenceSymposium (SIS rsquo03) pp 26ndash33 Indianapolis Ind USA April2003

[13] T Bartz-Beielstein P Limbourg J Mehnen K Schmitt K EParsopoulos and M N Vrahatis ldquoParticle swarm optimizersfor Pareto optimizationwith enhanced archiving techniquesrdquo inProceedings of the Congress on Evolutionary Computation (CECrsquo03) vol 3 pp 1780ndash1787 December 2003

[14] K E Parsopoulos D K Tasoulis and M N Vrahatis ldquoMulti-objective optimization using parallel vector evaluated particleswarm optimizationrdquo in Proceedings of the IASTED Interna-tional Conference on Artificial Intelligence and Applications (AIArsquo04) vol 2 pp 823ndash828 February 2004

[15] S-J Tsai T-Y Sun C-C Liu S-T Hsieh W-C Wu and S-YChiu ldquoAn improved multi-objective particle swarm optimizerformulti-objective problemsrdquo Expert Systems with Applicationsvol 37 no 8 pp 5872ndash5886 2010

[16] A A Mousa M A El-Shorbagy and W F Abd-El-WahedldquoLocal search based hybrid particle swarm optimization algo-rithm for multiobjective optimizationrdquo Swarm and Evolution-ary Computation vol 3 pp 1ndash14 2012

[17] J Wang W Liu W Zhang and B Yang ldquoMulti-objective parti-cle swarm optimization based on self-update and grid strategyrdquoin Proceedings of the 2012 International Conference on Informa-tion Technology and Software Engineering vol 211 of LectureNotes in Electrical Engineering pp 869ndash876 Springer BerlinGermany 2013

[18] K Khalili-Damghani A-R Abtahi and M Tavana ldquoA newmulti-objective particle swarmoptimizationmethod for solvingreliability redundancy allocation problemsrdquo Reliability Engi-neering amp System Safety vol 111 pp 58ndash75 2013

[19] J Zhang Y Wang and J Feng ldquoParallel particle swarm opti-mization based on PAMrdquo in Proceedings of the 2nd InternationalConference on Information Engineering and Computer Science(ICIECS rsquo10) Wuhan China December 2010

[20] J Han and M Kamber Data Mining Concepts and TechniquesMorgan Kaufmann 2006

[21] L F Ibrahim ldquoUsing of clustering algorithm CWSP-PAM forrural network planningrdquo in Proceedings of the 3rd InternationalConference on Information Technology and Applications (ICITArsquo05) vol 1 pp 280ndash283 IEEE July 2005

[22] P Chopra J Kang and J Lee ldquoUsing gene pair combinationsto improve the accuracy of the PAM classifierrdquo in Proceedingsof the IEEE International Conference on Bioinformatics andBiomedicine (BIBM rsquo09) pp 174ndash177 November 2009

[23] Y-W Leung and Y Wang ldquoMultiobjective programming usinguniform design and genetic algorithmrdquo IEEE Transactions onSystems Man and Cybernetics Part C Applications and Reviewsvol 30 no 3 pp 293ndash304 2000

[24] D Liu K C Tan C K Goh and W K Ho ldquoA multiobjectivememetic algorithm based on particle swarm optimizationrdquoIEEE Transactions on Systems Man and Cybernetics Part BCybernetics vol 37 no 1 pp 42ndash50 2007

[25] K Deb A Pratap S Agarwal and T Meyarivan ldquoA fast andelitist multiobjective genetic algorithm NSGA-IIrdquo IEEE Trans-actions on Evolutionary Computation vol 6 no 2 pp 182ndash1972002

[26] C M Fonseca and P J Fleming ldquoMultiobjective optimizationandmultiple constraint handling with evolutionary algorithmsII Application examplerdquo IEEE Transactions on Systems Manand Cybernetics Part A Systems and Humans vol 28 no 1 pp38ndash47 1998

[27] F Kursawe ldquoA variant of evolution strategies for vector opti-mizationrdquo in Parallel Problem Solving from Nature vol 496 ofLecture Notes in Computer Science pp 193ndash197 Springer BerlinGermany 1991

[28] E Zitzler K Deb and LThiele ldquoComparison of multiobjectiveevolutionary algorithms empirical resultsrdquo Evolutionary Com-putation vol 8 no 2 pp 173ndash195 2000

[29] H-L Liu Y Wang and Y-M Cheung ldquoA Multi-objective evo-lutionary algorithm usingmin-max strategy and sphere coordi-nate transformationrdquo Intelligent Automation amp Soft Computingvol 15 no 3 pp 361ndash384 2009

[30] N Chakraborti B S Kumar V S Babu S Moitra andA Mukhopadhyay ldquoA new multi-objective genetic algorithmapplied to hot-rolling processrdquoAppliedMathematicalModellingvol 32 no 9 pp 1781ndash1789 2008

[31] K Deb ldquoMulti-objective genetic algorithms problem difficul-ties and construction of test problemsrdquo Evolutionary Computa-tion vol 7 no 3 pp 205ndash230 1999

[32] K Deb L Thiele M Laumanns and E Zitzler ldquoScalable testproblems for evolutionarymultiobjective optimizationrdquo in Evo-lutionary Multiobjective Optimization Advanced Informationand Knowledge Processing pp 105ndash145 Springer London UK2005

[33] Y Shi and R Eberhart ldquoModified particle swarm optimizerrdquoin Proceedings of the IEEE World Congress on ComputationalIntelligence pp 69ndash73 May 1998

[34] C S Feng S Cong and X Y Feng ldquoA new adaptive inertiaweight strategy in particle swarm optimizationrdquo in Procedingsof the IEEE Congress on Evolutionary Computation (CEC rsquo07)pp 4186ndash4190 IEEE Singapore September 2007

[35] C K Goh K C Tan D S Liu and S C Chiam ldquoA competitiveand cooperative co-evolutionary approach to multi-objectiveparticle swarm optimization algorithm designrdquo European Jour-nal of Operational Research vol 202 no 1 pp 42ndash54 2010

[36] P Cheng J-S Pan L Li Y Tang and C Huang ldquoA surveyof performance assessment for multiobjective optimizersrdquo inProceedings of the 4th International Conference on Genetic andEvolutionary Computing (ICGEC rsquo10) pp 341ndash345 December2010

[37] E Zitzler L Thiele M Laumanns C M Fonseca and V GDa Fonseca ldquoPerformance assessment of multiobjective opti-mizers an analysis and reviewrdquo IEEE Transactions on Evolution-ary Computation vol 7 no 2 pp 117ndash132 2003

[38] E Zitzler and L Thiele ldquoMultiobjective evolutionary algo-rithms a comparative case study and the strength Paretoapproachrdquo IEEETransactions onEvolutionaryComputation vol3 no 4 pp 257ndash271 1999

[39] H Li and Q Zhang ldquoMultiobjective optimization problemswith complicated pareto sets MOEAD and NSGA-IIrdquo IEEE

Mathematical Problems in Engineering 17

Transactions on Evolutionary Computation vol 13 no 2 pp284ndash302 2009

[40] Q Zhang A Zhou S Zhao P N Suganthan W Liu andS Tiwari ldquoMultiobjective optimization test instances for theCEC 2009 special session and competitionrdquo Working ReportDepartment of Computing and Electronic Systems Universityof Essex 2008

[41] Q Zhang A Zhou and Y Jin ldquoRM-MEDA a regularity model-based multiobjective estimation of distribution algorithmrdquoIEEE Transactions on Evolutionary Computation vol 12 no 1pp 41ndash63 2008

[42] C-S Tsou H-H Fang H-H Chang and C-H Kao ldquoAnimproved particle swarmPareto optimizerwith local search andclusteringrdquo in Simulated Evolution and Learning vol 4247 ofLecture Notes in Computer Science pp 400ndash407 2006

[43] U Wickramasinghe and X Li ldquoChoosing leaders for multiob-jective PSO algorithms using differential evolutionrdquo in Proceed-ing of the 7th International Conference Simulated Evolution andLearning vol 5361 of Lecture Notes in Computer Science pp249ndash258 Springer Berlin Germany 2008

[44] T Krink J S Vesterstrom and J Riget ldquoParticle swarm opti-misation with spatial particle extensionrdquo in Proceedings of theCongress on Evolutionary Computation (CEC rsquo02) pp 1474ndash1479 May 2002

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 13: Research Article Multiobjective Particle Swarm

Mathematical Problems in Engineering 13

0

02

04

06

08

1 92 86

0

05

1

15

2

25

3

35

4

45 40

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

0 01 02 03 04 05 06 07 08 09f1(x)

f2(x)

minus08

minus06

minus04

minus02

0

02

04

06

08

1

f2(x)

f2(x)

minus08

minus05

minus06

minus04

minus02

UKMOPSO UMOGA

NSGA-II

Figure 5 Pareto fronts obtained by different algorithms for test problem ZDT3

all test functions It means that the PF found by UMOGA iscloser to the true PF than the PF obtained by NSGA-II

From Table 5 we can see that all the MS values ofUKMOPSO are almost close to 1 for each test function Itmeans that almost all the true PF is totally covered by the PFobtained by UKMOPSO UMOGA is similar to UKMOPSOThe MS values of NSGA-II are much lesser than UKMOPSOand UMOGA especially MS = 00019 for ZDT2

Figures 1ndash6 all demonstrate that the PF found byUKMOPSO is the nearest to the true PF and scatters mostuniformly in those obtained by three algorithms Most of thepoints in the PF found by UKMOPSO overlap the pointsin the true PF It means the solution quality obtained byUKMOPSO is very high

45 Influence of the Uniform Crossover and PAM In orderto find out the influence of the uniform crossover on the

proposed algorithm we compare the distribution and num-ber of Pareto optimal solutions before and after performingthe uniform crossover One of the simulating results on thetest function ZDT1 is shown in Figure 7

From Figure 7 it can be seen that before and afterperforming the uniform crossover the number of data pointsin the 1st cluster and the 2nd cluster varies from 7 to 15and from 19 to 26 respectively namely many new Paretooptimal solutions having not been found before performingthe uniform crossover are generated and the differences ofthe data points between two clusters have been decreasedThis will directly influence the uniformity of the PF andacquire more uniform Pareto solutions

PAM is used to determine which Pareto solutions are tobe removed fromor be inserted into the external archiveThisis to maintain the diversity of Pareto optimal solutions The

14 Mathematical Problems in Engineering

100 16

0

05

1

15

2

25

3

35

4

45

540

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUKMOPSO

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

f2(x)

f2(x)

f1(x)

Figure 6 Pareto fronts obtained by different algorithms for test problem ZDT6

diversity can be computed by the ldquodistance-to-average-pointrdquomeasure [44] defined as

diversity (119878) = 1

|119878|

|119878|

sum119894=1

radic119873

sum119895=1

(119901119894119895minus 119901119895)2

(20)

where 119878 is the population |119878| is the swarm size 119873 is thedimensionality of the problem 119901

119894119895is the 1198951015840th value of the 1198941015840th

particle and 119901119895is the 1198951015840th value of the average point 119901

If the number of Pareto optimality is less than or equalto the size of the external archive PAM has no influenceon the proposed algorithm Otherwise it is used to selectthe different type of Pareto optimality from several clustersTherefore The diversity of Pareto optimality will certainlyincreaseWemonitor the diversity of Pareto optimality before

and after performing PAMonZDT1 at a certain timeThe val-ues are respectively 8762 and 11752 This fully demonstratesthat PAM can improve the diversity of Pareto optimality

5 Conclusion and Future Work

In this paper a multiobjective particle swarm optimizationbased on PAM and uniform design is presented It firstlyimplements PAM to partition the Pareto front into119870 clustersin the objective space and then it implements the uniformcrossover operator on the minimal cluster so as to generatemore Pareto optimality in the minimal cluster When thesize of the Pareto solution is larger than that of the externalarchive PAM is used to determine which Pareto solutions areto be removed from or be inserted into the external archive

Mathematical Problems in Engineering 15

0 01 02 03 04 05 06 07 08 09

19

10

01

02

03

04

05

06

077

08

09

1

The 1st clusterThe 2nd cluster

26

(a)

41

0 01 02 03 04 05 06

26

07 08 09 10

01

02

03

04

051506

07

08

09

1

The 1st clusterThe 2nd cluster

(b)

Figure 7 Pareto optimal solutions before (a) and after (b) performing the uniform crossover

Finally it keeps all the points in the lesser clusters anddiscards some points in the larger clusters This can ensurethat each of the clusters will contain approximately the samedata points Therefore the diversity of the Pareto solutionswill increase and they can scatter uniformly over the Paretofront The results of the experimental simulation performedon several well-known test problems indicate that theproposed algorithm obviously outperforms the other twoalgorithms

This algorithm is going on for further enhancement andimprovement One attempt is to use a more efficient orapproximate clustering algorithm to speed up the executiontime of this algorithm Another attempt is to extend its appli-cation scopes

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This research was supported by Guangxi Natural ScienceFoundation (no 2013GXNSFAA019337) Guangxi Universi-ties Key Project of Science and Technology Research (noKY2015ZD099) Key project of Guangxi Education Depart-ment (no 2013ZD055) Scientific Research Staring Founda-tion for the PHD Scholars of Yulin Normal University (noG2014005) Special Project of Yulin Normal University (no2012YJZX04) and Key Project of Yulin Normal University(no 2014YJZD05)The authors are grateful to the anonymousreferee for a careful checking of the details and for helpfulcomments that improved this paper

References

[1] J D Schaffer ldquoMultiple objective optimization with vector eval-uated genetic algorithmsrdquo in Proceedings of the 1st InternationalConference on Genetic Algorithms pp 93ndash100 1985

[2] L Tang and X Wang ldquoA hybrid multiobjective evolutionaryalgorithm for multiobjective optimization problemsrdquo IEEETransactions on Evolutionary Computation vol 17 no 1 pp 20ndash45 2013

[3] K Deb S Agrawal A Pratap and T Meyarivan ldquoA fast elitistnon-dominated sorting genetic algorithm for multi-objectiveoptimizationNSGA-IIrdquo inParallel ProblemSolving fromNaturePPSN VI vol 1917 of Lecture Notes in Computer Science pp849ndash858 Springer Berlin Germany 2000

[4] J Zhang Y Wang and J Feng ldquoAttribute index and uniformdesign based multiobjective association rule mining with evo-lutionary algorithmrdquo The Scientific World Journal vol 2013Article ID 259347 16 pages 2013

[5] S Jiang and Z Cai ldquoFaster convergence and higher hypervol-ume for multi-objective evolutionary algorithms by orthogonaland uniform designrdquo in Advances in Computation and Intelli-gence vol 6382 of Lecture Notes in Computer Science pp 312ndash328 Springer Berlin Germany 2010

[6] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995

[7] R Eberhart and J Kennedy ldquoA new optimizer using particleswarm theoryrdquo in Proceedings of the 6th International Sympo-sium on Micro Machine and Human Science pp 39ndash43 IEEENagoya Japan October 1995

[8] L Yang ldquoA fast and elitist multi-objective particle swarm algo-rithm NSPSOrdquo in Proceedings of the IEEE International Con-ference on Granular Computing (GRC rsquo08) pp 470ndash475 August2008

[9] C A C Coello G T Pulido and M S Lechuga ldquoHandlingmultiple objectives with particle swarm optimizationrdquo IEEE

16 Mathematical Problems in Engineering

Transactions on Evolutionary Computation vol 8 no 3 pp256ndash279 2004

[10] M Clerc and J Kennedy ldquoThe particle swarm-explosion sta-bility and convergence in a multidimensional complex spacerdquoIEEE Transactions on Evolutionary Computation vol 6 no 1pp 58ndash73 2002

[11] H Xiaohui and R Eberhart ldquoMultiobjective optimization usingdynamic neighborhood particle swarm optimizationrdquo in Pro-ceedings of the Congress on Evolutionary Computation (CEC rsquo02)vol 2 pp 1677ndash1681 May 2002

[12] S Mostaghim and J Teich ldquoStrategies for finding goodlocal guides in multi-objective particle swarm optimization(MOPSO)rdquo in Proceedings of the IEEE Swarm IntelligenceSymposium (SIS rsquo03) pp 26ndash33 Indianapolis Ind USA April2003

[13] T Bartz-Beielstein P Limbourg J Mehnen K Schmitt K EParsopoulos and M N Vrahatis ldquoParticle swarm optimizersfor Pareto optimizationwith enhanced archiving techniquesrdquo inProceedings of the Congress on Evolutionary Computation (CECrsquo03) vol 3 pp 1780ndash1787 December 2003

[14] K E Parsopoulos D K Tasoulis and M N Vrahatis ldquoMulti-objective optimization using parallel vector evaluated particleswarm optimizationrdquo in Proceedings of the IASTED Interna-tional Conference on Artificial Intelligence and Applications (AIArsquo04) vol 2 pp 823ndash828 February 2004

[15] S-J Tsai T-Y Sun C-C Liu S-T Hsieh W-C Wu and S-YChiu ldquoAn improved multi-objective particle swarm optimizerformulti-objective problemsrdquo Expert Systems with Applicationsvol 37 no 8 pp 5872ndash5886 2010

[16] A A Mousa M A El-Shorbagy and W F Abd-El-WahedldquoLocal search based hybrid particle swarm optimization algo-rithm for multiobjective optimizationrdquo Swarm and Evolution-ary Computation vol 3 pp 1ndash14 2012

[17] J Wang W Liu W Zhang and B Yang ldquoMulti-objective parti-cle swarm optimization based on self-update and grid strategyrdquoin Proceedings of the 2012 International Conference on Informa-tion Technology and Software Engineering vol 211 of LectureNotes in Electrical Engineering pp 869ndash876 Springer BerlinGermany 2013

[18] K Khalili-Damghani A-R Abtahi and M Tavana ldquoA newmulti-objective particle swarmoptimizationmethod for solvingreliability redundancy allocation problemsrdquo Reliability Engi-neering amp System Safety vol 111 pp 58ndash75 2013

[19] J Zhang Y Wang and J Feng ldquoParallel particle swarm opti-mization based on PAMrdquo in Proceedings of the 2nd InternationalConference on Information Engineering and Computer Science(ICIECS rsquo10) Wuhan China December 2010

[20] J Han and M Kamber Data Mining Concepts and TechniquesMorgan Kaufmann 2006

[21] L F Ibrahim ldquoUsing of clustering algorithm CWSP-PAM forrural network planningrdquo in Proceedings of the 3rd InternationalConference on Information Technology and Applications (ICITArsquo05) vol 1 pp 280ndash283 IEEE July 2005

[22] P Chopra J Kang and J Lee ldquoUsing gene pair combinationsto improve the accuracy of the PAM classifierrdquo in Proceedingsof the IEEE International Conference on Bioinformatics andBiomedicine (BIBM rsquo09) pp 174ndash177 November 2009

[23] Y-W Leung and Y Wang ldquoMultiobjective programming usinguniform design and genetic algorithmrdquo IEEE Transactions onSystems Man and Cybernetics Part C Applications and Reviewsvol 30 no 3 pp 293ndash304 2000

[24] D Liu K C Tan C K Goh and W K Ho ldquoA multiobjectivememetic algorithm based on particle swarm optimizationrdquoIEEE Transactions on Systems Man and Cybernetics Part BCybernetics vol 37 no 1 pp 42ndash50 2007

[25] K Deb A Pratap S Agarwal and T Meyarivan ldquoA fast andelitist multiobjective genetic algorithm NSGA-IIrdquo IEEE Trans-actions on Evolutionary Computation vol 6 no 2 pp 182ndash1972002

[26] C M Fonseca and P J Fleming ldquoMultiobjective optimizationandmultiple constraint handling with evolutionary algorithmsII Application examplerdquo IEEE Transactions on Systems Manand Cybernetics Part A Systems and Humans vol 28 no 1 pp38ndash47 1998

[27] F Kursawe ldquoA variant of evolution strategies for vector opti-mizationrdquo in Parallel Problem Solving from Nature vol 496 ofLecture Notes in Computer Science pp 193ndash197 Springer BerlinGermany 1991

[28] E Zitzler K Deb and LThiele ldquoComparison of multiobjectiveevolutionary algorithms empirical resultsrdquo Evolutionary Com-putation vol 8 no 2 pp 173ndash195 2000

[29] H-L Liu Y Wang and Y-M Cheung ldquoA Multi-objective evo-lutionary algorithm usingmin-max strategy and sphere coordi-nate transformationrdquo Intelligent Automation amp Soft Computingvol 15 no 3 pp 361ndash384 2009

[30] N Chakraborti B S Kumar V S Babu S Moitra andA Mukhopadhyay ldquoA new multi-objective genetic algorithmapplied to hot-rolling processrdquoAppliedMathematicalModellingvol 32 no 9 pp 1781ndash1789 2008

[31] K Deb ldquoMulti-objective genetic algorithms problem difficul-ties and construction of test problemsrdquo Evolutionary Computa-tion vol 7 no 3 pp 205ndash230 1999

[32] K Deb L Thiele M Laumanns and E Zitzler ldquoScalable testproblems for evolutionarymultiobjective optimizationrdquo in Evo-lutionary Multiobjective Optimization Advanced Informationand Knowledge Processing pp 105ndash145 Springer London UK2005

[33] Y Shi and R Eberhart ldquoModified particle swarm optimizerrdquoin Proceedings of the IEEE World Congress on ComputationalIntelligence pp 69ndash73 May 1998

[34] C S Feng S Cong and X Y Feng ldquoA new adaptive inertiaweight strategy in particle swarm optimizationrdquo in Procedingsof the IEEE Congress on Evolutionary Computation (CEC rsquo07)pp 4186ndash4190 IEEE Singapore September 2007

[35] C K Goh K C Tan D S Liu and S C Chiam ldquoA competitiveand cooperative co-evolutionary approach to multi-objectiveparticle swarm optimization algorithm designrdquo European Jour-nal of Operational Research vol 202 no 1 pp 42ndash54 2010

[36] P Cheng J-S Pan L Li Y Tang and C Huang ldquoA surveyof performance assessment for multiobjective optimizersrdquo inProceedings of the 4th International Conference on Genetic andEvolutionary Computing (ICGEC rsquo10) pp 341ndash345 December2010

[37] E Zitzler L Thiele M Laumanns C M Fonseca and V GDa Fonseca ldquoPerformance assessment of multiobjective opti-mizers an analysis and reviewrdquo IEEE Transactions on Evolution-ary Computation vol 7 no 2 pp 117ndash132 2003

[38] E Zitzler and L Thiele ldquoMultiobjective evolutionary algo-rithms a comparative case study and the strength Paretoapproachrdquo IEEETransactions onEvolutionaryComputation vol3 no 4 pp 257ndash271 1999

[39] H Li and Q Zhang ldquoMultiobjective optimization problemswith complicated pareto sets MOEAD and NSGA-IIrdquo IEEE

Mathematical Problems in Engineering 17

Transactions on Evolutionary Computation vol 13 no 2 pp284ndash302 2009

[40] Q Zhang A Zhou S Zhao P N Suganthan W Liu andS Tiwari ldquoMultiobjective optimization test instances for theCEC 2009 special session and competitionrdquo Working ReportDepartment of Computing and Electronic Systems Universityof Essex 2008

[41] Q Zhang A Zhou and Y Jin ldquoRM-MEDA a regularity model-based multiobjective estimation of distribution algorithmrdquoIEEE Transactions on Evolutionary Computation vol 12 no 1pp 41ndash63 2008

[42] C-S Tsou H-H Fang H-H Chang and C-H Kao ldquoAnimproved particle swarmPareto optimizerwith local search andclusteringrdquo in Simulated Evolution and Learning vol 4247 ofLecture Notes in Computer Science pp 400ndash407 2006

[43] U Wickramasinghe and X Li ldquoChoosing leaders for multiob-jective PSO algorithms using differential evolutionrdquo in Proceed-ing of the 7th International Conference Simulated Evolution andLearning vol 5361 of Lecture Notes in Computer Science pp249ndash258 Springer Berlin Germany 2008

[44] T Krink J S Vesterstrom and J Riget ldquoParticle swarm opti-misation with spatial particle extensionrdquo in Proceedings of theCongress on Evolutionary Computation (CEC rsquo02) pp 1474ndash1479 May 2002

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 14: Research Article Multiobjective Particle Swarm

14 Mathematical Problems in Engineering

100 16

0

05

1

15

2

25

3

35

4

45

540

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUKMOPSO

02 03 04 05 06 07 08 09 10

01

02

03

04

05

06

07

08

09

1

True Pareto frontUMOGA

02 03 04 05 06 07 08 09 1

True Pareto frontNSGA-II

f1(x)

f2(x)

f1(x)

f2(x)

f2(x)

f1(x)

Figure 6 Pareto fronts obtained by different algorithms for test problem ZDT6

diversity can be computed by the ldquodistance-to-average-pointrdquomeasure [44] defined as

diversity (119878) = 1

|119878|

|119878|

sum119894=1

radic119873

sum119895=1

(119901119894119895minus 119901119895)2

(20)

where 119878 is the population |119878| is the swarm size 119873 is thedimensionality of the problem 119901

119894119895is the 1198951015840th value of the 1198941015840th

particle and 119901119895is the 1198951015840th value of the average point 119901

If the number of Pareto optimality is less than or equalto the size of the external archive PAM has no influenceon the proposed algorithm Otherwise it is used to selectthe different type of Pareto optimality from several clustersTherefore The diversity of Pareto optimality will certainlyincreaseWemonitor the diversity of Pareto optimality before

and after performing PAMonZDT1 at a certain timeThe val-ues are respectively 8762 and 11752 This fully demonstratesthat PAM can improve the diversity of Pareto optimality

5 Conclusion and Future Work

In this paper a multiobjective particle swarm optimizationbased on PAM and uniform design is presented It firstlyimplements PAM to partition the Pareto front into119870 clustersin the objective space and then it implements the uniformcrossover operator on the minimal cluster so as to generatemore Pareto optimality in the minimal cluster When thesize of the Pareto solution is larger than that of the externalarchive PAM is used to determine which Pareto solutions areto be removed from or be inserted into the external archive

Mathematical Problems in Engineering 15

0 01 02 03 04 05 06 07 08 09

19

10

01

02

03

04

05

06

077

08

09

1

The 1st clusterThe 2nd cluster

26

(a)

41

0 01 02 03 04 05 06

26

07 08 09 10

01

02

03

04

051506

07

08

09

1

The 1st clusterThe 2nd cluster

(b)

Figure 7 Pareto optimal solutions before (a) and after (b) performing the uniform crossover

Finally it keeps all the points in the lesser clusters anddiscards some points in the larger clusters This can ensurethat each of the clusters will contain approximately the samedata points Therefore the diversity of the Pareto solutionswill increase and they can scatter uniformly over the Paretofront The results of the experimental simulation performedon several well-known test problems indicate that theproposed algorithm obviously outperforms the other twoalgorithms

This algorithm is going on for further enhancement andimprovement One attempt is to use a more efficient orapproximate clustering algorithm to speed up the executiontime of this algorithm Another attempt is to extend its appli-cation scopes

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This research was supported by Guangxi Natural ScienceFoundation (no 2013GXNSFAA019337) Guangxi Universi-ties Key Project of Science and Technology Research (noKY2015ZD099) Key project of Guangxi Education Depart-ment (no 2013ZD055) Scientific Research Staring Founda-tion for the PHD Scholars of Yulin Normal University (noG2014005) Special Project of Yulin Normal University (no2012YJZX04) and Key Project of Yulin Normal University(no 2014YJZD05)The authors are grateful to the anonymousreferee for a careful checking of the details and for helpfulcomments that improved this paper

References

[1] J D Schaffer ldquoMultiple objective optimization with vector eval-uated genetic algorithmsrdquo in Proceedings of the 1st InternationalConference on Genetic Algorithms pp 93ndash100 1985

[2] L Tang and X Wang ldquoA hybrid multiobjective evolutionaryalgorithm for multiobjective optimization problemsrdquo IEEETransactions on Evolutionary Computation vol 17 no 1 pp 20ndash45 2013

[3] K Deb S Agrawal A Pratap and T Meyarivan ldquoA fast elitistnon-dominated sorting genetic algorithm for multi-objectiveoptimizationNSGA-IIrdquo inParallel ProblemSolving fromNaturePPSN VI vol 1917 of Lecture Notes in Computer Science pp849ndash858 Springer Berlin Germany 2000

[4] J Zhang Y Wang and J Feng ldquoAttribute index and uniformdesign based multiobjective association rule mining with evo-lutionary algorithmrdquo The Scientific World Journal vol 2013Article ID 259347 16 pages 2013

[5] S Jiang and Z Cai ldquoFaster convergence and higher hypervol-ume for multi-objective evolutionary algorithms by orthogonaland uniform designrdquo in Advances in Computation and Intelli-gence vol 6382 of Lecture Notes in Computer Science pp 312ndash328 Springer Berlin Germany 2010

[6] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995

[7] R Eberhart and J Kennedy ldquoA new optimizer using particleswarm theoryrdquo in Proceedings of the 6th International Sympo-sium on Micro Machine and Human Science pp 39ndash43 IEEENagoya Japan October 1995

[8] L Yang ldquoA fast and elitist multi-objective particle swarm algo-rithm NSPSOrdquo in Proceedings of the IEEE International Con-ference on Granular Computing (GRC rsquo08) pp 470ndash475 August2008

[9] C A C Coello G T Pulido and M S Lechuga ldquoHandlingmultiple objectives with particle swarm optimizationrdquo IEEE

16 Mathematical Problems in Engineering

Transactions on Evolutionary Computation vol 8 no 3 pp256ndash279 2004

[10] M Clerc and J Kennedy ldquoThe particle swarm-explosion sta-bility and convergence in a multidimensional complex spacerdquoIEEE Transactions on Evolutionary Computation vol 6 no 1pp 58ndash73 2002

[11] H Xiaohui and R Eberhart ldquoMultiobjective optimization usingdynamic neighborhood particle swarm optimizationrdquo in Pro-ceedings of the Congress on Evolutionary Computation (CEC rsquo02)vol 2 pp 1677ndash1681 May 2002

[12] S Mostaghim and J Teich ldquoStrategies for finding goodlocal guides in multi-objective particle swarm optimization(MOPSO)rdquo in Proceedings of the IEEE Swarm IntelligenceSymposium (SIS rsquo03) pp 26ndash33 Indianapolis Ind USA April2003

[13] T Bartz-Beielstein P Limbourg J Mehnen K Schmitt K EParsopoulos and M N Vrahatis ldquoParticle swarm optimizersfor Pareto optimizationwith enhanced archiving techniquesrdquo inProceedings of the Congress on Evolutionary Computation (CECrsquo03) vol 3 pp 1780ndash1787 December 2003

[14] K E Parsopoulos D K Tasoulis and M N Vrahatis ldquoMulti-objective optimization using parallel vector evaluated particleswarm optimizationrdquo in Proceedings of the IASTED Interna-tional Conference on Artificial Intelligence and Applications (AIArsquo04) vol 2 pp 823ndash828 February 2004

[15] S-J Tsai T-Y Sun C-C Liu S-T Hsieh W-C Wu and S-YChiu ldquoAn improved multi-objective particle swarm optimizerformulti-objective problemsrdquo Expert Systems with Applicationsvol 37 no 8 pp 5872ndash5886 2010

[16] A A Mousa M A El-Shorbagy and W F Abd-El-WahedldquoLocal search based hybrid particle swarm optimization algo-rithm for multiobjective optimizationrdquo Swarm and Evolution-ary Computation vol 3 pp 1ndash14 2012

[17] J Wang W Liu W Zhang and B Yang ldquoMulti-objective parti-cle swarm optimization based on self-update and grid strategyrdquoin Proceedings of the 2012 International Conference on Informa-tion Technology and Software Engineering vol 211 of LectureNotes in Electrical Engineering pp 869ndash876 Springer BerlinGermany 2013

[18] K Khalili-Damghani A-R Abtahi and M Tavana ldquoA newmulti-objective particle swarmoptimizationmethod for solvingreliability redundancy allocation problemsrdquo Reliability Engi-neering amp System Safety vol 111 pp 58ndash75 2013

[19] J Zhang Y Wang and J Feng ldquoParallel particle swarm opti-mization based on PAMrdquo in Proceedings of the 2nd InternationalConference on Information Engineering and Computer Science(ICIECS rsquo10) Wuhan China December 2010

[20] J Han and M Kamber Data Mining Concepts and TechniquesMorgan Kaufmann 2006

[21] L F Ibrahim ldquoUsing of clustering algorithm CWSP-PAM forrural network planningrdquo in Proceedings of the 3rd InternationalConference on Information Technology and Applications (ICITArsquo05) vol 1 pp 280ndash283 IEEE July 2005

[22] P Chopra J Kang and J Lee ldquoUsing gene pair combinationsto improve the accuracy of the PAM classifierrdquo in Proceedingsof the IEEE International Conference on Bioinformatics andBiomedicine (BIBM rsquo09) pp 174ndash177 November 2009

[23] Y-W Leung and Y Wang ldquoMultiobjective programming usinguniform design and genetic algorithmrdquo IEEE Transactions onSystems Man and Cybernetics Part C Applications and Reviewsvol 30 no 3 pp 293ndash304 2000

[24] D Liu K C Tan C K Goh and W K Ho ldquoA multiobjectivememetic algorithm based on particle swarm optimizationrdquoIEEE Transactions on Systems Man and Cybernetics Part BCybernetics vol 37 no 1 pp 42ndash50 2007

[25] K Deb A Pratap S Agarwal and T Meyarivan ldquoA fast andelitist multiobjective genetic algorithm NSGA-IIrdquo IEEE Trans-actions on Evolutionary Computation vol 6 no 2 pp 182ndash1972002

[26] C M Fonseca and P J Fleming ldquoMultiobjective optimizationandmultiple constraint handling with evolutionary algorithmsII Application examplerdquo IEEE Transactions on Systems Manand Cybernetics Part A Systems and Humans vol 28 no 1 pp38ndash47 1998

[27] F Kursawe ldquoA variant of evolution strategies for vector opti-mizationrdquo in Parallel Problem Solving from Nature vol 496 ofLecture Notes in Computer Science pp 193ndash197 Springer BerlinGermany 1991

[28] E Zitzler K Deb and LThiele ldquoComparison of multiobjectiveevolutionary algorithms empirical resultsrdquo Evolutionary Com-putation vol 8 no 2 pp 173ndash195 2000

[29] H-L Liu Y Wang and Y-M Cheung ldquoA Multi-objective evo-lutionary algorithm usingmin-max strategy and sphere coordi-nate transformationrdquo Intelligent Automation amp Soft Computingvol 15 no 3 pp 361ndash384 2009

[30] N Chakraborti B S Kumar V S Babu S Moitra andA Mukhopadhyay ldquoA new multi-objective genetic algorithmapplied to hot-rolling processrdquoAppliedMathematicalModellingvol 32 no 9 pp 1781ndash1789 2008

[31] K Deb ldquoMulti-objective genetic algorithms problem difficul-ties and construction of test problemsrdquo Evolutionary Computa-tion vol 7 no 3 pp 205ndash230 1999

[32] K Deb L Thiele M Laumanns and E Zitzler ldquoScalable testproblems for evolutionarymultiobjective optimizationrdquo in Evo-lutionary Multiobjective Optimization Advanced Informationand Knowledge Processing pp 105ndash145 Springer London UK2005

[33] Y Shi and R Eberhart ldquoModified particle swarm optimizerrdquoin Proceedings of the IEEE World Congress on ComputationalIntelligence pp 69ndash73 May 1998

[34] C S Feng S Cong and X Y Feng ldquoA new adaptive inertiaweight strategy in particle swarm optimizationrdquo in Procedingsof the IEEE Congress on Evolutionary Computation (CEC rsquo07)pp 4186ndash4190 IEEE Singapore September 2007

[35] C K Goh K C Tan D S Liu and S C Chiam ldquoA competitiveand cooperative co-evolutionary approach to multi-objectiveparticle swarm optimization algorithm designrdquo European Jour-nal of Operational Research vol 202 no 1 pp 42ndash54 2010

[36] P Cheng J-S Pan L Li Y Tang and C Huang ldquoA surveyof performance assessment for multiobjective optimizersrdquo inProceedings of the 4th International Conference on Genetic andEvolutionary Computing (ICGEC rsquo10) pp 341ndash345 December2010

[37] E Zitzler L Thiele M Laumanns C M Fonseca and V GDa Fonseca ldquoPerformance assessment of multiobjective opti-mizers an analysis and reviewrdquo IEEE Transactions on Evolution-ary Computation vol 7 no 2 pp 117ndash132 2003

[38] E Zitzler and L Thiele ldquoMultiobjective evolutionary algo-rithms a comparative case study and the strength Paretoapproachrdquo IEEETransactions onEvolutionaryComputation vol3 no 4 pp 257ndash271 1999

[39] H Li and Q Zhang ldquoMultiobjective optimization problemswith complicated pareto sets MOEAD and NSGA-IIrdquo IEEE

Mathematical Problems in Engineering 17

Transactions on Evolutionary Computation vol 13 no 2 pp284ndash302 2009

[40] Q Zhang A Zhou S Zhao P N Suganthan W Liu andS Tiwari ldquoMultiobjective optimization test instances for theCEC 2009 special session and competitionrdquo Working ReportDepartment of Computing and Electronic Systems Universityof Essex 2008

[41] Q Zhang A Zhou and Y Jin ldquoRM-MEDA a regularity model-based multiobjective estimation of distribution algorithmrdquoIEEE Transactions on Evolutionary Computation vol 12 no 1pp 41ndash63 2008

[42] C-S Tsou H-H Fang H-H Chang and C-H Kao ldquoAnimproved particle swarmPareto optimizerwith local search andclusteringrdquo in Simulated Evolution and Learning vol 4247 ofLecture Notes in Computer Science pp 400ndash407 2006

[43] U Wickramasinghe and X Li ldquoChoosing leaders for multiob-jective PSO algorithms using differential evolutionrdquo in Proceed-ing of the 7th International Conference Simulated Evolution andLearning vol 5361 of Lecture Notes in Computer Science pp249ndash258 Springer Berlin Germany 2008

[44] T Krink J S Vesterstrom and J Riget ldquoParticle swarm opti-misation with spatial particle extensionrdquo in Proceedings of theCongress on Evolutionary Computation (CEC rsquo02) pp 1474ndash1479 May 2002

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 15: Research Article Multiobjective Particle Swarm

Mathematical Problems in Engineering 15

0 01 02 03 04 05 06 07 08 09

19

10

01

02

03

04

05

06

077

08

09

1

The 1st clusterThe 2nd cluster

26

(a)

41

0 01 02 03 04 05 06

26

07 08 09 10

01

02

03

04

051506

07

08

09

1

The 1st clusterThe 2nd cluster

(b)

Figure 7 Pareto optimal solutions before (a) and after (b) performing the uniform crossover

Finally it keeps all the points in the lesser clusters anddiscards some points in the larger clusters This can ensurethat each of the clusters will contain approximately the samedata points Therefore the diversity of the Pareto solutionswill increase and they can scatter uniformly over the Paretofront The results of the experimental simulation performedon several well-known test problems indicate that theproposed algorithm obviously outperforms the other twoalgorithms

This algorithm is going on for further enhancement andimprovement One attempt is to use a more efficient orapproximate clustering algorithm to speed up the executiontime of this algorithm Another attempt is to extend its appli-cation scopes

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Acknowledgments

This research was supported by Guangxi Natural ScienceFoundation (no 2013GXNSFAA019337) Guangxi Universi-ties Key Project of Science and Technology Research (noKY2015ZD099) Key project of Guangxi Education Depart-ment (no 2013ZD055) Scientific Research Staring Founda-tion for the PHD Scholars of Yulin Normal University (noG2014005) Special Project of Yulin Normal University (no2012YJZX04) and Key Project of Yulin Normal University(no 2014YJZD05)The authors are grateful to the anonymousreferee for a careful checking of the details and for helpfulcomments that improved this paper

References

[1] J D Schaffer ldquoMultiple objective optimization with vector eval-uated genetic algorithmsrdquo in Proceedings of the 1st InternationalConference on Genetic Algorithms pp 93ndash100 1985

[2] L Tang and X Wang ldquoA hybrid multiobjective evolutionaryalgorithm for multiobjective optimization problemsrdquo IEEETransactions on Evolutionary Computation vol 17 no 1 pp 20ndash45 2013

[3] K Deb S Agrawal A Pratap and T Meyarivan ldquoA fast elitistnon-dominated sorting genetic algorithm for multi-objectiveoptimizationNSGA-IIrdquo inParallel ProblemSolving fromNaturePPSN VI vol 1917 of Lecture Notes in Computer Science pp849ndash858 Springer Berlin Germany 2000

[4] J Zhang Y Wang and J Feng ldquoAttribute index and uniformdesign based multiobjective association rule mining with evo-lutionary algorithmrdquo The Scientific World Journal vol 2013Article ID 259347 16 pages 2013

[5] S Jiang and Z Cai ldquoFaster convergence and higher hypervol-ume for multi-objective evolutionary algorithms by orthogonaland uniform designrdquo in Advances in Computation and Intelli-gence vol 6382 of Lecture Notes in Computer Science pp 312ndash328 Springer Berlin Germany 2010

[6] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquoin Proceedings of the IEEE International Conference on NeuralNetworks pp 1942ndash1948 December 1995

[7] R Eberhart and J Kennedy ldquoA new optimizer using particleswarm theoryrdquo in Proceedings of the 6th International Sympo-sium on Micro Machine and Human Science pp 39ndash43 IEEENagoya Japan October 1995

[8] L Yang ldquoA fast and elitist multi-objective particle swarm algo-rithm NSPSOrdquo in Proceedings of the IEEE International Con-ference on Granular Computing (GRC rsquo08) pp 470ndash475 August2008

[9] C A C Coello G T Pulido and M S Lechuga ldquoHandlingmultiple objectives with particle swarm optimizationrdquo IEEE

16 Mathematical Problems in Engineering

Transactions on Evolutionary Computation vol 8 no 3 pp256ndash279 2004

[10] M Clerc and J Kennedy ldquoThe particle swarm-explosion sta-bility and convergence in a multidimensional complex spacerdquoIEEE Transactions on Evolutionary Computation vol 6 no 1pp 58ndash73 2002

[11] H Xiaohui and R Eberhart ldquoMultiobjective optimization usingdynamic neighborhood particle swarm optimizationrdquo in Pro-ceedings of the Congress on Evolutionary Computation (CEC rsquo02)vol 2 pp 1677ndash1681 May 2002

[12] S Mostaghim and J Teich ldquoStrategies for finding goodlocal guides in multi-objective particle swarm optimization(MOPSO)rdquo in Proceedings of the IEEE Swarm IntelligenceSymposium (SIS rsquo03) pp 26ndash33 Indianapolis Ind USA April2003

[13] T Bartz-Beielstein P Limbourg J Mehnen K Schmitt K EParsopoulos and M N Vrahatis ldquoParticle swarm optimizersfor Pareto optimizationwith enhanced archiving techniquesrdquo inProceedings of the Congress on Evolutionary Computation (CECrsquo03) vol 3 pp 1780ndash1787 December 2003

[14] K E Parsopoulos D K Tasoulis and M N Vrahatis ldquoMulti-objective optimization using parallel vector evaluated particleswarm optimizationrdquo in Proceedings of the IASTED Interna-tional Conference on Artificial Intelligence and Applications (AIArsquo04) vol 2 pp 823ndash828 February 2004

[15] S-J Tsai T-Y Sun C-C Liu S-T Hsieh W-C Wu and S-YChiu ldquoAn improved multi-objective particle swarm optimizerformulti-objective problemsrdquo Expert Systems with Applicationsvol 37 no 8 pp 5872ndash5886 2010

[16] A A Mousa M A El-Shorbagy and W F Abd-El-WahedldquoLocal search based hybrid particle swarm optimization algo-rithm for multiobjective optimizationrdquo Swarm and Evolution-ary Computation vol 3 pp 1ndash14 2012

[17] J Wang W Liu W Zhang and B Yang ldquoMulti-objective parti-cle swarm optimization based on self-update and grid strategyrdquoin Proceedings of the 2012 International Conference on Informa-tion Technology and Software Engineering vol 211 of LectureNotes in Electrical Engineering pp 869ndash876 Springer BerlinGermany 2013

[18] K Khalili-Damghani A-R Abtahi and M Tavana ldquoA newmulti-objective particle swarmoptimizationmethod for solvingreliability redundancy allocation problemsrdquo Reliability Engi-neering amp System Safety vol 111 pp 58ndash75 2013

[19] J Zhang Y Wang and J Feng ldquoParallel particle swarm opti-mization based on PAMrdquo in Proceedings of the 2nd InternationalConference on Information Engineering and Computer Science(ICIECS rsquo10) Wuhan China December 2010

[20] J Han and M Kamber Data Mining Concepts and TechniquesMorgan Kaufmann 2006

[21] L F Ibrahim ldquoUsing of clustering algorithm CWSP-PAM forrural network planningrdquo in Proceedings of the 3rd InternationalConference on Information Technology and Applications (ICITArsquo05) vol 1 pp 280ndash283 IEEE July 2005

[22] P Chopra J Kang and J Lee ldquoUsing gene pair combinationsto improve the accuracy of the PAM classifierrdquo in Proceedingsof the IEEE International Conference on Bioinformatics andBiomedicine (BIBM rsquo09) pp 174ndash177 November 2009

[23] Y-W Leung and Y Wang ldquoMultiobjective programming usinguniform design and genetic algorithmrdquo IEEE Transactions onSystems Man and Cybernetics Part C Applications and Reviewsvol 30 no 3 pp 293ndash304 2000

[24] D Liu K C Tan C K Goh and W K Ho ldquoA multiobjectivememetic algorithm based on particle swarm optimizationrdquoIEEE Transactions on Systems Man and Cybernetics Part BCybernetics vol 37 no 1 pp 42ndash50 2007

[25] K Deb A Pratap S Agarwal and T Meyarivan ldquoA fast andelitist multiobjective genetic algorithm NSGA-IIrdquo IEEE Trans-actions on Evolutionary Computation vol 6 no 2 pp 182ndash1972002

[26] C M Fonseca and P J Fleming ldquoMultiobjective optimizationandmultiple constraint handling with evolutionary algorithmsII Application examplerdquo IEEE Transactions on Systems Manand Cybernetics Part A Systems and Humans vol 28 no 1 pp38ndash47 1998

[27] F Kursawe ldquoA variant of evolution strategies for vector opti-mizationrdquo in Parallel Problem Solving from Nature vol 496 ofLecture Notes in Computer Science pp 193ndash197 Springer BerlinGermany 1991

[28] E Zitzler K Deb and LThiele ldquoComparison of multiobjectiveevolutionary algorithms empirical resultsrdquo Evolutionary Com-putation vol 8 no 2 pp 173ndash195 2000

[29] H-L Liu Y Wang and Y-M Cheung ldquoA Multi-objective evo-lutionary algorithm usingmin-max strategy and sphere coordi-nate transformationrdquo Intelligent Automation amp Soft Computingvol 15 no 3 pp 361ndash384 2009

[30] N Chakraborti B S Kumar V S Babu S Moitra andA Mukhopadhyay ldquoA new multi-objective genetic algorithmapplied to hot-rolling processrdquoAppliedMathematicalModellingvol 32 no 9 pp 1781ndash1789 2008

[31] K Deb ldquoMulti-objective genetic algorithms problem difficul-ties and construction of test problemsrdquo Evolutionary Computa-tion vol 7 no 3 pp 205ndash230 1999

[32] K Deb L Thiele M Laumanns and E Zitzler ldquoScalable testproblems for evolutionarymultiobjective optimizationrdquo in Evo-lutionary Multiobjective Optimization Advanced Informationand Knowledge Processing pp 105ndash145 Springer London UK2005

[33] Y Shi and R Eberhart ldquoModified particle swarm optimizerrdquoin Proceedings of the IEEE World Congress on ComputationalIntelligence pp 69ndash73 May 1998

[34] C S Feng S Cong and X Y Feng ldquoA new adaptive inertiaweight strategy in particle swarm optimizationrdquo in Procedingsof the IEEE Congress on Evolutionary Computation (CEC rsquo07)pp 4186ndash4190 IEEE Singapore September 2007

[35] C K Goh K C Tan D S Liu and S C Chiam ldquoA competitiveand cooperative co-evolutionary approach to multi-objectiveparticle swarm optimization algorithm designrdquo European Jour-nal of Operational Research vol 202 no 1 pp 42ndash54 2010

[36] P Cheng J-S Pan L Li Y Tang and C Huang ldquoA surveyof performance assessment for multiobjective optimizersrdquo inProceedings of the 4th International Conference on Genetic andEvolutionary Computing (ICGEC rsquo10) pp 341ndash345 December2010

[37] E Zitzler L Thiele M Laumanns C M Fonseca and V GDa Fonseca ldquoPerformance assessment of multiobjective opti-mizers an analysis and reviewrdquo IEEE Transactions on Evolution-ary Computation vol 7 no 2 pp 117ndash132 2003

[38] E Zitzler and L Thiele ldquoMultiobjective evolutionary algo-rithms a comparative case study and the strength Paretoapproachrdquo IEEETransactions onEvolutionaryComputation vol3 no 4 pp 257ndash271 1999

[39] H Li and Q Zhang ldquoMultiobjective optimization problemswith complicated pareto sets MOEAD and NSGA-IIrdquo IEEE

Mathematical Problems in Engineering 17

Transactions on Evolutionary Computation vol 13 no 2 pp284ndash302 2009

[40] Q Zhang A Zhou S Zhao P N Suganthan W Liu andS Tiwari ldquoMultiobjective optimization test instances for theCEC 2009 special session and competitionrdquo Working ReportDepartment of Computing and Electronic Systems Universityof Essex 2008

[41] Q Zhang A Zhou and Y Jin ldquoRM-MEDA a regularity model-based multiobjective estimation of distribution algorithmrdquoIEEE Transactions on Evolutionary Computation vol 12 no 1pp 41ndash63 2008

[42] C-S Tsou H-H Fang H-H Chang and C-H Kao ldquoAnimproved particle swarmPareto optimizerwith local search andclusteringrdquo in Simulated Evolution and Learning vol 4247 ofLecture Notes in Computer Science pp 400ndash407 2006

[43] U Wickramasinghe and X Li ldquoChoosing leaders for multiob-jective PSO algorithms using differential evolutionrdquo in Proceed-ing of the 7th International Conference Simulated Evolution andLearning vol 5361 of Lecture Notes in Computer Science pp249ndash258 Springer Berlin Germany 2008

[44] T Krink J S Vesterstrom and J Riget ldquoParticle swarm opti-misation with spatial particle extensionrdquo in Proceedings of theCongress on Evolutionary Computation (CEC rsquo02) pp 1474ndash1479 May 2002

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 16: Research Article Multiobjective Particle Swarm

16 Mathematical Problems in Engineering

Transactions on Evolutionary Computation vol 8 no 3 pp256ndash279 2004

[10] M Clerc and J Kennedy ldquoThe particle swarm-explosion sta-bility and convergence in a multidimensional complex spacerdquoIEEE Transactions on Evolutionary Computation vol 6 no 1pp 58ndash73 2002

[11] H Xiaohui and R Eberhart ldquoMultiobjective optimization usingdynamic neighborhood particle swarm optimizationrdquo in Pro-ceedings of the Congress on Evolutionary Computation (CEC rsquo02)vol 2 pp 1677ndash1681 May 2002

[12] S Mostaghim and J Teich ldquoStrategies for finding goodlocal guides in multi-objective particle swarm optimization(MOPSO)rdquo in Proceedings of the IEEE Swarm IntelligenceSymposium (SIS rsquo03) pp 26ndash33 Indianapolis Ind USA April2003

[13] T Bartz-Beielstein P Limbourg J Mehnen K Schmitt K EParsopoulos and M N Vrahatis ldquoParticle swarm optimizersfor Pareto optimizationwith enhanced archiving techniquesrdquo inProceedings of the Congress on Evolutionary Computation (CECrsquo03) vol 3 pp 1780ndash1787 December 2003

[14] K E Parsopoulos D K Tasoulis and M N Vrahatis ldquoMulti-objective optimization using parallel vector evaluated particleswarm optimizationrdquo in Proceedings of the IASTED Interna-tional Conference on Artificial Intelligence and Applications (AIArsquo04) vol 2 pp 823ndash828 February 2004

[15] S-J Tsai T-Y Sun C-C Liu S-T Hsieh W-C Wu and S-YChiu ldquoAn improved multi-objective particle swarm optimizerformulti-objective problemsrdquo Expert Systems with Applicationsvol 37 no 8 pp 5872ndash5886 2010

[16] A A Mousa M A El-Shorbagy and W F Abd-El-WahedldquoLocal search based hybrid particle swarm optimization algo-rithm for multiobjective optimizationrdquo Swarm and Evolution-ary Computation vol 3 pp 1ndash14 2012

[17] J Wang W Liu W Zhang and B Yang ldquoMulti-objective parti-cle swarm optimization based on self-update and grid strategyrdquoin Proceedings of the 2012 International Conference on Informa-tion Technology and Software Engineering vol 211 of LectureNotes in Electrical Engineering pp 869ndash876 Springer BerlinGermany 2013

[18] K Khalili-Damghani A-R Abtahi and M Tavana ldquoA newmulti-objective particle swarmoptimizationmethod for solvingreliability redundancy allocation problemsrdquo Reliability Engi-neering amp System Safety vol 111 pp 58ndash75 2013

[19] J Zhang Y Wang and J Feng ldquoParallel particle swarm opti-mization based on PAMrdquo in Proceedings of the 2nd InternationalConference on Information Engineering and Computer Science(ICIECS rsquo10) Wuhan China December 2010

[20] J Han and M Kamber Data Mining Concepts and TechniquesMorgan Kaufmann 2006

[21] L F Ibrahim ldquoUsing of clustering algorithm CWSP-PAM forrural network planningrdquo in Proceedings of the 3rd InternationalConference on Information Technology and Applications (ICITArsquo05) vol 1 pp 280ndash283 IEEE July 2005

[22] P Chopra J Kang and J Lee ldquoUsing gene pair combinationsto improve the accuracy of the PAM classifierrdquo in Proceedingsof the IEEE International Conference on Bioinformatics andBiomedicine (BIBM rsquo09) pp 174ndash177 November 2009

[23] Y-W Leung and Y Wang ldquoMultiobjective programming usinguniform design and genetic algorithmrdquo IEEE Transactions onSystems Man and Cybernetics Part C Applications and Reviewsvol 30 no 3 pp 293ndash304 2000

[24] D Liu K C Tan C K Goh and W K Ho ldquoA multiobjectivememetic algorithm based on particle swarm optimizationrdquoIEEE Transactions on Systems Man and Cybernetics Part BCybernetics vol 37 no 1 pp 42ndash50 2007

[25] K Deb A Pratap S Agarwal and T Meyarivan ldquoA fast andelitist multiobjective genetic algorithm NSGA-IIrdquo IEEE Trans-actions on Evolutionary Computation vol 6 no 2 pp 182ndash1972002

[26] C M Fonseca and P J Fleming ldquoMultiobjective optimizationandmultiple constraint handling with evolutionary algorithmsII Application examplerdquo IEEE Transactions on Systems Manand Cybernetics Part A Systems and Humans vol 28 no 1 pp38ndash47 1998

[27] F Kursawe ldquoA variant of evolution strategies for vector opti-mizationrdquo in Parallel Problem Solving from Nature vol 496 ofLecture Notes in Computer Science pp 193ndash197 Springer BerlinGermany 1991

[28] E Zitzler K Deb and LThiele ldquoComparison of multiobjectiveevolutionary algorithms empirical resultsrdquo Evolutionary Com-putation vol 8 no 2 pp 173ndash195 2000

[29] H-L Liu Y Wang and Y-M Cheung ldquoA Multi-objective evo-lutionary algorithm usingmin-max strategy and sphere coordi-nate transformationrdquo Intelligent Automation amp Soft Computingvol 15 no 3 pp 361ndash384 2009

[30] N Chakraborti B S Kumar V S Babu S Moitra andA Mukhopadhyay ldquoA new multi-objective genetic algorithmapplied to hot-rolling processrdquoAppliedMathematicalModellingvol 32 no 9 pp 1781ndash1789 2008

[31] K Deb ldquoMulti-objective genetic algorithms problem difficul-ties and construction of test problemsrdquo Evolutionary Computa-tion vol 7 no 3 pp 205ndash230 1999

[32] K Deb L Thiele M Laumanns and E Zitzler ldquoScalable testproblems for evolutionarymultiobjective optimizationrdquo in Evo-lutionary Multiobjective Optimization Advanced Informationand Knowledge Processing pp 105ndash145 Springer London UK2005

[33] Y Shi and R Eberhart ldquoModified particle swarm optimizerrdquoin Proceedings of the IEEE World Congress on ComputationalIntelligence pp 69ndash73 May 1998

[34] C S Feng S Cong and X Y Feng ldquoA new adaptive inertiaweight strategy in particle swarm optimizationrdquo in Procedingsof the IEEE Congress on Evolutionary Computation (CEC rsquo07)pp 4186ndash4190 IEEE Singapore September 2007

[35] C K Goh K C Tan D S Liu and S C Chiam ldquoA competitiveand cooperative co-evolutionary approach to multi-objectiveparticle swarm optimization algorithm designrdquo European Jour-nal of Operational Research vol 202 no 1 pp 42ndash54 2010

[36] P Cheng J-S Pan L Li Y Tang and C Huang ldquoA surveyof performance assessment for multiobjective optimizersrdquo inProceedings of the 4th International Conference on Genetic andEvolutionary Computing (ICGEC rsquo10) pp 341ndash345 December2010

[37] E Zitzler L Thiele M Laumanns C M Fonseca and V GDa Fonseca ldquoPerformance assessment of multiobjective opti-mizers an analysis and reviewrdquo IEEE Transactions on Evolution-ary Computation vol 7 no 2 pp 117ndash132 2003

[38] E Zitzler and L Thiele ldquoMultiobjective evolutionary algo-rithms a comparative case study and the strength Paretoapproachrdquo IEEETransactions onEvolutionaryComputation vol3 no 4 pp 257ndash271 1999

[39] H Li and Q Zhang ldquoMultiobjective optimization problemswith complicated pareto sets MOEAD and NSGA-IIrdquo IEEE

Mathematical Problems in Engineering 17

Transactions on Evolutionary Computation vol 13 no 2 pp284ndash302 2009

[40] Q Zhang A Zhou S Zhao P N Suganthan W Liu andS Tiwari ldquoMultiobjective optimization test instances for theCEC 2009 special session and competitionrdquo Working ReportDepartment of Computing and Electronic Systems Universityof Essex 2008

[41] Q Zhang A Zhou and Y Jin ldquoRM-MEDA a regularity model-based multiobjective estimation of distribution algorithmrdquoIEEE Transactions on Evolutionary Computation vol 12 no 1pp 41ndash63 2008

[42] C-S Tsou H-H Fang H-H Chang and C-H Kao ldquoAnimproved particle swarmPareto optimizerwith local search andclusteringrdquo in Simulated Evolution and Learning vol 4247 ofLecture Notes in Computer Science pp 400ndash407 2006

[43] U Wickramasinghe and X Li ldquoChoosing leaders for multiob-jective PSO algorithms using differential evolutionrdquo in Proceed-ing of the 7th International Conference Simulated Evolution andLearning vol 5361 of Lecture Notes in Computer Science pp249ndash258 Springer Berlin Germany 2008

[44] T Krink J S Vesterstrom and J Riget ldquoParticle swarm opti-misation with spatial particle extensionrdquo in Proceedings of theCongress on Evolutionary Computation (CEC rsquo02) pp 1474ndash1479 May 2002

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 17: Research Article Multiobjective Particle Swarm

Mathematical Problems in Engineering 17

Transactions on Evolutionary Computation vol 13 no 2 pp284ndash302 2009

[40] Q Zhang A Zhou S Zhao P N Suganthan W Liu andS Tiwari ldquoMultiobjective optimization test instances for theCEC 2009 special session and competitionrdquo Working ReportDepartment of Computing and Electronic Systems Universityof Essex 2008

[41] Q Zhang A Zhou and Y Jin ldquoRM-MEDA a regularity model-based multiobjective estimation of distribution algorithmrdquoIEEE Transactions on Evolutionary Computation vol 12 no 1pp 41ndash63 2008

[42] C-S Tsou H-H Fang H-H Chang and C-H Kao ldquoAnimproved particle swarmPareto optimizerwith local search andclusteringrdquo in Simulated Evolution and Learning vol 4247 ofLecture Notes in Computer Science pp 400ndash407 2006

[43] U Wickramasinghe and X Li ldquoChoosing leaders for multiob-jective PSO algorithms using differential evolutionrdquo in Proceed-ing of the 7th International Conference Simulated Evolution andLearning vol 5361 of Lecture Notes in Computer Science pp249ndash258 Springer Berlin Germany 2008

[44] T Krink J S Vesterstrom and J Riget ldquoParticle swarm opti-misation with spatial particle extensionrdquo in Proceedings of theCongress on Evolutionary Computation (CEC rsquo02) pp 1474ndash1479 May 2002

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 18: Research Article Multiobjective Particle Swarm

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of