

Performance of Infeasibility Driven Evolutionary Algorithm (IDEA) on Constrained Dynamic Single Objective Optimization Problems

Hemant Kumar Singh, Amitay Isaacs, Trung Thanh Nguyen, Tapabrata Ray and Xin Yao

Abstract—A number of population based optimization algorithms have been proposed in recent years to solve unconstrained and constrained single and multi-objective optimization problems. Most such algorithms inherently prefer a feasible solution over an infeasible one during the course of the search, which translates to approaching the constraint boundary from the feasible side of the search space. Previous studies [1], [2] have already demonstrated the benefits of explicitly maintaining a fraction of infeasible solutions in the Infeasibility Driven Evolutionary Algorithm (IDEA) for single and multi-objective constrained optimization problems. In this paper, the benefits of IDEA as a sub-evolve mechanism are highlighted for dynamic, constrained single objective optimization problems. IDEA is particularly attractive for such problems as it offers a faster rate of convergence than a conventional EA, which is of significant interest in dynamic optimization. The algorithm is tested on two new dynamic constrained test problems. For both problems, the performance of IDEA is found to be significantly better than that of a conventional EA.

I. INTRODUCTION

Dynamic single objective optimization problems are far more challenging than their static counterparts, as the underlying optimization algorithm needs to deliver optimal solutions with minimal time lag whenever the objective and/or constraints change. To solve such problems effectively and efficiently, the optimization algorithm should possess the following mechanisms: (a) identify a change in the objective/constraint function(s), (b) converge to the optimal solution with a high rate of convergence, i.e. with minimal time lag, and (c) allow migration from one optimal solution to another.

In terms of identifying a change in the objective or the constraint function, there are basically two possible approaches: proactive and reactive models. In a proactive model, a forecasting scheme is developed based on the nature of the change in the objective and constraint functions, and a fraction of the population is evolved based on the forecasted values [3]. Such an approach is suitable if the objective and constraint functions follow a pattern over time. The reactive approach, on the other hand, responds to a change in the objective or the constraint function only when it is detected. However, the detection of a change is a nontrivial problem by itself.

Hemant Kumar Singh, Amitay Isaacs and Tapabrata Ray are with the School of Aerospace, Civil and Mechanical Engineering, University of New South Wales at Australian Defence Force Academy, Canberra ACT, Australia. email: {h.singh, a.isaacs, t.ray}@adfa.edu.au

Trung Thanh Nguyen and Xin Yao are with The Centre of Excellence for Research in Computational Intelligence and Applications (CERCIA), School of Computer Science, University of Birmingham, B15 2TT, United Kingdom. email: {T.T.Nguyen, X.Yao}@cs.bham.ac.uk

In order to solve dynamic optimization problems efficiently, the underlying optimization algorithm should have a high rate of convergence. Preservation of infeasible solutions in the Infeasibility Driven Evolutionary Algorithm by Singh et al. [1] has already demonstrated a better rate of convergence for a number of constrained single and multi-objective optimization problems. IDEA essentially maintains a number of infeasible solutions close to the constraint boundaries and evolves them. Such an approach offers scope for a higher rate of convergence when the optima lie on the constraint boundary. In this paper, the performance of IDEA is studied on dynamic, constrained single objective optimization problems within a dynamic optimization framework.

In order to assess the performance of an optimization algorithm, a set of dynamic, constrained single objective optimization benchmarks is required. A set of such problems has been developed by Nguyen and Yao [4]. In this paper, the performance of IDEA is studied on two of those problems and the results are compared to a conventional EA.

Some of the evolutionary algorithms proposed for dynamic optimization in the past include memory enhanced evolutionary algorithms [5], Population based Incremental Learning (PBIL) [6], [7], Primal-Dual Genetic Algorithm (PDGA) [8], Thermodynamical Genetic Algorithm [9] and many more. For fast convergence, a hybrid approach is often employed, where a population based algorithm is combined with a local search; Memetic Algorithm (MA) [10], [11] is one such hybrid algorithm. Among the most recent studies [12], an associative memory scheme has been developed to enhance the performance of PBIL algorithms in dynamic environments.

The rest of the paper is organized as follows: Section II describes the basic structure of the dynamic optimization framework. Section III briefly describes the Infeasibility Driven Evolutionary Algorithm (IDEA). In Section IV, two scalable dynamic constrained optimization test problems are proposed. Details of the numerical experiments and the performance of the algorithms are discussed in Section V, followed by concluding remarks in Section VI.

II. EVOLUTIONARY FRAMEWORK FOR DYNAMIC OPTIMIZATION

The evolutionary framework for dynamic optimization [11] is outlined in Algorithm 1. Two of the steps (Evaluation and Sub-evolve) are described in the following subsections. The initialization and reduction steps are done exactly as in NSGA-II [13] and are not discussed here for the sake of brevity.

978-1-4244-2959-2/09/$25.00 © 2009 IEEE


Algorithm 1 Evolutionary Framework for Dynamic Optimization
Require: NG > 1 {Number of Generations}
Require: N > 0 {Population size}
1: P1 = Initialize()
2: Evaluate(P1)
3: for i = 2 to NG do
4:    if the function has changed then
5:       Evaluate(Pi−1)
6:    end if
7:    Ci−1 = SubEvolve(Pi−1)
8:    Evaluate(Ci−1)
9:    Pi = Reduce(Pi−1 + Ci−1)
10: end for

A. Evaluation

The objective and constraints are calculated for each solution in the population. Since, for dynamic problems, the objective and constraint values for the same design vector may differ across time steps, a check has to be made in each generation to detect any change in function behavior. This is done by evaluating a random solution in the population at each time step. If the objective and/or constraint values have changed from the previous generation, the function behavior is assumed to have changed and the whole population is re-evaluated. It should be noted that this scheme is not foolproof; more involved schemes can be used to detect a change in the function, but at an additional computational cost.
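As a concrete illustration, the detection scheme described above can be sketched in a few lines of Python. The function and variable names here are illustrative, not from the paper:

```python
import random

def detect_change(population, evaluate, cached_values):
    # Re-evaluate one random solution and compare against its cached
    # objective/constraint values from the previous generation. If they
    # differ, assume the landscape has changed (the whole population
    # would then be re-evaluated by the caller).
    x = random.choice(population)
    return evaluate(x) != cached_values[x]

# Toy landscape: f(x, t) = x + t, so the function changes when t moves.
population = [0, 1, 2]
cached = {x: x + 0 for x in population}          # values cached at t = 0

assert detect_change(population, lambda x: x + 0, cached) is False  # no change
assert detect_change(population, lambda x: x + 1, cached) is True   # t moved
```

As noted above, this check can miss a change that happens to leave the sampled solution's values untouched, which is why the paper calls the scheme not foolproof.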

B. Sub-evolve Method

The parent population is evolved using an embedded evolutionary algorithm (Sub-EA). The Sub-evolve method is described in Algorithm 2. The population is evolved through a given number of "Sub-EA generations" to produce solutions for the next generation of the EA. Sub-EA uses the same population size as the top level EA. For Sub-EA evolution, the simulated binary crossover (SBX) and polynomial mutation operators [14] are used in the present studies, as done in [11].

Algorithm 2 Sub-EA Evolution Algorithm
Require: N′G > 1 {Number of Sub-EA Generations}
Require: Pj {Parent Population of jth generation}
1: P′1 = Pj
2: for i = 1 to N′G do
3:    C′i = Crossover(P′i)
4:    C′i = Mutation(C′i)
5:    Evaluate(C′i)
6:    P′i+1 = Reduce(P′i + C′i)
7: end for
8: Cj = P′N′G {Sub-EA evolved population is offspring population for EA}
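The Crossover step of the Sub-EA uses simulated binary crossover (SBX). For a single real variable, the standard SBX formulation [14] can be sketched as follows; this is a minimal illustration (eta is the distribution index, and the function name is ours):

```python
import random

def sbx_pair(p1, p2, eta=15.0):
    """Simulated binary crossover for one real variable (a sketch of the
    operator used in the Sub-EA step; eta is the distribution index)."""
    u = random.random()
    # Spread factor beta from the SBX probability distribution.
    if u <= 0.5:
        beta = (2.0 * u) ** (1.0 / (eta + 1.0))
    else:
        beta = (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta + 1.0))
    c1 = 0.5 * ((1.0 + beta) * p1 + (1.0 - beta) * p2)
    c2 = 0.5 * ((1.0 - beta) * p1 + (1.0 + beta) * p2)
    return c1, c2

c1, c2 = sbx_pair(1.0, 3.0)
# The children preserve the parents' centroid: (c1 + c2)/2 == (p1 + p2)/2.
assert abs((c1 + c2) / 2.0 - 2.0) < 1e-9
```

A larger eta concentrates the children near the parents, which is why the distribution index controls the "spread" of the search.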

III. INFEASIBILITY DRIVEN EVOLUTIONARY ALGORITHM (IDEA)

Infeasibility Driven Evolutionary Algorithm (IDEA) was proposed by Singh et al. [1]. It differs from conventional EAs significantly in terms of the ranking and selection of solutions. While most EAs rank feasible solutions above infeasible solutions, IDEA ranks solutions based on the original objectives along with an additional objective representing a constraint violation measure. IDEA explicitly maintains a few infeasible solutions during the search. In addition, "good" infeasible solutions are ranked higher than the feasible solutions, so the search proceeds through both feasible and infeasible regions, resulting in a greater rate of convergence to optimal solutions. The concept of using "good" infeasible solutions for constraint handling has also been explored by a few other researchers in the past [15], [16], [17], [18], [19]. However, the method of exploiting infeasible solutions differs for each of these approaches. For example, Hamida and Schoenauer [17] used an adaptive penalty to control the proportion of feasible solutions in the population. Vieira et al. [16] treated constraints as objectives during the search. Coello Coello [15] proposed splitting the population into various sub-populations, each using either the objective or one of the constraints as the fitness function. Isaacs, Ray and Smith [18] proposed the Constraint Handling Evolutionary Algorithm (CHEA), in which the number of constraint violations is treated as an additional objective, to obtain the constrained and unconstrained optima of a problem simultaneously.

The benefits in convergence rate obtained through explicit preservation of infeasible solutions in CHEA motivated the development of IDEA, where the original problem is reformulated as an unconstrained problem with the "violation measure" of the solutions as an additional objective. The violation measure is a quantity calculated from the constraint violations of the solutions in the population. Since IDEA tries to minimize the violation measure during the evolution of solutions, it also obtains marginally violated solutions, which can be useful for trade-off studies or sensitivity analysis.

The studies reported in [1], [2] indicate that IDEA has a better rate of convergence than a conventional EA for a number of static constrained single and multi-objective optimization problems. Since dynamic optimization problems demand a faster rate of convergence, IDEA is certainly attractive. In this paper we present preliminary studies of IDEA on single-objective constrained dynamic optimization problems.

Before moving on to the dynamic optimization problems, the constraint handling for single objective optimization problems using the Infeasibility Driven Evolutionary Algorithm (IDEA) is described below.

A generalized (static) single-objective optimization problem can be formulated as shown in (1):

Minimize f(x)
Subject to gi(x) ≥ 0, i = 1, . . . , m    (1)

where x = (x1, . . . , xn) is the design variable vector bounded by lower and upper bounds, x ∈ S ⊂ ℝn.

To effectively search the design space (including both the feasible and the infeasible regions), the original single objective constrained optimization problem is reformulated as a bi-objective unconstrained optimization problem as shown in (2).

Minimize f′1(x) = f(x)
         f′2(x) = violation measure    (2)

The additional objective represents a measure of constraint violation, referred to as the "violation measure". It is based on the relative amounts of constraint violation among the population members. Each solution in the population is assigned m ranks, corresponding to the m constraints. The ranks are calculated as follows. To get the ranks corresponding to the ith constraint, all the solutions are sorted based on the violation value of the ith constraint. Solutions that do not violate the constraint are assigned rank 0. The solution with the least constraint violation value gets rank 1, and the remaining solutions are assigned increasing ranks in ascending order of their constraint violation values. The process is repeated for all the constraints, so each solution in the population is assigned m ranks. The violation measure is the sum of these m ranks.
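The ranking procedure described above can be sketched as follows. This is a minimal illustration; the input layout `violations[j][i]` (violation of constraint i by solution j, 0 if satisfied) is an assumption of this sketch:

```python
def violation_measure(violations):
    """Sum of the m per-constraint ranks for each solution, as described
    above: rank 0 for non-violators, then ranks 1, 2, ... in ascending
    order of violation amount for each constraint separately."""
    n = len(violations)       # number of solutions
    m = len(violations[0])    # number of constraints
    total = [0] * n
    for i in range(m):
        # Solutions violating constraint i, sorted by ascending violation.
        violators = sorted((j for j in range(n) if violations[j][i] > 0),
                           key=lambda j: violations[j][i])
        for rank, j in enumerate(violators, start=1):
            total[j] += rank
    return total

# Three solutions, two constraints:
#   s0 feasible; s1 violates c0 slightly; s2 violates both heavily.
v = [[0.0, 0.0], [0.5, 0.0], [2.0, 1.0]]
assert violation_measure(v) == [0, 1, 3]  # s2: rank 2 on c0 + rank 1 on c1
```

Because the measure uses ranks rather than raw violation amounts, constraints of very different scales contribute comparably, which is the point of the relative formulation above.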

The main steps of IDEA are outlined in Algorithm 3. IDEA uses the simulated binary crossover (SBX) and polynomial mutation operators to generate offspring from a pair of parents selected using binary tournament, as in NSGA-II [13]. Individual solutions in the population are evaluated using the original problem definition (1) and the infeasible solutions are identified. The solutions in the parent and offspring populations are divided into a feasible set (Sf) and an infeasible set (Sinf). The solutions in each set are ranked separately using non-dominated sorting and crowding distance sorting [13] based on the two objectives of the modified problem definition (2). The solutions for the next generation are selected from both sets in order to maintain infeasible solutions in the population. In addition, the infeasible solutions are ranked higher than the feasible solutions to provide a selection pressure to create better infeasible solutions, resulting in an active search through the infeasible search space.

A user-defined parameter α is used to maintain a set of infeasible solutions as a fraction of the population size. The numbers Nf and Ninf denote the number of feasible and infeasible solutions as determined by α. If the infeasible set Sinf has more than Ninf solutions, the first Ninf solutions are selected based on their rank; otherwise all the solutions from Sinf are selected. The rest of the solutions are selected from the feasible set Sf, provided there are at least Nf feasible solutions. If Sf has fewer solutions, all the feasible solutions are selected and the rest are filled with infeasible solutions from Sinf. The solutions are ranked from 1 to N in the order they are selected. Hence, the infeasible solutions selected first are ranked higher than the feasible solutions selected later.

Algorithm 3 Infeasibility Driven Evolutionary Algorithm (IDEA)
Require: N {Population Size}
Require: NG > 1 {Number of Generations}
Require: 0 < α < 1 {Proportion of infeasible solutions}
1: Ninf = α ∗ N
2: Nf = N − Ninf
3: pop1 = Initialize()
4: Evaluate(pop1)
5: for i = 2 to NG do
6:    childpopi−1 = Evolve(popi−1)
7:    Evaluate(childpopi−1)
8:    (Sf, Sinf) = Split(popi−1 + childpopi−1)
9:    Rank(Sf)
10:   Rank(Sinf)
11:   popi = Sinf(1 : Ninf) + Sf(1 : Nf)
12: end for
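The selection scheme described above (step 11 of Algorithm 3, plus the top-up rules) can be sketched as follows; this is an illustrative sketch in which the inputs are assumed to be already rank-sorted lists:

```python
def idea_reduce(feasible_ranked, infeasible_ranked, N, alpha):
    """IDEA reduction step as described above: keep up to Ninf = alpha*N
    infeasible solutions (placed first, i.e. ranked higher), then fill
    the remaining slots from the feasible set; either set tops up the
    other when it falls short."""
    n_inf = int(alpha * N)
    chosen_inf = infeasible_ranked[:n_inf]
    chosen_f = feasible_ranked[:N - len(chosen_inf)]
    # If there were too few feasible solutions, fill with more infeasible ones.
    if len(chosen_inf) + len(chosen_f) < N:
        extra = N - len(chosen_inf) - len(chosen_f)
        chosen_inf += infeasible_ranked[n_inf:n_inf + extra]
    # Infeasible solutions come first, so they receive the higher ranks.
    return chosen_inf + chosen_f

pop = idea_reduce(["f1", "f2", "f3", "f4"], ["i1", "i2"], N=5, alpha=0.2)
assert pop == ["i1", "f1", "f2", "f3", "f4"]  # one infeasible slot (0.2 * 5)
```

Placing the Ninf infeasible solutions at the head of the returned list is what realizes the selection pressure toward good infeasible solutions mentioned above.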

Similar to the Sub-evolve mechanism using EA described in Section II, we can now construct a Sub-evolve mechanism using IDEA (henceforth referred to as Sub-IDEA), as shown in Algorithm 4.

Algorithm 4 Sub-IDEA Evolution Algorithm
Require: N {Population size}
Require: N′G > 1 {Number of Sub-IDEA Generations}
Require: 0 < α < 1 {Proportion of infeasible solutions}
Require: Pj {Parent Population of jth generation}
1: Ninf = α ∗ N
2: Nf = N − Ninf
3: P′1 = Pj
4: for i = 1 to N′G do
5:    C′i = Crossover(P′i)
6:    C′i = Mutation(C′i)
7:    Evaluate(C′i)
8:    (Sf, Sinf) = Split(P′i + C′i)
9:    Rank(Sf)
10:   Rank(Sinf)
11:   P′i+1 = Sinf(1 : Ninf) + Sf(1 : Nf)
12: end for
13: Cj = P′N′G {Sub-IDEA evolved population is offspring population for EA}

IV. TEST PROBLEMS

Although there are dynamic, single objective unconstrained optimization benchmarks reported in the literature (such as moving peaks, dynamic knapsack, etc.), there are no well-known benchmarks for dynamic single objective constrained

2009 IEEE Congress on Evolutionary Computation (CEC 2009)


Fig. 1: Static function g24: (a) objective function surface, (b) constraint surfaces, (c) objective function values for the feasible search space

optimization. To bridge this gap, a set of dynamic single objective constrained benchmark problems has been proposed recently by Nguyen and Yao [4]. In this paper we choose two benchmark functions from that set to evaluate the performance of IDEA. For many real-world problems, the optimum lies on the constraint boundary. These two test problems have been chosen specifically for the presented studies, as their optimum function values occur on the constraint boundary at certain time steps. The details of the benchmark functions are provided below.

The dynamic constrained problem g24 is derived from its static counterpart [20], [21]. The landscape of the static problem is shown in Figure 1.

The definition of the dynamic constrained problem g24 is given in Equation 3.

Minimize f(x, t) = −(X1(x1, t) + X2(x2, t))
subject to
g1(x, t) = 2Y1(x1, t)^4 − 8Y1(x1, t)^3 + 8Y1(x1, t)^2 − Y2(x2, t) + 2 ≥ 0
g2(x, t) = 4Y1(x1, t)^4 − 32Y1(x1, t)^3 + 88Y1(x1, t)^2 − 96Y1(x1, t) − Y2(x2, t) + 36 ≥ 0
where
Xi(x, t) = pi(t)(x + qi(t))
Yi(x, t) = ri(t)(x + si(t))
0 ≤ x1 ≤ 3, 0 ≤ x2 ≤ 4    (3)

Here, pi(t), qi(t), ri(t), and si(t) (i = 1, 2) are time-dependent dynamic drivers which determine the test problem dynamics. We consider two settings, described below.
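Under the identity drivers (pi = ri = 1, qi = si = 0, so Xi = Yi = xi), Eq. (3) reduces to the static g24 problem, which can be transcribed directly. The sketch below is for illustration only:

```python
def g24(x1, x2):
    """Static g24 objective and constraints from Eq. (3) with identity
    drivers (X_i = Y_i = x_i). Constraints follow the g_i >= 0
    convention of Eq. (1); bounds are 0 <= x1 <= 3, 0 <= x2 <= 4."""
    f = -(x1 + x2)
    g1 = 2*x1**4 - 8*x1**3 + 8*x1**2 - x2 + 2
    g2 = 4*x1**4 - 32*x1**3 + 88*x1**2 - 96*x1 - x2 + 36
    return f, g1, g2

# The point (2, 2) lies exactly on the first constraint boundary (g1 = 0),
# illustrating why approaching the boundary from both sides pays off here.
f, g1, g2 = g24(2.0, 2.0)
assert (f, g1, g2) == (-4.0, 0.0, 2.0)
```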

A. Test problem g24_2

if (t mod 2 = 0):
    p1(t) = sin(kπt/2 + π/2)
    p2(t) = p2(t − 1) if t > 0, p2(t) = p2(0) if t = 0
if (t mod 2 ≠ 0):
    p1(t) = p1(t − 1)
    p2(t) = sin(kπ(t − 1)/2 + π/2)
qi(t) = si(t) = 0, i = 1, 2
ri(t) = 1, i = 1, 2    (4)

The parameter k determines the severity of the change in the function. A higher value of k represents a more severe movement of the current global optimum from the previous time step.
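A direct transcription of the driver schedule in Eq. (4) is sketched below: p1 updates at even time steps and p2 at odd ones, each holding its previous value otherwise. The initial value p2(0) = sin(π/2) = 1 is an assumption of this sketch, since the benchmark's initialization is not restated here:

```python
import math

def p_drivers(t, k=1.0):
    """Return (p1(t), p2(t)) per Eq. (4), iterating from t = 0.
    Assumes p2(0) = sin(pi/2) = 1."""
    p1 = math.sin(math.pi / 2)   # p1(0), even step with t = 0
    p2 = math.sin(math.pi / 2)   # assumed p2(0)
    for s in range(1, t + 1):
        if s % 2 == 0:
            p1 = math.sin(k * math.pi * s / 2 + math.pi / 2)   # even steps
        else:
            p2 = math.sin(k * math.pi * (s - 1) / 2 + math.pi / 2)  # odd steps
    return p1, p2

# With k = 1, p1 flips sign every other even step: p1(2) = sin(3*pi/2) = -1,
# while p2 still holds its value from t = 1.
p1, p2 = p_drivers(2)
assert abs(p1 + 1.0) < 1e-9 and abs(p2 - 1.0) < 1e-9
```

The sign flips in p1 and p2 are what make the global optimum of f(x, t) = −(p1·x1 + p2·x2) jump between the disjoint feasible regions.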

The generalized function f(x, t) = −(X1(x1, t) + X2(x2, t)) is linear in x, as X1 and X2 both depend linearly on x. The plot of the objective function values over the feasible space at t = 0 is shown in Figure 1(a). As the parameters p1 and p2 vary with time, the objective function undergoes a cyclic change while maintaining its linear form. The landscape at t = 2 is shown in Figure 2.

The test problem has the following properties:
• Static constraints (do not vary with time).
• Two disjoint feasible regions.
• Dynamic (cyclic) objective function.
• Global optimum switches between the disjoint feasible regions.



Fig. 2: Objective function landscape for g24_2 at t = 2

B. Test problem g24_5

if (t mod 2 = 0):
    p1(t) = sin(kπt/2 + π/2)
    p2(t) = p2(t − 1) if t > 0, p2(t) = p2(0) if t = 0
if (t mod 2 ≠ 0):
    p1(t) = p1(t − 1)
    p2(t) = sin(kπ(t − 1)/2 + π/2)
qi(t) = 0, i = 1, 2; s1(t) = 0
ri(t) = 1, i = 1, 2
s2(t) = (t/S) × (x2,max − x2,min)    (5)

As in the case of g24_2, k determines the severity of the function change. The parameter S determines the severity of the changes in the constraints (the distance the constraints move after one change). A lower value of S causes a larger movement of the constraints, resulting in a more severe problem.

The objective function of g24_5 is linear in x and undergoes a cyclic change similar to that of g24_2. However, in this case, unlike g24_2, the constraints are not static. The positions of the constraint surfaces at t = 0 and t = 9 are shown in Figure 3. The movement of the constraints causes the size of the feasible regions to change with time.

The test problem possesses the following characteristics:
• Dynamic (cyclic) objective function.
• Dynamic (linearly moving) constraints.
• Size of the feasible regions changes with time.
• Multiple disconnected feasible regions.
• Number of disconnected feasible regions changes.
• Global optima move between the disconnected feasible regions, due to the moving constraints.

V. NUMERICAL EXPERIMENTS

Since the function landscape of a dynamic optimization problem keeps changing with time, it is not sufficient to compare the performance of algorithms in terms of a single

Fig. 3: Movement of constraints for g24_5 (Constraint 1 and Constraint 2 at t = 0 and t = 9)

optimum solution. A solution which is best at a given time may not be so after the function undergoes a change. Therefore, the performance index should reflect how the algorithm has performed over time. The most commonly used performance index for dynamic problems is the modified offline performance [22], [23]. The modified offline performance f_offline after T evaluations during a run is given by Equation 6.

f_offline = (1/T) Σ(t=1 to T) f*_t    (6)

where f*_t denotes the best solution found since the last change.

In addition to the modified offline performance, a comparison has also been made between the two algorithms based on their progress plots over generations and the best values achieved for each change. Additionally, we propose the area under the curve (the best function value vs. generations graph) as an indicator of the average performance of an algorithm over time. An algorithm that converges to the minimum faster tends to have a smaller area under the curve than one that converges slowly (and hence has a higher function value for a longer time, resulting in a larger area under the curve). The algorithms have also been compared based on the mean best-of-generations values, as done previously in the literature [24]. Finally, a t-test has been conducted on the mean best-of-generations values in order to establish the statistical significance of the results.
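Both indicators described above are straightforward to compute. A minimal sketch follows (function names are illustrative; the area indicator uses a simple unit-step rectangle sum):

```python
def modified_offline(best_since_change):
    """Modified offline performance of Eq. (6): the average, over the run,
    of the best value found since the last change (one entry per time step)."""
    return sum(best_since_change) / len(best_since_change)

def area_above(best_per_gen, baseline=-6.0):
    """Area-under-the-curve indicator proposed above, measured against a
    baseline line lying below all curves (f = -6 in the paper); a smaller
    area indicates faster convergence."""
    return sum(f - baseline for f in best_per_gen)

assert modified_offline([-1.0, -2.0, -3.0]) == -2.0
assert area_above([-5.0, -6.0], baseline=-6.0) == 1.0
```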

Both test problems were solved using the two different Sub-evolve mechanisms, i.e. Sub-EA and Sub-IDEA. Multiple (30) runs were performed for each strategy. The crossover and mutation parameters used are listed in Table I. A population size of 20 solutions was evolved over 100 generations. The number of Sub-EA generations used for each



strategy was 15, resulting in 15 × 20 = 300 internal evaluations within each generation. The frequency of the function change was fixed at 10 generations; hence the reported performance corresponds to 10 function changes. The value of the parameter k was set to 1 for both problems. For g24_5, the parameter S was set to 50. The chosen value of k corresponds to large severity in terms of objective function changes, whereas the value of S corresponds to large severity in terms of constraint movements.

TABLE I: Parameters used for the studies

Parameter                        Value
Crossover Probability            0.9
Mutation Probability             0.1
Crossover Distribution Index     15
Mutation Distribution Index      20
Infeasibility ratio (for IDEA)   0.2

The best objective function value, averaged over all runs, through the generations for g24_2 and g24_5 is shown in Figure 4(a) and Figure 4(b) respectively. From the figures it can be seen that, on average, Sub-IDEA shows a faster improvement in the function values than Sub-EA.

This is also reflected in the modified offline performance. The modified offline performance averaged over all runs is shown in Figure 5(a) and Figure 5(b) for g24_2 and g24_5 respectively. The average offline index values are lower for Sub-IDEA compared to Sub-EA, indicating better performance.

The area under the curve is measured with reference to the horizontal line f = −6, since all the graphs lie entirely above this line. The values of the area under the curve averaged over all runs are shown in Table II. The standard deviation is also smaller in the case of Sub-IDEA, indicating its consistent performance across multiple runs.

TABLE II: Area under the curve

Problem   Sub-EA                 Sub-IDEA
          Average area   S.D.    Average area   S.D.
g24_2     376.74         16.22   349.36         15.54
g24_5     399.42         14.26   374.52         10.99

The best values found at each change are also of interest for dynamic problems. The best values found using Sub-EA and Sub-IDEA are listed in Table III; the values shown are averaged over all runs. It is seen that Sub-IDEA achieves equal or better values than Sub-EA for each change.

The mean best-of-generation values for both algorithms, averaged over 30 runs, are shown in Table IV. It is seen that Sub-IDEA achieves lower values for both test problems, indicating better overall performance during the runs.

In addition to the above comparisons, a t-test can be used to determine the statistical significance of the experiments. A paired t-test between Sub-IDEA and Sub-EA was performed using the mean best-of-generation values

Fig. 4: Average performance of Sub-EA and Sub-IDEA over multiple runs: (a) g24_2, (b) g24_5

TABLE III: Best values for 10 changes (averaged over 30 runs)

t    g24_2                  g24_5
     Sub-EA    Sub-IDEA     Sub-EA    Sub-IDEA
0    -4.9548   -5.4106      -4.9548   -5.4106
1    -5.0783   -5.4661      -5.1186   -5.4447
2     0         0            0.0120    0.0100
3     0         0            0         0
4    -3.6411   -4.6449      -3.0985   -3.9510
5    -4.0929   -4.9169      -3.5784   -4.0270
6     0         0            0         0
7     0         0            0         0
8    -3.6757   -4.2759      -2.7889   -3.4721
9    -4.0961   -4.5392      -3.1748   -3.5167

over multiple runs. A degree of freedom (df) of 58 was used, at a 0.05 level of significance (α). The calculated t-values for test problems g24_2 and g24_5 are −6.60 and −7.55 respectively, which exceed the critical t-value for the given df and α in magnitude. This confirms that the difference between the mean values obtained using the two algorithms is statistically significant, indicating superior performance of Sub-IDEA over Sub-EA for the conducted experiments.
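For reference, a pooled two-sample t statistic with df = n1 + n2 − 2 (consistent with the df = 58 reported above for two 30-run samples) can be sketched as follows. This is an illustrative reconstruction under that assumption, not the authors' code:

```python
import math

def two_sample_t(a, b):
    """Pooled two-sample t statistic; df = len(a) + len(b) - 2.
    Negative when mean(a) < mean(b), matching the sign convention of the
    reported Sub-IDEA minus Sub-EA comparison (lower means are better)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    return (ma - mb) / math.sqrt(sp2 * (1.0 / na + 1.0 / nb))

# Toy check: clearly separated samples give a large-magnitude t value.
t = two_sample_t([1.0, 1.1, 0.9, 1.0], [2.0, 2.1, 1.9, 2.0])
assert t < -2.447  # below the two-tailed critical value at df = 6, alpha = 0.05
```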

3132 2009 IEEE Congress on Evolutionary Computation (CEC 2009)


Fig. 5: Modified offline performance of Sub-EA and Sub-IDEA averaged over 30 runs. (a) g24 2; (b) g24 5. [Figure: modified offline performance (average) vs. generations/time step, for Sub-EA and Sub-IDEA.]

TABLE IV: Mean best-of-generation values (averaged over 30 runs)

          Sub-EA      Sub-IDEA
g24 2    -2.2130     -2.4890
g24 5    -1.9816     -2.2322

VI. SUMMARY AND FUTURE WORK

This paper highlights the benefits of the Infeasibility Driven Evolutionary Algorithm (IDEA) for dynamic, constrained single objective optimization problems. The presence of infeasible solutions allows IDEA to approach the constrained optimum from the infeasible as well as the feasible side of the search space, thereby converging faster than conventional EAs, which approach the optimum from the feasible side only. The paper provides results of preliminary studies of the algorithm on two dynamic, constrained single objective optimization benchmarks. The results of using IDEA as a sub-evolve mechanism are certainly encouraging for the above problems. Its performance is currently being studied extensively on available constrained dynamic optimization

problems.

The comparison of the proposed algorithm has been made

with a structurally similar algorithm in order to highlight the benefits of maintaining infeasible solutions for dynamic optimization problems. Currently, studies are underway to compare the performance of IDEA with other existing algorithms.

ACKNOWLEDGMENT

The presented work was supported by grants from the Defence and Security Applications Research Center (DSARC), Australian Defence Force Academy, University of New South Wales, Australia.

The work of Trung Thanh Nguyen is supported by the Overseas Research Scheme Award (ORS) and the School of Computer Science, University of Birmingham.

The work of Xin Yao is supported by an EPSRC grant (EP/E058884/1) on “Evolutionary Algorithms for Dynamic Optimisation Problems: Design, Analysis and Applications”.

REFERENCES

[1] H. K. Singh, A. Isaacs, T. Ray, and W. Smith, “Infeasibility Driven Evolutionary Algorithm (IDEA) for Engineering Design Optimization,” in Proceedings of the 21st Australasian Joint Conference on Artificial Intelligence AI-08, 2008, pp. 104–115.

[2] T. Ray, H. K. Singh, A. Isaacs, and W. Smith, “Infeasibility driven evolutionary algorithm for constrained optimization,” in Constraint Handling in Evolutionary Optimization, ser. Studies in Computational Intelligence. Springer, in press.

[3] I. Hatzakis and D. Wallace, “Dynamic multi-objective optimization with evolutionary algorithms: a forward-looking approach,” in GECCO ’06: Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation. New York, NY, USA: ACM, 2006, pp. 1201–1208.

[4] T. Nguyen and X. Yao, “Benchmarking and solving dynamic constrained problems,” in IEEE Congress on Evolutionary Computation (CEC) 2009, accepted.

[5] J. Branke, “Memory enhanced evolutionary algorithms for changing optimization problems,” in Proceedings of the 1999 Congress on Evolutionary Computation (CEC 99), vol. 3, 1999, pp. 1874–1882.

[6] S. Baluja, “Population-based incremental learning: A method for integrating genetic search based function optimization and competitive learning,” Carnegie Mellon University, Pittsburgh, PA, USA, Tech. Rep., 1994.

[7] S. Yang and X. Yao, “Experimental study of population-based incremental learning algorithms for dynamic optimization problems,” Soft Computing: A Fusion of Foundations, Methodologies and Applications, vol. 9, no. 11, 2005.

[8] S. Yang, “Non-stationary problem optimization using the primal-dual genetic algorithm,” in The 2003 Congress on Evolutionary Computation (CEC), vol. 3, pp. 2246–2253, December 2003.

[9] N. Mori, H. Kita, and Y. Nishikawa, “Adaptation to a changing environment by means of the thermodynamical genetic algorithm,” Lecture Notes in Computer Science, vol. 1141, pp. 513–522, 1996.

[10] P. Moscato, “On evolution, search, optimization, genetic algorithms and martial arts: Towards memetic algorithms,” California Institute of Technology, Tech. Rep., 1989.

[11] T. Ray, A. Isaacs, and W. Smith, “A Memetic Algorithm for Dynamic Multiobjective Optimization,” in Multi-objective Memetic Algorithms, ser. Studies in Computational Intelligence. Springer, 2008 (in press).

[12] S. Yang and X. Yao, “Population-based incremental learning with associative memory for dynamic environments,” IEEE Transactions on Evolutionary Computation, vol. 12, no. 5, pp. 542–561, Oct. 2008.

[13] K. Deb, A. Pratap, S. Agarwal, and T. Meyarivan, “A fast and elitist multiobjective genetic algorithm: NSGA-II,” IEEE Transactions on Evolutionary Computation, vol. 6, pp. 182–197, 2002.

[14] K. Deb and S. Agrawal, “Simulated binary crossover for continuous search space,” Complex Systems, vol. 9, pp. 115–148, 1995.



[15] C. A. Coello Coello, “Constraint-handling using an evolutionary multiobjective optimization technique,” Civil Engineering and Environmental Systems, vol. 17, no. 4, pp. 319–346, 2000.

[16] D. A. G. Vieira, R. L. S. Adriano, L. Krahenbuhl, and J. A. Vasconcelos, “Handling constraints as objectives in a multiobjective genetic based algorithm,” Journal of Microwaves and Optoelectronics, vol. 2, no. 6, pp. 50–58, Dec. 2002.

[17] S. B. Hamida and M. Schoenauer, “An adaptive algorithm for constrained optimization problems,” in Lecture Notes in Computer Science 1917: Parallel Problem Solving from Nature PPSN-VI. Springer Berlin/Heidelberg, 2000, pp. 529–538.

[18] A. Isaacs, T. Ray, and W. Smith, “Blessings of maintaining infeasible solutions for constrained multi-objective optimization problems,” in Proceedings of the IEEE Congress on Evolutionary Computation (CEC 2008), Hong Kong, 2008, pp. 2785–2792.

[19] P. Hingston, L. Barone, S. Huband, and L. While, “Multi-level ranking for constrained multi-objective evolutionary optimisation,” in PPSN IX: Proceedings of the 9th International Conference on Parallel Problem Solving from Nature. Springer-Verlag Berlin Heidelberg, 2006, pp. 563–572.

[20] C. Floudas, P. Pardalos, C. Adjiman, W. Esposito, Z. Gumus, S. Harding, J. Klepeis, C. Meyer, and C. Schweiger, Handbook of Test Problems in Local and Global Optimization. Kluwer, Dordrecht, The Netherlands, 1999.

[21] J. J. Liang, T. P. Runarsson, E. Mezura-Montes, M. Clerc, P. Suganthan, C. A. Coello Coello, and K. Deb, “Problem definitions and evaluation criteria for the CEC 2006 special session on constrained real-parameter optimization,” Nanyang Technological University, Singapore, Tech. Rep., 2006.

[22] Y. Jin and J. Branke, “Evolutionary optimization in uncertain environments - a survey,” IEEE Transactions on Evolutionary Computation, vol. 9, no. 3, pp. 303–317, June 2005.

[23] J. Branke and H. Schmeck, “Designing evolutionary algorithms for dynamic optimization problems,” in Theory and Application of Evolutionary Computation: Recent Trends. Springer-Verlag Berlin, 2002, pp. 239–262.

[24] R. Tinos and S. Yang, “A self-organizing random immigrants genetic algorithm for dynamic optimization problems,” Genetic Programming and Evolvable Machines, vol. 8, no. 3, pp. 255–286, 2007.
