
Proceedings of the International Conference on Industrial Engineering and Operations Management Riyadh, Saudi Arabia, November 26-28, 2019

© IEOM Society International

Application of Modified NSGA II (M-NSGA II) Algorithm to Large Scale Problems

Mohamed H. Gadallah, Mechanical Design & Production Department

Faculty of Engineering, Cairo University [email protected]

Abdel Rahman Ali M. Ahmed Mechanical Design & Production Department

Faculty of Engineering, Cairo University [email protected]

Abstract In this paper, the multi-objective optimization problem is studied further with respect to quality indices. In particular, the maximum Pareto front error, accuracy of the Pareto frontier, spacing, overall spread, objective spread, maximum spread, number of distinct choices, cluster, hyper-area, dominant area, non-dominated evaluation metric, and crowding distance are employed to assess the Pareto fronts of real case studies. Two algorithms are employed to solve the multi-objective problem, namely the NSGA II and SPEA2 algorithms. These performance indices are analyzed over a set of benchmark problems and conclusions are drawn: five of the twelve indices can be eliminated. In addition, a new approach to handle both constrained and unconstrained multi-objective problems is proposed, based on a modification to the fast elitist multi-objective genetic algorithm NSGA II (Non-dominated Sorting Genetic Algorithm II). The approach overcomes shortcomings of previous constraint-handling approaches such as the penalty function approach, the Jimenez-Verdegay-Gomez-Skarmeta approach, and the Ray-Tai-Seow method. Comparative studies are conducted using a group of quality indices [2,19] to compare the new approach (M-NSGA II) with two other algorithms: regular NSGA and NSGA II with a penalty function. Results show that the new approach is the best of the three.

Keywords: Multi-objective Problems, NSGA II Algorithm, Quality Indices.

Nomenclature
MOEAs    Multi-Objective Evolutionary Algorithms
CLu(P)   Number of clusters on the obtained Pareto frontier
Do       Scaled dominant area
AC       Accuracy of the observed Pareto frontier
NDEM     Non-Dominated Evaluation Metric
MPFE     Maximum Pareto-optimal front error
NDC      Number of distinct choices
S        Spacing
∆        Spread
OS       Overall Pareto spread



1. INTRODUCTION Unlike single-objective optimization, multi-objective optimization faces the challenge of finding an optimal trade-off among conflicting objectives [20]. The quality of the Pareto front varies from one technique to another, and two goals must be achieved simultaneously in multi-objective optimization: convergence close to the true Pareto front, and diversity so that the trade-off solutions cover a variety of properties [1]. Many quality indices have been developed to assess the resulting solutions. Convergence, diversity of solutions, time to reach a set of solutions, and the ability to handle constrained or complex problems are all measures of the goodness of an algorithm. In the first study of this paper, a numerical analysis of optimization quality indices, twelve quality indices are used to assess the quality of Pareto-optimal sets obtained by two algorithms, NSGA II and SPEA2. In the second study, a modification to the NSGA II algorithm, the same twelve indices are used to compare the quality of solutions obtained by the new approach, M-NSGA II, against two other algorithms, regular NSGA and NSGA with a penalty function approach, with respect to convergence, diversity, and time, to assist practicing engineers in interpreting different solutions. In addition, this paper develops a new selection procedure that modifies regular NSGA II with respect to constraint handling. Four benchmark problems are used for comparative analysis. A background is given next [8].

2. EVOLUTIONARY ALGORITHMS [EAs] Evolutionary algorithms (EAs) mimic natural evolutionary principles to construct search and optimization procedures, and they differ from classical search and optimization procedures in a variety of ways [1,3,14]. EAs seem particularly suitable for multi-objective optimization problems because they deal simultaneously with a set of possible solutions (the so-called population). This allows several members of the Pareto-optimal set to be found in a single run of the algorithm, instead of requiring a series of separate runs as with traditional mathematical programming techniques. Additionally, EAs are less susceptible to the shape or continuity of the Pareto front and can easily deal with discontinuous or concave fronts [16]. EAs can be classified into two main categories. Non-elitist algorithms process a population of solutions in every iteration; examples include the Vector Evaluated Genetic Algorithm (VEGA), the Weight-Based Genetic Algorithm (WBGA), the Multiple Objective Genetic Algorithm (MOGA), and the Niched-Pareto Genetic Algorithm (NPGA). Elitist algorithms use an elite-preserving operator that favors the elites of a population by giving them an opportunity to be carried directly over to the next generation; examples include the Strength Pareto Evolutionary Algorithm (SPEA/SPEA2) and the elitist non-dominated sorting genetic algorithm (NSGA II) [15]. 3. QUALITY INDICES IN MULTI-OBJECTIVE OPTIMIZATION Quality indices are the tools by which the solver can judge the goodness, or quality, of the solutions obtained by an algorithm. They can be categorized into three types [5,9,10,13]:
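The non-dominated sorting at the heart of elitist MOEAs such as NSGA II can be sketched as follows. This is an illustrative sketch, not the authors' MATLAB code: Pareto dominance for minimization, and a simple pass that peels the population into successive fronts.

```python
# Sketch (not the authors' code): Pareto dominance for minimization and a
# simple non-dominated sorting pass, as used by elitist MOEAs like NSGA II.

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(points):
    """Partition objective vectors into fronts; front 0 is the Pareto set."""
    remaining = list(range(len(points)))
    fronts = []
    while remaining:
        # A point belongs to the current front if nothing left dominates it.
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

fronts = non_dominated_sort([(1, 5), (2, 3), (3, 1), (4, 4)])
# (4, 4) is dominated by (2, 3); the first three points are mutually non-dominated.
```

Production implementations use Deb's fast non-dominated sort, which reuses dominance counts to reach O(MN^2) instead of re-scanning the population for every front; the simple version above is enough to show the ranking idea.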

3.1 METRICS EVALUATING CLOSENESS TO THE PARETO FRONT: Error Ratio (ER), Two-Set Coverage Metric, Generational Distance (GD), Maximum Pareto-Optimal Front Error, and Accuracy of the Pareto Frontier.

3.2 METRICS EVALUATING DIVERSITY AMONG NON-DOMINATED SOLUTIONS: Spacing (S), Spread, Overall Pareto Spread, kth-Objective Pareto Spread, Maximum Spread, Number of Distinct Choices NDCµ(P), Cluster (CLµ), and Crowding Distance.

3.3 METRICS EVALUATING BOTH CLOSENESS AND DIVERSITY: Hyper-Area or Hypervolume (HV), and the Non-Dominated Evaluation Metric.
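As a concrete example of a diversity metric from category 3.2, Schott's spacing S can be sketched as below. This uses the common nearest-neighbour Manhattan-distance definition, which may differ in detail from the exact formula used in this paper: S is the sample standard deviation of each point's distance to its nearest neighbour in objective space, so S = 0 means a perfectly evenly spaced front.

```python
# Sketch of Schott's spacing metric S (one common definition; the paper's
# exact formula may differ). d_i is each point's nearest-neighbour Manhattan
# distance in objective space; S is the standard deviation of the d_i.
import math

def spacing(front):
    d = [min(sum(abs(a - b) for a, b in zip(p, q))
             for j, q in enumerate(front) if j != i)
         for i, p in enumerate(front)]
    mean = sum(d) / len(d)
    return math.sqrt(sum((x - mean) ** 2 for x in d) / (len(d) - 1))

# Evenly spaced points give S = 0; uneven spacing gives S > 0.
print(spacing([(0, 3), (1, 2), (2, 1), (3, 0)]))  # 0.0
```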

4. NUMERICAL ANALYSIS OF OPTIMIZATION QUALITY INDICES [1,2,11,12,23] Twelve quality metrics (objective spread, overall spread (O.S), maximum spread, crowding distances, scaled H.V, scaled dominant area, accuracy of the observed Pareto frontier, non-dominated evaluation metric, maximum Pareto-optimal front error, NDC, spacing, and CLu(P)) are used to evaluate the goodness of the observed Pareto frontier for 30 benchmark problems solved by two well-known algorithms, NSGA II and SPEA2. Both techniques are coded in MATLAB [17]. The results show the following ranges: overall spread = 1.0 (constant); dominant area = 0.03063 to 0.66716; maximum spread = 0.34815 to 2458.828; crowding distances = 0.069665 to ∞; AC = 0.29877 to 183.7668; HV = 0.002 to ∞; NDEM = 13 to 50; MPFE = 0 to 5.6061; NDC = 3 to 36; spacing = 0.29422 to 1283; and CLu(P) = 1.3 to 1.6.


Figure 1 shows the set of 30 benchmark problems [1,2,4,6,7] used for comparative analysis, plotted against the number of decision variables, objective functions, and constraints.

Figure 1: Problem No. vs. No. of Decision Variables, Objective Functions, and Constraints

Figure 2 gives the maximum spread obtained using NSGA II and SPEA2. Generally, NSGA II produces a larger maximum spread than SPEA2 for problems 1, 3, 5, 6, 7, 8, 9, 10, 11, 14, 15, 16, 18, 19, 20, 21, 22, 23, 24, 26, 27, and 29; for problems 2, 4, 12, 13, 17, 25, 28, and 30, SPEA2 gives a higher maximum spread than NSGA II.

Figure 2: Maximum Spread vs Problem number

Figure 3 shows the crowding distances using NSGA II and SPEA2 for the set of 30 problems. Generally, the crowding distances are smaller with NSGA II than with SPEA2 for problems 1, 2, 4, 7, 12, 14, 22, 25, and 28; the opposite is true for problems 3, 5, 6, 8, 9, 10, 11, 13, 15, 16, 17, 18, 19, 20, 21, 23, 24, 26, 27, 29, and 30.

Figure 3: Crowding distances vs Problem number
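The crowding distance plotted above is NSGA II's density estimate (Deb et al.). A minimal sketch, not the authors' MATLAB code, is given below; boundary solutions receive an infinite distance, which is why some crowding-distance values in the results are reported as Inf.

```python
# Sketch of NSGA II's crowding distance: per objective, boundary points get
# infinity and each interior point accumulates the normalized gap between
# its two neighbours; the per-objective contributions are summed.

def crowding_distance(front):
    n = len(front)
    dist = [0.0] * n
    for m in range(len(front[0])):                    # loop over objectives
        order = sorted(range(n), key=lambda i: front[i][m])
        lo, hi = front[order[0]][m], front[order[-1]][m]
        dist[order[0]] = dist[order[-1]] = float("inf")
        if hi > lo:
            for k in range(1, n - 1):
                gap = front[order[k + 1]][m] - front[order[k - 1]][m]
                dist[order[k]] += gap / (hi - lo)
    return dist

print(crowding_distance([(0, 3), (1, 2), (2, 1), (3, 0)]))
# boundary points -> inf; interior points get finite sums
```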



Figure 4 gives the hypervolume (hyper-area) using NSGA II and SPEA2 for the 30 test problems. NSGA II usually gives a higher hypervolume than SPEA2, especially for problems 1, 2, 3, 4, 7, 9, 16, 18, 21, 22, 23, 24, 27, 28, and 30; the opposite is true for problems 5, 6, 8, 10, 11, 12, 13, 14, 15, 17, 19, 20, 25, 26, and 29.

Figure 4: Hyper Volume (Hyper Area) vs Problem number
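For a two-objective minimization front, the hypervolume compared above can be computed by sweeping the front in ascending order of the first objective and accumulating dominated rectangles. The sketch below is illustrative: the reference point is an assumption, and the paper reports a scaled variant of HV.

```python
# Sketch of the 2-D hypervolume (hyper-area) indicator for a minimization
# front: the area dominated by the front, bounded by a reference point.
# The reference point here is an assumption, not a value from the paper.

def hypervolume_2d(front, ref):
    pts = sorted(front)            # ascending in f1; on a front, f2 descends
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        hv += (ref[0] - f1) * (prev_f2 - f2)   # one horizontal slab per point
        prev_f2 = f2
    return hv

print(hypervolume_2d([(1, 3), (2, 2), (3, 1)], ref=(4, 4)))  # 6.0
```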

Figure 5 shows the dominant area for the 30 test problems using NSGA II and SPEA2. The dominant area is lower with NSGA II than with SPEA2 for problems 1, 2, 3, 7, 9, 11, 16, 18, 21, 22, 23, 24, 27, and 28; the opposite is true for problems 4, 5, 6, 8, 10, 12, 13, 14, 15, 17, 19, 20, 25, 26, 29, and 30.

Figure 5: Dominant Area vs Problem number

Figure 6 gives the accuracy of the observed Pareto frontier using the NSGA II and SPEA2 algorithms. Generally, the accuracy is higher with NSGA II than with SPEA2 for problems 1, 2, 3, 4, 6, 7, 8, 10, 14, 17, 18, 20, 21, 22, 23, 24, 25, 26, 28, 29, and 30; the opposite is true for problems 5, 9, 11, 12, 13, 15, 16, 19, and 27.

Figure 6: Accuracy of Observed Pareto Frontier vs Problem number



Figure 7 gives the non-dominated evaluation metric using the NSGA II and SPEA2 algorithms. Generally, the metric is the same for all problems except 6, 9, 11, 12, 13, 16, 27, and 28.

Figure 7: Non-Dominated Evaluation Metric vs Problem number

Figure 8: Maximum Pareto-Optimal Front Error vs Problem number

Figure 8 shows the maximum Pareto-optimal front error vs. problem number. NSGA II gives a smaller (better) maximum Pareto-optimal front error for all problems except 5, 9, 11, 12, 13, 14, 16, 17, 18, 20, and 23.

Figure 9: Number of Distinct Choices vs Problem number



Figure 9 shows the number of distinct choices vs. problem number. In most cases, NSGA II gives a smaller (worse) number of distinct choices than SPEA2.

Figure 10: Spacing vs Problem number

Figure 10 shows spacing vs. problem number. In most cases, NSGA II gives higher (better) spacing than SPEA2.

Figure 11: Number of Cluster on the obtained Pareto Frontier vs Problem number

Figure 11 shows the number of clusters on the obtained Pareto frontier vs. problem number. In most cases, NSGA II gives a higher (worse) number of clusters than SPEA2.

Figure 12: Maximum Spread vs No. of decision variables for the 30 bench Mark problems



Figure 12 plots the maximum spread against the number of decision variables over the 30 benchmark problems. It can be seen that the value of the maximum spread is not affected by the number of decision variables in the objective functions.

Figure 13: Crowding Distances vs No. of decision variables for the 30 bench Mark problems

From Figure 13, the crowding distance is not affected by the number of decision variables; for the same number of decision variables it can be high or low.

Figure 14: Hyper Volume vs No. of decision variables for the 30 bench Mark problems

Figure 14 shows that the hypervolume is not affected by the number of decision variables; it can be lower or higher for the same number of decision variables.

Figure 15: Accuracy of Pareto Frontier vs No. of decision variables for the 30 bench Mark problems


Figure 15 shows that changing the number of decision variables has no real effect on the value of the accuracy of the Pareto frontier.

Figure 16: NDEM vs No. of decision variables for the 30 bench Mark problems

From Figure 16, NDEM can take both high and low values for the same number of decision variables. Also, SPEA2 gives an almost constant NDEM over the 30 benchmark problems.

Figure 17: Maximum Pareto Front Error vs No. of decision variables for the 30 bench Mark problems

Figure 17 shows that changing the number of decision variables produces no real change in the maximum Pareto front error; for the same number of decision variables, MPFE can take both high and low values.

Figure 18: Number of Distinct Choices vs No. of decision variables for the 30 bench Mark problems

From Figure 18, it can be seen that NDC takes different values for the same number of decision variables; therefore, the number of decision variables does not affect the NDC values.


Figure 19: Spacing vs No. of decision variables for the 30 bench Mark problems

Figure 19 shows that the number of decision variables is not a factor that affects the value of spacing.

Figure 20: Clu (P) vs No. of decision variables for the 30 bench Mark problems

Figure 20 shows that the number of decision variables is not a factor that affects the value of CLu(P).

5. CASE STUDIES
The regular NSGA II, NSGA II with the penalty approach, and the developed M-NSGA II are applied to three benchmark problems. To measure the quality of the new approach relative to the other algorithms, the following aspects are discussed:
1. Comparison among the three approaches with respect to convergence, diversity, and the number of iterations needed to reach the Pareto-optimal set.
2. Comparison among the three approaches with respect to the quality indices, used as a measure of convergence and diversity of solutions.

Case Study 1: MOPC1 [2]

Minimize:   f1(x, y) = 4x^2 + 4y^2
            f2(x, y) = (x − 5)^2 + (y − 5)^2
Subject to: g1(x, y) = (x − 5)^2 + 4.1y^2 − 25 ≤ 0
            g2(x, y) = −(x − 8)^2 − (y + 3)^2 + 7.7 ≤ 0
            0 ≤ x ≤ 5, 0 ≤ y ≤ 5

Table 1 gives a comparison between the three approaches over the set of quality indices. Figure 21 gives the Pareto front of the three approaches.
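For reference, the MOPC1 problem can be written out as plain code. This is an illustrative sketch, not the authors' MATLAB implementation; constraint violations are returned as non-negative values so that zero means feasible.

```python
# MOPC1 from the case study, written out for reference (not the authors'
# code). Feasibility requires g1 <= 0 and g2 <= 0; violations are clipped
# to non-negative values, so (0, 0) violation means a feasible point.

def mopc1(x, y):
    f1 = 4 * x**2 + 4 * y**2
    f2 = (x - 5)**2 + (y - 5)**2
    g1 = (x - 5)**2 + 4.1 * y**2 - 25
    g2 = -(x - 8)**2 - (y + 3)**2 + 7.7
    return (f1, f2), (max(0.0, g1), max(0.0, g2))

objs, viol = mopc1(1.0, 1.0)
# (1, 1) is feasible: g1 = 16 + 4.1 - 25 < 0 and g2 = -49 - 16 + 7.7 < 0.
```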


Figure 21: M-NSGA II vs. NSGA II with Penalty and Regular NSGA II for MOPC1

Table 1: Quality indices obtained by the developed M-NSGA II, NSGA + Penalty, and Regular NSGA for MOPC1

MOPC1                     M-NSGA II            NSGA II + Penalty     Regular NSGA II
Quality Indices           1st Obj.   2nd Obj.  1st Obj.   2nd Obj.   1st Obj.   2nd Obj.
Min                       256        9         52.9049    6.4185     99.9485    0.015932
Max                       355.9447   33.9961   123.5625   12.9019    193.9704   20.2255
Range                     99.9447    24.9961   70.6576    6.4834     94.0219    20.2096
StD.                      32.7995    7.861     19.9953    1.9925     28.2308    5.1023
Mean                      292.6528   17.0758   83.3958    8.7153     147.8387   4.4134
Obj. Spread               1          1         1          1          1          1
O.S                       1                    1                     1
Max. Spread               72.8483              50.1724               68.002
Crowding distances        Inf                  2.5165                10.8654
Scaled H.V                0.8243               0.70035               0.76159
Scaled Dominant Area      0.15839              0.28141               0.21154
Accuracy of observed
  Pareto frontier         57.7957              54.8276               37.2242
Non-Dominated
  Evaluation Metric       50                   50                    47
Maximum Pareto-Optimal
  Front Error             0                    0                     0.19187
NDC (at u = 0.1)          31                   31                    30
Spacing                   140.5009             40.1027               74.8671
CLu(P)                    1.6129               1.6129                1.5667

Figure 22 shows the behavior of each algorithm over the entire solution space with respect to diversity and convergence vs. iterations. M-NSGA II is clearly the fastest of the three algorithms to converge close to the true Pareto front and has the best diversity as well; it goes directly to the optimal solution set.

Figure 22: Regular NSGA, NSGA + Penalty, M-NSGA II vs. iterations (MOPC1)



Quality indices for NSGA II, M-NSGA II, and NSGA II with penalty function: Figure 23 shows the behavior of the different algorithms over the entire solution space with respect to the number of iterations. The developed M-NSGA II is the fastest of the three algorithms to converge to the true Pareto front and has the best diversity as well.

The quality indices are plotted on the x-axis in the following order: 1-OS1, 2-OS, 3-Max Spread, 4-Crowding distances, 5-HV, 6-Do, 7-AC, 8-NDEM, 9-MPFE, 10-NDC, 11-Spacing, 12-CLu(P).

Figure 23. Quality indices plot for MOPC1

Figure 23 shows that M-NSGA II is superior to the other algorithms with respect to maximum spread, spread, and hypervolume. It can also be concluded that NSGA II with the penalty approach can miss the solution region in its search.

Case Study 2: BinhKorn Function [2]

Minimize:   f1(x, y) = 4x^2 + 4y^2
            f2(x, y) = (x − 5)^2 + (y − 5)^2
Subject to: g1(x, y) = (x − 5)^2 + y^2 ≤ 25
            g2(x, y) = (x − 8)^2 + (y + 3)^2 ≥ 7.7
            0 ≤ x ≤ 5, 0 ≤ y ≤ 3

Table 2 gives a comparison between the three approaches over the set of quality indices, while Figure 24 gives the Pareto front of the three approaches.
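The "NSGA II + Penalty" baseline represents the classical penalty function approach to constraint handling. As a sketch of what that transformation does, a static penalty can be added to every objective in proportion to the total constraint violation; the penalty weight R below is an illustrative assumption, not a value from the paper.

```python
# Sketch of the static penalty transformation behind the "NSGA II + Penalty"
# baseline, applied to the BinhKorn problem above. The weight R is an
# assumption for illustration, not the paper's value.

def binh_korn(x, y):
    f = (4 * x**2 + 4 * y**2, (x - 5)**2 + (y - 5)**2)
    viol = (max(0.0, (x - 5)**2 + y**2 - 25),         # g1: must be <= 25
            max(0.0, 7.7 - (x - 8)**2 - (y + 3)**2))  # g2: must be >= 7.7
    return f, viol

def penalized(f, viol, R=1000.0):
    """Inflate every objective by a weighted sum of constraint violations."""
    penalty = R * sum(viol)
    return tuple(fi + penalty for fi in f)

f, viol = binh_korn(0.0, 0.0)          # feasible point, so no penalty
print(penalized(f, viol))              # (0.0, 50.0)
```

A drawback of this scheme, consistent with the comparisons in this paper, is its sensitivity to R: too small and infeasible points survive selection, too large and the search is pushed away from the constraint boundary where part of the Pareto front may lie.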

Figure 24. M-NSGA II, vs. NSGA II with Penalty and Regular NSGA II for BinhKorn Function



Table 2: Reduced quality indices obtained by the developed M-NSGA II, NSGA + Penalty, and Regular NSGA for the BinhKorn Function

BinhKorn                  M-NSGA II               NSGA + Penalty         Regular NSGA
Quality Indices           1st Obj.    2nd Obj.    1st Obj.   2nd Obj.    1st Obj.   2nd Obj.
Min                       0.0032746   7.0687e-006 14.2925    2.0253      0.79522    0.0011349
Max                       200.0893    49.6888     211.9004   51.9105     200.9968   44.0043
Range                     200.086     49.6888     197.6079   49.8852     200.2016   44.0032
StD.                      60.4636     15.3539     60.1117    16.0311     61.6188    13.7956
Mean                      63.2803     17.8917     84.8387    18.3387     71.0469    15.2682
Obj. Spread               1           1           1          1           1          1
O.S                       1                       1                      1
Max. Spread               145.7796                144.1135               144.943
Crowding distances        4.8357                  3.004                  5.5304
Scaled H.V                0.82112                 0.81879                0.80437
Scaled Dominant Area      0.16055                 0.1618                 0.17833
Accuracy of observed
  Pareto frontier         54.5631                 51.5096                57.808
Non-Dominated
  Evaluation Metric       50                      50                     50
Maximum Pareto-Optimal
  Front Error             0.24156                 0.69844                0.374
NDC (at u = 0.1)          33                      34                     31
Spacing                   49.4608                 55.0671                52.5274
CLu(P)                    1.5152                  1.4706                 1.6129

Figure 25 shows the behavior of the different algorithms over the entire solution space with respect to the number of iterations. The developed M-NSGA II is the fastest of the three algorithms to converge to the true Pareto front and has the best diversity as well.

Figure 25: Regular NSGA, NSGA + Penalty, and the developed M-NSGA II with respect to the number of iterations. Again, M-NSGA II gives better and faster convergence and diversity than regular NSGA II or NSGA II with penalty functions.

Quality indices for NSGA II, M-NSGA II, NSGA II with Penalty function

Figure 26 compares the quality indices obtained by regular NSGA II, M-NSGA II, and NSGA II + penalty function for the BinhKorn function. The quality indices are distributed on the x-axis in the following order: 1-OS1, 2-OS, 3-Max Spread, 4-Crowding distances, 5-HV, 6-Do, 7-AC, 8-NDEM, 9-MPFE, 10-NDC, 11-Spacing, 12-CLu(P). Figure 26 shows that M-NSGA II is superior to the other algorithms with respect to maximum spread, crowding distance, spread, and hypervolume. It can also be concluded that NSGA II with the penalty approach can miss the solution region in its search.


Figure 26. Quality indices plot for BinhKorn Function

Case Study 3: SRN [2]

Minimize:   f1(x, y) = (x − 2)^2 + (y − 1)^2 + 2
            f2(x, y) = 9x + (y − 1)^2
Subject to: g1(x, y) = −x − y + 225
            g2(x, y) = −x + 3y − 10
            0 ≤ x ≤ 6, 0 ≤ y ≤ 2.25

Table 3 gives a comparison between the three approaches over the set of quality indices. Figure 27 shows the observed Pareto front for the SRN test problem using the three algorithms.

Figure 27 M-NSGA II vs. NSGA II with Penalty and Regular NSGA II for SRN Function

Table 3: Reduced quality indices obtained by M-NSGA II, NSGA + Penalty, and Regular NSGA for the SRN Function

SRN                       M-NSGA II            NSGA + Penalty        Regular NSGA
Quality Indices           1st Obj.   2nd Obj.  1st Obj.   2nd Obj.   1st Obj.   2nd Obj.
Min                       14.1001    -54.6041  53.6315    -53.6533   9.6329     -52.8773
Max                       495.3465   38.8549   130.0986   19.0137    65.2305    -6.4547
Range                     481.2464   93.459    76.467     72.667     55.5976    46.4226
StD.                      166.8636   30.5476   25.0216    22.5332    16.6956    13.9729
Mean                      198.5193   -17.3044  88.3413    -23.2208   33.8199    -30.7653
Obj. Spread               1          1         1          1          1          1
O.S                       1                    1                     1
Max. Spread               346.6502             74.5912               51.216
Crowding distances        Inf                  4.0856                1.7084
Scaled H.V                0.74119              0.62813               0.57818
Scaled Dominant Area      0.23618              0.34428               0.39555
Accuracy of observed
  Pareto frontier         44.1954              36.2372               38.0715
Non-Dominated
  Evaluation Metric       50                   50                    50
Maximum Pareto-Optimal
  Front Error             0                    0.069263              0.42172
NDC (at u = 0.1)          30                   32                    33
Spacing                   161.2624             60.8617               35.8879
CLu(P)                    1.6667               1.5625                1.5152



Comparing the three approaches with respect to the number of iterations, Figure 28 shows the behavior of the different algorithms over the entire solution space. The developed M-NSGA II is the fastest of the three algorithms to converge to the true Pareto front and has the best diversity as well.

Figure 28 Regular NSGA II, NSGA II + Penalty, and M-NSGA II with respect to iterations (SRN)

Quality indices for NSGA II, M-NSGA II, and NSGA II with penalty function: Figure 29 compares the quality indices for the SRN function. It shows that M-NSGA II is superior to the other algorithms with respect to maximum spread, crowding distance, spread, Pareto accuracy, spacing, and hypervolume. It can also be concluded that NSGA II with the penalty approach can miss the solution region in its search.

Figure 29 Quality indices plot for SRN Function

The quality indices are plotted on the x-axis in the following order: 1-OS1, 2-OS, 3-Max Spread, 4-Crowding Distances, 5-HV, 6-Do, 7-AC, 8-NDEM, 9-MPFE, 10-NDC, 11-Spacing, and 12-CLu(P).

6. CONCLUSION

The regular NSGA II is modified further to overcome the shortcomings of previous approaches with respect to constraint handling. Comparative studies are conducted using a group of quality indices to compare the developed algorithm (M-NSGA II) with two other algorithms: the regular NSGA II and NSGA II with penalty function. Results show that the new modification is the best among the three approaches. Further results include:

1. Twelve indices are studied against a set of 30 benchmark problems. Several observations can be made:

• The number of clusters is almost the same over the 30 test problems, which makes this measure of no significance.
• The objective spread is always equal to one for all test problems, as is the overall spread, which makes these measures of little importance.
• The accuracy of the observed Pareto frontier depends on the hyper volume and dominant area; therefore, it is not an independent measure of Pareto front goodness.
• The crowding distances are sometimes equal to infinity, and sometimes very similar for problems of different natures. It is advisable to discard this measure as well.
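The infinite values noted for the crowding distance follow directly from its definition in NSGA II [16], where the boundary solutions of each objective are assigned infinity. A minimal sketch (the sample front is hypothetical, not from the benchmark set):

```python
def crowding_distance(front):
    """Crowding distance for a list of objective vectors forming one
    non-dominated front, as defined by Deb et al. [16]."""
    n = len(front)
    m = len(front[0])
    dist = [0.0] * n
    for k in range(m):
        # Sort solution indices by objective k.
        order = sorted(range(n), key=lambda i: front[i][k])
        fmin, fmax = front[order[0]][k], front[order[-1]][k]
        # Boundary solutions are always assigned infinity, which is
        # why this index can return Inf in practice.
        dist[order[0]] = dist[order[-1]] = float("inf")
        if fmax == fmin:
            continue
        for j in range(1, n - 1):
            # Normalized distance between the two neighbors.
            dist[order[j]] += (front[order[j + 1]][k]
                               - front[order[j - 1]][k]) / (fmax - fmin)
    return dist

front = [(1.0, 4.0), (2.0, 2.0), (3.0, 1.0)]
print(crowding_distance(front))  # boundary points get inf
```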


2. NSGA II is better than SPEA 2 with respect to diversity and convergence for unconstrained quadratic objective functions [MOP1, MOP3, MOP7, ZDT2, Schaffer2, and OKA2], and unconstrained linear and sinusoidal objective functions [MOP6, and TP_KUR].

3. NSGA II is better than SPEA 2 with respect to diversity and convergence for linear and n-power objective functions with linear constraints [CONSTR_Ex, and Comet], quadratic objective function with quadratic constraints [MOPC1, Binh-Korn, Chakong and Haimes, and DTLZ1-3], and inverse power exponential quadratic objective functions with rational constraints [CTP1].

4. NSGA II is better than SPEA2 with respect to convergence and worse than SPEA2 with respect to diversity for series functions with or without constraints [MOP2, MOP4, Osyczka and Kundu, and Test function].

5. NSGA II is better than SPEA2 with respect to diversity and worse than SPEA2 with respect to convergence for problems with rational objective functions with linear or without constraints [MOP5, ZDT1, ZDT3, and DTLZ7], for quadratic objective functions with quadratic constraints [MOPC2, DTLZ2-2, and DTLZ2-3], and for unconstrained quadratic objective functions [MOPC3].

6. SPEA2 is better than NSGA II with respect to diversity and convergence for problems consisting of linear and sinusoidal objective functions or constraints [ZDT4, and ZDT6], and for linear and n-power objective functions with linear constraints [Two Bar Truss].

7. The developed M-NSGA II is better than both the regular NSGA II and the NSGA II with penalty approach with respect to convergence time, convergence to the Pareto optimal set, and diversity over the feasible space.
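The paper's specific modification is not reproduced in this section, but the baseline against which constraint-handling schemes such as the penalty approach are usually judged is the constrained-dominance rule of Deb et al. [16]. A minimal sketch, where `cv` denotes the total constraint violation (zero when feasible) and objectives are minimized:

```python
def constrained_dominates(a_obj, a_cv, b_obj, b_cv):
    """True if solution a constrained-dominates solution b
    (Deb's rule from NSGA II [16])."""
    if a_cv == 0 and b_cv > 0:
        return True              # feasible beats infeasible
    if a_cv > 0 and b_cv > 0:
        return a_cv < b_cv       # both infeasible: lower violation wins
    if a_cv > 0:
        return False             # a infeasible, b feasible
    # Both feasible: ordinary Pareto dominance.
    no_worse = all(x <= y for x, y in zip(a_obj, b_obj))
    strictly_better = any(x < y for x, y in zip(a_obj, b_obj))
    return no_worse and strictly_better

print(constrained_dominates((1.0, 2.0), 0.0, (2.0, 3.0), 0.0))  # True
print(constrained_dominates((5.0, 5.0), 0.0, (1.0, 1.0), 0.3))  # True
```

Because this rule needs no penalty parameter, it avoids the misdirected search that the penalty-function variant exhibits in the comparisons above.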

REFERENCES

[1] Deb, K., "Multi-Objective Optimization using Evolutionary Algorithms", John Wiley & Sons, Inc., New York, USA, ISBN: 047187339X, pp. 1-256, 2001.
[2] Coello, C. A., Lamont, G. B., Van Veldhuizen, D. A., "Evolutionary Algorithms for Solving Multi-Objective Problems", Springer Science+Business Media, LLC, 2nd Edition, pp. 198-235, 2007.
[3] Wu, J., Azarm, S., "Metrics for Quality Assessment of a Multi-objective Design Optimization Solution Set", Transactions of the ASME, Journal of Mechanical Design, Vol. 123, No. 1, pp. 1-8, 2001.
[4] Deb, K., Thiele, L., Laumanns, M., Zitzler, E., "Scalable Test Problems for Evolutionary Multi-Objective Optimization", TIK Technical Report No. 112, July 17, pp. 1-27, 2001.
[5] Jain, A., Dubes, R. C., "Algorithms for Clustering Data", Prentice-Hall, Inc., Upper Saddle River, NJ, USA, pp. 15-16, 1998.
[6] Hillier, F. S., Lieberman, G. J., "Introduction to Operations Research", McGraw-Hill Higher Education, pp. 658-703, 2001.
[7] Magrab, E. B., Azarm, S., Balachandran, B., Duncan, J. H., Herold, K. E., Walsh, G. C., "An Engineer's Guide to MATLAB with Applications from Mechanical, Aerospace, Electrical, Civil, and Biological Systems Engineering", Third Edition, Pearson Education, pp. 647-648, 2011.
[8] Li, M., Yang, S., Liu, X., "Diversity Comparison of Pareto Front Approximations in Many-Objective Optimization", Engineering and Physical Sciences Research Council (EPSRC) of U.K., under Grant EP/K001310/1, February 26, pp. 1-17, 2014.
[9] Dharaskar, R. V., Thakare, V. M., Chaudhari, P. M., "Computing the Most Significant Solution from Pareto Front obtained in Multi-objective Evolutionary", (IJACSA) International Journal of Advanced Computer Science and Applications, Vol. 1, No. 4, pp. 1-6, 2010.
[10] Taghavi, T., Pimentel, A. D., "Design Metrics and Visualization Techniques for Analyzing the Performance of MOEAs in DSE", Proceedings of the IEEE International Conference on Embedded Computer Systems: Architectures, Modeling and Simulation, 2011.
[11] Rao, S. S., "Engineering Optimization, Theory and Practice", John Wiley and Sons, Third Edition, 1996.
[12] Jiang, S., Ong, Y., Zhang, J., Feng, L., "Consistencies and Contradictions of Performance Metrics in Multi-Objective Optimization", IEEE Transactions on Cybernetics, pp. 1-14, 2014.
[13] De Oliveira, L. S., Saramago, S. F., "Multiobjective Optimization Techniques Applied to Engineering Problems", Journal of the Brazilian Society of Mechanical Sciences and Engineering, Vol. XXXII, No. 1, pp. 1-12, 2009.
[14] Fritsche, G., Strickler, A., Pozo, A., Santana, R., "Capturing Relationships in Multi-Objective Optimization", Proceedings of the Brazilian Conference on Intelligent Systems (BRACIS), Natal, RN, Vol. 4, pp. 1-6, 2015.
[15] Reyes-Sierra, M., Coello, C. A., "Multi-Objective Particle Swarm Optimizers: A Survey of the State-of-the-Art", International Journal of Computational Intelligence Research, ISSN 0973-1873, Vol. 2, No. 3, pp. 287-308, 2006.


[16] Deb, K., Pratap, A., Agrawal, S., Meyarivan, T., "A Fast and Elitist Multi-Objective Genetic Algorithm: NSGA-II", IEEE Transactions on Evolutionary Computation, Vol. 6, No. 2, pp. 182-197, 2002.
[17] Zitzler, E., "Evolutionary Algorithms for Multi Objective Optimization: Methods and Applications", Ph.D. Dissertation, Swiss Federal Institute of Technology Zurich, 1999.
[18] Sarker, R., Mohammadian, M., "Evolutionary Optimization", Kluwer Academic Publishers, pp. 67-121, 2003.
[19] Foulds, L. R., "Classical Optimization", Springer, New York, NY, pp. 257-309, 1981.
[20] Langer, H., "Extended Evolutionary Algorithms for Multiobjective and Discrete Design Optimization of Structures", Ph.D. Dissertation, Technical University, pp. 25-36, 2005.
[21] Naji, N., "A Review of the Metaheuristic Algorithms and their Capabilities (Particle Swarm Optimization, Firefly and Genetic Algorithms)", International Journal of Current Engineering & Technology, Vol. 7, No. 3, pp. 921-925, 2017.
[22] Abraham, A., Jain, L., Goldberg, R. (Eds.), "Evolutionary Multiobjective Optimization: Theoretical Advances and Applications", Springer-Verlag, London, 2005.
[23] Lotfi, S., Karimi, F., "A hybrid MOEA/D-TS for solving multi-objective problems", Journal of Artificial Intelligence and Data Mining, pp. 1-14, 2016.
[24] Saad, O. M., "Goal Programming Approaches for Solving Fuzzy Integer Multi-Criteria Decision-Making Problems", Springer Science+Business Media, LLC, p. 433, 2008.
[25] Gadallah, M. H., Hegazi, H. A., Mohamed, A. A., "A Modification to Multi Objective NSGA II Optimization Algorithm", Proceedings of the IEOM Washington DC Conference, pp. 1221-1232, 2018.
[26] Gadallah, M. H., Hegazi, H. A., Mohamed, A. A., "Multi-objective Optimization Indices: A Comparative Analysis", Australian Journal of Basic and Applied Sciences, Vol. 10, Issue 15, ISSN 2309-8414, pp. 10-25, October 2016.
[27] Zitzler, E., Deb, K., Thiele, L., "Comparison of Multiobjective Evolutionary Algorithms: Empirical Results", Evolutionary Computation, Vol. 8, No. 2, pp. 125-148, 2000.

Biographies

Mohamed H. Gadallah is a Technical Advisor at the Education Development Fund (EDF), The Egyptian Cabinet of Ministers, and Professor of Industrial Engineering & Operations Research, Faculty of Engineering, Cairo University 12613. Tel (Office): +202-35678165, Fax: +202-35693025, Cell: +0122-213-9310, +0109-760-2342. Email: [email protected], [email protected]. Web: https://aucegypt.academia.edu/MohamedHGadallah, http://staff.eng.cu.edu.eg/mhg, http://scholar.cu.edu.eg/mgadallah

Abdel Rahman Ali M. Ahmed is a Master's student in the Mechanical Design & Production Department, Faculty of Engineering, Cairo University. Email: [email protected]. Contributions: Multi-objective Optimization Indices: A Comparative Analysis.
