

UNIFIED PARTICLE SWARM OPTIMIZATION FOR TACKLING OPERATIONS RESEARCH PROBLEMS

K.E. Parsopoulos and M.N. Vrahatis

Computational Intelligence Laboratory (CI Lab), Department of Mathematics, and
University of Patras Artificial Intelligence Research Center (UPAIRC), University of Patras, GR-26110 Patras, Greece

{kostasp,vrahatis}@math.upatras.gr

ABSTRACT

We investigate the performance of the recently proposed Unified Particle Swarm Optimization algorithm on two categories of operations research problems, namely minimax and integer programming problems. Different variants of the algorithm are employed and compared with established variants of the Particle Swarm Optimization algorithm. Statistical hypothesis testing is performed to justify the significance of the results. Conclusions regarding the ability of the Unified Particle Swarm Optimization method to tackle operations research problems, as well as on the performance of each variant, are derived and discussed.

    1. INTRODUCTION

Two of the most interesting categories of problems in operations research are minimax and integer programming problems [1]. Such problems are encountered in numerous engineering and scientific applications, including optimal control, engineering design, game theory, molecular biology, high energy physics, capital budgeting, and portfolio analysis [2, 3].

In general, the minimax problem can be defined as

\[
\min_x F(x), \tag{1}
\]

where

\[
F(x) = \max_{1 \le i \le m} f_i(x), \tag{2}
\]

with \( f_i(x) : S \subset \mathbb{R}^n \to \mathbb{R} \), \( i = 1, \ldots, m \).

Moreover, a nonlinear programming problem with inequality constraints of the form

\[
\min_x g(x), \quad \text{subject to} \quad g_i(x) \ge 0, \; i = 2, \ldots, m, \tag{3}
\]

can be transformed into the minimax problem

\[
\min_x \max_{1 \le i \le m} f_i(x),
\]

where

\[
f_1(x) = g(x), \qquad f_i(x) = g(x) - \alpha_i\, g_i(x), \quad \alpha_i > 0, \tag{4}
\]

for \( i = 2, \ldots, m \). It has been proved that for sufficiently large \( \alpha_i \), the optimum point of the minimax problem coincides with the optimum point of the nonlinear programming problem [4]. Minimax problems have proved to be difficult to tackle through traditional gradient-based algorithms since, at points where \( f_i(x) = F(x) \) for two or more values of \( i \in \{1, \ldots, m\} \), the first partial derivatives of \( F(x) \) are discontinuous, even if all the functions \( f_i(x) \), \( i = 1, \ldots, m \), have continuous first partial derivatives.
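As a minimal sketch of this transformation (the helper and constraint names are ours, not the paper's; a generic objective `g` and constraints `g_i(x) >= 0` are assumed), the penalized component functions of Eq. (4) can be assembled as:

```python
def minimax_objective(g, constraints, alphas):
    """Build F(x) = max{f_1(x), ..., f_m(x)} following Eq. (4):
    f_1(x) = g(x) and f_i(x) = g(x) - alpha_i * g_i(x) with alpha_i > 0,
    so that violated constraints (g_i(x) < 0) inflate the maximum."""
    def F(x):
        values = [g(x)] + [g(x) - a * gi(x) for gi, a in zip(constraints, alphas)]
        return max(values)
    return F

# Toy instance: minimize g(x) = x^2 subject to x - 1 >= 0 (optimum at x = 1).
F = minimax_objective(lambda x: x * x, [lambda x: x - 1.0], [100.0])
```

At the feasible optimum x = 1 both component functions coincide (F(1) = 1), while an infeasible point such as x = 0 is heavily penalized (F(0) = 100), illustrating why a sufficiently large alpha makes the two optima coincide.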

The unconstrained integer programming problem is defined as

\[
\min_x f(x), \quad x \in S \subseteq \mathbb{Z}^n, \tag{5}
\]

where \( \mathbb{Z} \) is the set of integers, and \( S \) is a not necessarily bounded set, which is considered as the feasible region. In this first investigation we focus on all-integer programming problems, where all variables are integers. Mixed-integer programming problems, where some of the variables are real, will be considered in future work.

Particle Swarm Optimization (PSO) has proved to be a very efficient algorithm for addressing minimax and integer programming problems [5, 6]. The performance of population-based algorithms is heavily dependent on the trade-off between their exploration and exploitation capabilities. To this end, Unified Particle Swarm Optimization (UPSO) was recently introduced as a unified PSO scheme that combines the exploration and exploitation properties of different PSO variants [7]. Preliminary results on static as well as dynamic optimization problems indicate the superiority of UPSO over the standard PSO variants [7, 8].

We investigate the performance of UPSO on minimax and integer programming problems and compare it with the performance of the local and global PSO variants on well-known benchmark functions. Statistical hypothesis testing is conducted to justify the significance of the results. The rest of the paper is organized as follows. PSO and UPSO are briefly described in Section 2, and experimental results are reported and discussed in Section 3. The paper closes with conclusions in Section 4.

    2. UNIFIED PARTICLE SWARM OPTIMIZATION

PSO is the most common swarm intelligence algorithm for numerical optimization tasks. It was introduced in 1995 by Eberhart and Kennedy [9, 10], drawing inspiration from the emergent behavior in socially organized colonies [11]. PSO is a population-based algorithm, i.e., it employs a population, called a swarm, of search points, \( x_1, x_2, \ldots, x_N \).

[...] which is mostly based on the global variant, where \( r_3 \) is a normally distributed parameter with mean vector \( \mu \) and variance matrix \( \Sigma \). Based on the analysis of Matyas [16] for stochastic optimization algorithms, convergence in probability was proved for the schemes of Eqs. (12) and (13) [7].
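The UPSO update equations (Eqs. (6)–(13)) are not legible in this extraction. The following is therefore only a hedged sketch of the unified scheme as described in [7]; all names and the exact structure are our assumptions, not the paper's code. The unified velocity blends the constriction-type global and local PSO updates through a unification factor u in [0, 1], and the mutated variant multiplies one component by a normally distributed term:

```python
import numpy as np

rng = np.random.default_rng(42)
CHI, C1, C2 = 0.729, 2.05, 2.05  # default constriction coefficients [14]

def constricted_velocity(x, v, p_own, p_guide):
    """One constriction-type PSO velocity update towards a guide position."""
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    return CHI * (v + C1 * r1 * (p_own - x) + C2 * r2 * (p_guide - x))

def upso_velocity(x, v, p_own, p_global, p_local, u, mutate=False, sigma=0.01):
    """Unified velocity U = u*G + (1-u)*L; the mutated variant scales the
    global component by a zero-mean Gaussian r3 with variance sigma^2."""
    G = constricted_velocity(x, v, p_own, p_global)  # global-best guide
    L = constricted_velocity(x, v, p_own, p_local)   # neighborhood-best guide
    if mutate:
        r3 = rng.normal(0.0, sigma, x.shape)
        return r3 * u * G + (1.0 - u) * L
    return u * G + (1.0 - u) * L

# One update for a 2-dimensional particle at the origin.
x, v = np.zeros(2), np.zeros(2)
U = upso_velocity(x, v, np.ones(2), np.ones(2), np.ones(2), u=0.5)
```

With u = 0 the scheme reduces to the local PSO variant and with u = 1 to the global one, which is the exploration/exploitation blending the text refers to.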

    3. RESULTS

UPSO was applied on the minimax and integer programming benchmark problems on which PSO was recently tested [5, 6]. These problems are defined as follows [17, 18, 19, 20, 21, 22].

    MINIMAX PROBLEMS

Test Problem 1 [17]. This is a 2-dimensional problem and it consists of 3 functions,

\[
\min_x F(x), \qquad F(x) = \max_{i=1,2,3} f_i(x),
\]

where

\[
f_1(x) = x_1^2 + x_2^4, \qquad
f_2(x) = (2 - x_1)^2 + (2 - x_2)^2, \qquad
f_3(x) = 2\, e^{-x_1 + x_2}. \tag{14}
\]
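A direct encoding of this test problem (a sketch; the function names are ours) makes the max-structure explicit:

```python
import math

def tp1_components(x1, x2):
    """The three component functions of minimax Test Problem 1 (Eq. (14))."""
    f1 = x1 ** 2 + x2 ** 4
    f2 = (2.0 - x1) ** 2 + (2.0 - x2) ** 2
    f3 = 2.0 * math.exp(-x1 + x2)
    return f1, f2, f3

def tp1(x1, x2):
    """F(x) = max{f_1, f_2, f_3}; this is what the swarm minimizes."""
    return max(tp1_components(x1, x2))
```

At (1, 1) all three components are active simultaneously (each equals 2), which is exactly the kind of point at which the first partial derivatives of F become discontinuous.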


Test Problem 9 [19]. This problem consists of a set of component functions combined as in Eq. (22); its definition was garbled beyond recovery in the source.

Test Problem 10 [19]. This problem consists of 21 functions, \( F(x) = \max_{1 \le i \le 21} f_i(x) \) (Eq. (23)); the definitions of the \( f_i \) were garbled beyond recovery in the source.

For all minimax problems, the search space was a symmetric box \( [-b, b]^n \), where \( n \) is the dimension of the problem, and the global minimizer had to be detected within a prescribed accuracy. The swarm size was equal to 20 for three of the test problems (TP) and equal to 50 for the other problems [5]. In the cases where the transformation of Eq. (4) was employed, a sufficiently large value of \( \alpha_i \) was used for all \( i \) [5].

    INTEGER PROGRAMMING PROBLEMS

Test Problem 1 [20]. This problem is defined as

\[
f(x) = \|x\|_1 = \sum_{i=1}^{n} |x_i|, \tag{24}
\]

where \( n \) is the dimension. The solution is \( x^* = (0, \ldots, 0)^T \) with \( f(x^*) = 0 \). We considered this problem for a single fixed dimension.

Test Problem 2 [20]. This problem is defined as

\[
f(x) = x^T x = (x_1, \ldots, x_n) \begin{pmatrix} x_1 \\ \vdots \\ x_n \end{pmatrix}, \tag{25}
\]

where \( n \) is the dimension. The solution is \( x^* = (0, \ldots, 0)^T \) with \( f(x^*) = 0 \).

Test Problem 3 [21]. This problem is 5-dimensional and it is defined as

\[
f(x) = -(15, 27, 36, 18, 12)\, x + x^T
\begin{pmatrix}
 35 & -20 & -10 &  32 & -10 \\
-20 &  40 &  -6 & -31 &  32 \\
-10 &  -6 &  11 &  -6 & -10 \\
 32 & -31 &  -6 &  38 & -20 \\
-10 &  32 & -10 & -20 &  31
\end{pmatrix} x. \tag{26}
\]

The best known solutions are \( x^* = (0, 11, 22, 16, 6)^T \) and \( x^* = (0, 12, 23, 17, 6)^T \), with \( f(x^*) = -737 \).
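As a consistency check on these coefficients (a NumPy sketch; the array names are ours), both reported best known solutions evaluate to the same optimum value of -737:

```python
import numpy as np

# Linear and quadratic coefficients of Eq. (26).
c = np.array([15.0, 27.0, 36.0, 18.0, 12.0])
Q = np.array([[ 35., -20., -10.,  32., -10.],
              [-20.,  40.,  -6., -31.,  32.],
              [-10.,  -6.,  11.,  -6., -10.],
              [ 32., -31.,  -6.,  38., -20.],
              [-10.,  32., -10., -20.,  31.]])

def tp3(x):
    """Quadratic integer Test Problem 3 (Eq. (26)): f(x) = -c.x + x'Qx."""
    x = np.asarray(x, dtype=float)
    return float(-c @ x + x @ Q @ x)

print(tp3([0, 11, 22, 16, 6]), tp3([0, 12, 23, 17, 6]))  # -737.0 -737.0
```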

Test Problem 4 [21]. This problem is 2-dimensional and it is defined as

\[
f(x) = (9 x_1^2 + 2 x_2^2 - 11)^2 + (3 x_1 + 4 x_2^2 - 7)^2. \tag{27}
\]

Its solution is \( x^* = (1, 1)^T \) with \( f(x^*) = 0 \).

Test Problem 5 [21]. This problem is 4-dimensional and it is defined as

\[
f(x) = (x_1 + 10 x_2)^2 + 5 (x_3 - x_4)^2 + (x_2 - 2 x_3)^4 + 10 (x_1 - x_4)^4. \tag{28}
\]

Its solution is \( x^* = (0, 0, 0, 0)^T \) with \( f(x^*) = 0 \).

Test Problem 6 [22]. This problem is 2-dimensional and it is defined as

\[
f(x) = 2 x_1^2 + 3 x_2^2 + 4 x_1 x_2 - 6 x_1 - 3 x_2. \tag{29}
\]

Its solution is \( x^* = (2, -1)^T \) with \( f(x^*) = -6 \).

Test Problem 7 [21]. This problem is 2-dimensional and it is defined as

\[
f(x) = -3803.84 - 138.08\, x_1 - 232.92\, x_2 + 123.08\, x_1^2 + 203.64\, x_2^2 + 182.25\, x_1 x_2. \tag{30}
\]

Its solution is \( x^* = (0, 1)^T \) with \( f(x^*) = -3833.12 \).
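The remaining low-dimensional problems are equally direct to encode; the following sketch (function names are ours) reproduces the stated optima of Test Problems 4, 6, and 7 at their reported solutions:

```python
def tp4(x1, x2):
    """Eq. (27); minimum 0 at (1, 1)."""
    return (9 * x1**2 + 2 * x2**2 - 11) ** 2 + (3 * x1 + 4 * x2**2 - 7) ** 2

def tp6(x1, x2):
    """Eq. (29); minimum -6 at (2, -1)."""
    return 2 * x1**2 + 3 * x2**2 + 4 * x1 * x2 - 6 * x1 - 3 * x2

def tp7(x1, x2):
    """Eq. (30); minimum -3833.12 at (0, 1)."""
    return (-3803.84 - 138.08 * x1 - 232.92 * x2
            + 123.08 * x1**2 + 203.64 * x2**2 + 182.25 * x1 * x2)
```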

For all integer programming problems, the search space was \( [-100, 100]^n \), where \( n \) is the dimension of the problem, and the global minimizer had to be detected within a prescribed accuracy. The swarm size was problem dependent, equal to 100 for TP 1 and between 10 and 50 for the other problems [6]. In order to avoid deterioration of the algorithms' dynamics, we let the particles assume real values and rounded their components to the nearest integer only for the evaluation of the objective function.
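A sketch of this evaluation scheme (the helper name is ours): the particle keeps its real-valued position, and only the copy passed to the objective function is rounded:

```python
def evaluate_on_lattice(objective, position):
    """Round a copy of the real-valued position to the nearest integers and
    evaluate the objective there; the particle's own position is untouched."""
    lattice_point = [round(coordinate) for coordinate in position]
    return objective(lattice_point)

value = evaluate_on_lattice(lambda x: sum(abs(v) for v in x), [0.4, -1.6, 2.2])
```

Note that Python's built-in round() uses banker's rounding at exact .5 ties; any consistent nearest-integer rule serves the purpose described above.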

In all cases, the default PSO parameters, \( \chi = 0.729 \) and \( c_1 = c_2 = 2.05 \), were used [14]. Three UPSO variants were investigated, namely, the main UPSO scheme with \( u = 0.2 \) and \( u = 0.5 \), as well as the scheme with mutation of

Table 1. Results for the minimax problems: success rate (Suc.), mean (Mean), and standard deviation (St.D.) of the required function evaluations for PSOg, PSOl, UPSO1, UPSO2, and UPSOm on each test problem. [The numerical entries were not recoverable from the source.]

Eq. (12) with \( u = 0.1 \), \( r_3 \sim \mathcal{N}(\mu, \Sigma) \), \( \mu = (0, \ldots, 0)^T \), \( \Sigma = \sigma^2 I \), \( \sigma = 0.01 \), and \( I \) being the identity matrix. These variants were selected due to their good performance in unconstrained and dynamic optimization problems [7, 8]. We will denote these variants as UPSO1, UPSO2, and UPSOm, respectively.

UPSO's performance was compared with the performance of the standard local and global PSO variants, which are denoted as PSOl and PSOg, respectively. For the determination of the search direction of UPSO, as well as for the local PSO variant, a ring neighborhood topology was used with neighborhood radius equal to 1, in order to take full advantage of its exploration properties.

For each test problem and algorithm, 100 experiments were conducted, recording the success rate of each algorithm, i.e., the number of experiments at which it detected the solution with the desired accuracy, as well as the mean and the standard deviation of the required function evaluations. Moreover, a nonparametric Wilcoxon rank-sum test was performed for each pair of algorithms in order to assess the statistical significance of the difference between their performances. In Table 1, the results for the minimax problems are reported, with the best value per row being boldfaced.

Table 2. Wilcoxon rank-sum tests for the minimax problems.

[Pairwise significance indicators for PSOg, PSOl, UPSO1, UPSO2, and UPSOm on each test problem; the individual entries were not recoverable from the source.]

In Table 2, the statistical hypothesis tests are given, with a marker denoting a statistically significant difference between the performances of two algorithms. In all test problems, with the exception of TP 10, all algorithms had a success rate of 100%. In TP 1, TP 2, and one further problem, UPSOm (UPSO with mutation) outperformed the other algorithms, having also a statistically significant difference. In several of the remaining problems, UPSO2 (UPSO with \( u = 0.5 \)) had the best performance, followed

Table 3. Results for the integer programming problems: success rate (Suc.), mean (Mean), and standard deviation (St.D.) of the required function evaluations for PSOg, PSOl, UPSO1, UPSO2, and UPSOm on each test problem. [The numerical entries were not recoverable from the source.]

by UPSO1 (UPSO with \( u = 0.2 \)), having a statistically significant difference between its performance and the performance of the other algorithms. The global variant of the standard PSO (PSOg) proved to be more efficient in two of the test problems, although, in the latter of them, there was no statistically significant difference between its performance and the performance of UPSO2. TP 10 proved to be the most difficult for all algorithms, perhaps due to its large number of functions. The success rate was reduced for all algorithms except UPSOm, which had the best performance. Also, TP 10 was the only case where the standard local PSO (PSOl) outperformed the global variant (PSOg), underlining the difference between their exploration abilities, which is crucial in such difficult problems. Summarizing the results for the minimax problems, UPSOm and UPSO2 proved to be the most promising schemes overall, in accordance with results reported for different optimization problems [7, 8]. PSOg proved to be competitive in a few problems, without always having a statistically significant difference from UPSO. Finally, in difficult test problems, UPSOm proved to be the most efficient algorithm.
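The pairwise comparisons above rest on the Wilcoxon rank-sum statistic; a minimal self-contained sketch (our implementation, using the normal approximation and no tie correction) computes the rank sum of the first sample and its z-score under the null hypothesis of identical distributions:

```python
import math

def rank_sum_z(a, b):
    """Rank-sum W of sample `a` within the pooled, sorted data, plus the
    z-score of W under the null hypothesis (normal approximation)."""
    labeled = sorted((value, group)
                     for group, sample in ((0, a), (1, b))
                     for value in sample)
    w = sum(rank for rank, (_, group) in enumerate(labeled, start=1) if group == 0)
    n1, n2 = len(a), len(b)
    mean = n1 * (n1 + n2 + 1) / 2.0
    var = n1 * n2 * (n1 + n2 + 1) / 12.0
    return w, (w - mean) / math.sqrt(var)

# Hypothetical function-evaluation counts for two algorithms on one problem.
w, z = rank_sum_z([120, 130, 140], [200, 210, 220])
```

In practice a library routine with proper tie handling and exact small-sample tables would be used; the sketch only illustrates the statistic behind Tables 2 and 4.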

In Tables 3 and 4, the corresponding results and statistical hypothesis tests for the integer programming problems are reported. The rounding of the particles' components to the nearest integer for the evaluation of the objective function seems to introduce a peculiarity in the algorithms' dynamics. Two different real components that are close can be rounded to the same integer, assigning to different particles the same objective function value. Obviously, exploitation-based schemes can be affected by this procedure, since they are more prone to move the particles faster towards the best positions detected by the swarm. Thus,

Table 4. Wilcoxon rank-sum tests for the integer programming problems. [Pairwise significance indicators for PSOg, PSOl, UPSO1, UPSO2, and UPSOm on each test problem; the individual entries were not recoverable from the source.]

schemes that promote exploration were expected to have better performance. This is indeed reflected in the results reported in Table 3. PSOl (the local PSO variant) had better success rates than PSOg (the global variant) in all but the last test problem. Also, UPSO1 (\( u = 0.2 \)), which promotes exploration rather than exploitation, was more competitive with the balanced UPSO2 scheme (\( u = 0.5 \)) than in the minimax problems, having the best performance in TP 2 and one further problem, while UPSO2 performed better in TP 1 and another. UPSOm had the best performance in the remaining problems, although its difference was not statistically significant in the last test problem.

In general, both in minimax and integer programming problems, the standard UPSO variants seemed to follow the performance pattern of PSO, i.e., in problems where PSOl has a good performance, UPSO schemes that promote exploration are more efficient, while in the other cases more balanced schemes, such as UPSO2, proved to have better performance. The overall good performance of UPSO with

mutation indicates that properly perturbed exploration-based schemes of UPSO retain a good balance between exploration and exploitation. This behavior is in accordance with previously reported results on different static and dynamic optimization problems [7, 8], thereby rendering UPSOm a good default choice.

    4. CONCLUSIONS

We investigated the performance of Unified Particle Swarm Optimization on minimax and integer programming problems. Different variants of the algorithm were investigated and compared with the standard Particle Swarm Optimization algorithm. The results indicate that UPSO can efficiently tackle problems of both types, outperforming the standard PSO in most cases. Useful conclusions were also derived regarding the performance of the different UPSO variants, as well as the effect of rounding on the algorithms' performance in integer programming problems.

    ACKNOWLEDGMENT

We thank the anonymous reviewers for their useful comments and suggestions. This work was partially supported by the PENED 2001 Project awarded by the Greek Secretariat of Research and Technology.

    5. REFERENCES

[1] F. S. Hillier and G. J. Lieberman. Introduction to Operations Research. McGraw-Hill, 1995.

[2] G. L. Nemhauser, A. H. G. Rinnooy Kan, and M. J. Todd, editors. Handbooks in OR & MS, volume 1. Elsevier, 1989.

[3] D. Z. Du and P. M. Pardalos. Minimax and Applications. Kluwer, 1995.

[4] J. W. Bandler and C. Charalambous. Nonlinear programming using minimax techniques. J. Optim. Th. Appl., 13:607–619, 1974.

[5] E. C. Laskari, K. E. Parsopoulos, and M. N. Vrahatis. Particle swarm optimization for minimax problems. In Proceedings of the IEEE 2002 Congress on Evolutionary Computation, pages 1582–1587, Hawaii (HI), USA, 2002. IEEE Press.

[6] E. C. Laskari, K. E. Parsopoulos, and M. N. Vrahatis. Particle swarm optimization for integer programming. In Proceedings of the IEEE 2002 Congress on Evolutionary Computation, pages 1582–1587, Hawaii (HI), USA, 2002. IEEE Press.

[7] K. E. Parsopoulos and M. N. Vrahatis. UPSO: A unified particle swarm optimization scheme. In Lecture Series on Computer and Computational Sciences, Vol. 1, Proceedings of the International Conference of Computational Methods in Sciences and Engineering (ICCMSE 2004), pages 868–873. VSP International Science Publishers, Zeist, The Netherlands, 2004.

[8] K. E. Parsopoulos and M. N. Vrahatis. Unified particle swarm optimization in dynamic environments. Lecture Notes in Computer Science (LNCS), 3449:590–599, 2005.

[9] R. C. Eberhart and J. Kennedy. A new optimizer using particle swarm theory. In Proceedings of the Sixth Symposium on Micro Machine and Human Science, pages 39–43, Piscataway, NJ, 1995. IEEE Service Center.

[10] J. Kennedy and R. C. Eberhart. Particle swarm optimization. In Proc. IEEE Int. Conf. Neural Networks, volume IV, pages 1942–1948, Piscataway, NJ, 1995. IEEE Service Center.

[11] J. Kennedy and R. C. Eberhart. Swarm Intelligence. Morgan Kaufmann Publishers, 2001.

[12] J. Kennedy. Bare bones particle swarms. In Proc. IEEE Swarm Intelligence Symposium, pages 80–87, Indianapolis, USA, 2003. IEEE Press.

[13] R. Mendes, J. Kennedy, and J. Neves. The fully informed particle swarm: Simpler, maybe better. IEEE Trans. Evol. Comput., 8(3):204–210, 2004.

[14] M. Clerc and J. Kennedy. The particle swarm: Explosion, stability, and convergence in a multidimensional complex space. IEEE Trans. Evol. Comput., 6(1):58–73, 2002.

[15] I. C. Trelea. The particle swarm optimization algorithm: Convergence analysis and parameter selection. Information Processing Letters, 85:317–325, 2003.

[16] J. Matyas. Random optimization. Automatization and Remote Control, 26:244–251, 1965.

[17] S. Xu. Smoothing method for minimax problems. Comp. Optim. Appl., 20:267–279, 2001.

[18] H.-P. Schwefel. Evolution and Optimum Seeking. Wiley, New York, 1995.

[19] L. Lukšan and J. Vlček. Test problems for nonsmooth unconstrained and linearly constrained optimization. Technical Report 798, Institute of Computer Science, Academy of Sciences of the Czech Republic, Prague, Czech Republic, 2000.

[20] G. Rudolph. An evolutionary algorithm for integer programming. In Y. Davidor, H.-P. Schwefel, and R. Männer, editors, Parallel Problem Solving from Nature, volume 3, pages 139–148. Springer, 1994.

[21] A. Glankwahmdee, J. S. Liebman, and G. L. Hogg. Unconstrained discrete nonlinear programming. Engineering Optimization, 4:95–107, 1979.

[22] S. S. Rao. Engineering Optimization: Theory and Practice. Wiley, 1996.