
Hybrid Constraint Programming and Metaheuristic Methods for Large Scale Optimization Problems

Fabio Parisini

Tutor: Paola Mello
Co-tutor: Michela Milano

Final seminars of the XXIII cycle of the doctorate course in Electronics, Computer Science and Telecommunications

Parisini (UniBo) Hybrid methods Final seminars 1 / 27

Combinatorial optimization problems

Combinatorial optimization (CO) is a topic in theoretical computer science and applied mathematics that consists of finding the least-cost solution to a mathematical problem in which each solution is associated with a numerical cost¹.

Combinatorial optimization problems arise in many application areas:

Vehicle routing;

Logistics;

Packing and cutting stock applications;

Resource allocation;

Scheduling;

. . .

¹ Wikipedia


Complete and heuristic methods

Two categories of solution approaches to CO problems:

Complete methods: find the optimal solution and prove optimality, at the cost of a high computational effort;

Heuristic methods: find good solutions without any optimality guarantee.

Complete methods are impracticable when dealing with large scale optimization problems.


Feasibility and optimality components

Two aspects coexist within CO problems:

Feasibility component: the constraints and the size of the problem are such that it is computationally expensive to find any feasible solution;

Optimality component: it is computationally easy in practice to find a feasible solution, whilst it is very difficult to find the optimal one.

Solution techniques

Constraint Programming (CP) is particularly effective when dealing with the feasibility component of a CO problem. On the other hand, CP may present some limitations when dealing with a strong optimality component;

Metaheuristic methods instead are used in the literature to solve large scale optimization problems in an incomplete way, i.e. by finding feasible sub-optimal solutions. Metaheuristic techniques can thus effectively deal with CO problems where the optimality component is dominant.

Constraint programming

General technique based on tree search;

Exploiting variable and value selection heuristics to guide the search;

Filtering and constraint propagation considerably reduce the size of the search space;

Suitable for complete approaches to the solution of CO problems;

Impracticable for large scale optimization problems.

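The points above can be made concrete with a minimal sketch (an illustration, not the author's solver): CP-style backtracking tree search where a toy all-different propagation prunes the domains before each branching step.

```python
# Minimal CP-style search sketch: propagation prunes domains, then branching
# uses variable and value selection heuristics, backtracking on failure.
def propagate(domains):
    """Toy all-different filtering: a fixed value is removed from the other
    domains; returns False on a domain wipe-out (failure)."""
    changed = True
    while changed:
        changed = False
        for i, d in enumerate(domains):
            if len(d) == 1:
                v = next(iter(d))
                for j, e in enumerate(domains):
                    if j != i and v in e:
                        e.discard(v)
                        if not e:
                            return False
                        changed = True
    return True

def search(domains):
    if not propagate(domains):
        return None
    if all(len(d) == 1 for d in domains):
        return [next(iter(d)) for d in domains]
    # variable selection heuristic: smallest unfixed domain first
    i = min((j for j, d in enumerate(domains) if len(d) > 1),
            key=lambda j: len(domains[j]))
    for v in sorted(domains[i]):          # value selection heuristic
        trial = [set(d) for d in domains]
        trial[i] = {v}
        sol = search(trial)
        if sol is not None:
            return sol
    return None

print(search([{1, 2}, {1, 2, 3}, {1, 2, 3}]))  # → [1, 2, 3]
```

Note how propagation alone fixes most variables once one assignment is made, which is exactly why filtering shrinks the explored tree.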

Metaheuristic methods

Many existing techniques, having common concepts:

Intensification and diversification techniques;

Neighborhood exploration methods;

Definition of local search moves specific for the problem;

Usage of a set of elite solutions;

Adaptation of the search strategy to the evolution of the search process itself;

. . .

Motivation

Thesis aim: constraint programming and metaheuristic methods show complementary strengths and weaknesses. Hybrid search techniques can be designed to exploit the advantages of both approaches. The aim of my thesis is to integrate metaheuristic concepts, such as neighborhood exploration, intensification, diversification and restarting, within a CP-based tree search.

CP modeling of a CO problem

An optimization problem P;

A model for P defined on a set of finite domain integer variables x = [x1, x2, . . . , xn];

A set of constraints posted on the problem variables;

An incumbent solution x̄ = [x̄1, x̄2, . . . , x̄n], which is a feasible assignment of values to the problem variables;

The discrepancy ∆ between x and x̄, which can be computed as:

∆(x, x̄) = ∑_{i=1}^{n} d_i   where   d_i = 1 if x_i ≠ x̄_i; 0 otherwise.   (1)
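As an illustration, the discrepancy of Equation (1) is simply a count of differing positions; a minimal sketch:

```python
# Sketch of the discrepancy measure in Equation (1): Δ(x, x̄) counts the
# positions where a candidate assignment differs from the incumbent.
def discrepancy(x, x_bar):
    return sum(1 for xi, xbi in zip(x, x_bar) if xi != xbi)

x_bar = [2, 4, 9, 5, 2, 8]          # incumbent solution
x     = [2, 4, 9, 7, 3, 8]          # differs in two positions
print(discrepancy(x, x_bar))        # → 2
```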

A known technique: limited discrepancy search (LDS)

Wrong turn: when the heuristic guides the search to a failure state, backtracking is performed up to an open decision point, where the choice performed by the heuristic is reversed. This alternate choice is called a “wrong turn”;

Limited Discrepancy Search (LDS) limits the number of wrong turns (i.e. discrepancies) along the way; it explores regions at increasing discrepancy value k, the k-distance neighborhoods of the solution proposed by the heuristic.
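To make the k-distance neighborhood concrete, the sketch below (an illustration, not the original implementation) enumerates the assignments that deviate from the heuristic solution x̄ in exactly k positions — the region LDS visits at discrepancy level k:

```python
from itertools import combinations

# Enumerate the k-distance neighborhood of x̄: every assignment taking a
# "wrong turn" (a value different from x̄) in exactly k positions.
def k_distance_neighbors(x_bar, domains, k):
    n = len(x_bar)
    for idxs in combinations(range(n), k):   # which k positions deviate
        def expand(pos, partial):
            if pos == n:
                yield list(partial)
                return
            if pos in idxs:
                for v in domains[pos]:
                    if v != x_bar[pos]:      # must differ: one discrepancy
                        yield from expand(pos + 1, partial + [v])
            else:
                yield from expand(pos + 1, partial + [x_bar[pos]])
        yield from expand(0, [])

x_bar = [0, 0, 0]
doms = [[0, 1], [0, 1], [0, 1]]
print(list(k_distance_neighbors(x_bar, doms, 1)))
# → [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```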

LDS in optimization problems

LDS tries to reduce the cost of the incumbent solution x̄ by exploring the k-distance neighborhood of x̄ via tree search.

[Figure: search tree rooted at the incumbent solution, with neighborhoods explored at increasing discrepancy values up to n]

In practice it is impossible to reach high discrepancy values.

Large neighborhood search (LNS)

Large neighborhood search is a technique which is commonly used for improving the quality of a given solution by exploring its neighborhood:

It iteratively relaxes a fragment of the current solution x̄ and then re-optimizes it using CP-aided tree search;

At a high level it can be read as a hill climber which is executed until some time limit is exhausted;

It is a problem-dependent strategy;

Key components of LNS are the methods used to choose the fragments to relax and the methods to re-optimize them.
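The relax-and-reoptimize loop can be sketched as follows. This is an illustration of the scheme, not the thesis implementation: `reoptimize` stands in for the CP-aided tree search on the freed fragment, and the toy cost function and re-optimizer below are assumptions chosen only to make the sketch runnable.

```python
import random

# Sketch of the LNS loop: repeatedly relax a random fragment of the incumbent,
# re-optimize it, and keep the candidate only when it improves the cost.
def lns(x_bar, cost, reoptimize, fragment_size, iterations, seed=0):
    rng = random.Random(seed)
    best, best_cost = list(x_bar), cost(x_bar)
    for _ in range(iterations):
        free = rng.sample(range(len(best)), fragment_size)  # fragment to relax
        candidate = reoptimize(best, free)   # stand-in for CP tree search
        if candidate is not None and cost(candidate) < best_cost:
            best, best_cost = candidate, cost(candidate)   # hill-climbing step
    return best

# Toy instance: minimize the sum; the "re-optimizer" sets freed variables to 0.
def zero_out(x, free):
    y = list(x)
    for i in free:
        y[i] = 0
    return y

result = lns([3, 1, 4, 1, 5], sum, zero_out, fragment_size=2, iterations=20)
print(result, sum(result))   # cost never increases across iterations
```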

Hybrid methods for neighborhood exploration

Complete neighborhood exploration is impracticable for large scale optimization problems:

We introduce methods to explore slices of large discrepancy neighborhoods efficiently, using restarts and randomization.

[Figure: transversal slices 1, 2, 3, . . . , n of the large discrepancy neighborhoods around the incumbent solution]


Sliced neighborhood search (SNS)

SNS iteratively explores neighborhoods of an incumbent solution x̄ by triggering a sequence of restarts;

At each restart a randomly chosen transversal slice of a k-distance neighborhood is chosen and explored;

Transversal slices are identified by randomly selecting which variables have to change and which variables have to keep the same value;

SNS works by posting extra constraints, equality and difference constraints, on subsets of variables, thus setting at-least and at-most discrepancy bounds.

Sliced neighborhood search (SNS)

Definition (Neighborhood slice): a slice of a k-distance neighborhood of a given reference solution x̄ is defined on three parameters: the incumbent solution x̄, a set E of indices corresponding to variables that keep the same value, and a set D of indices corresponding to variables that have to change. The cardinality of E is n − k:

NS(x̄, E, D) = {x | P ∪ {x_i = x̄_i | ∀i ∈ E} ∪ {x_i ≠ x̄_i | ∀i ∈ D}}

SNS randomly chooses indices in sets E and D and iteratively explores the corresponding neighborhood slice.

A SNS example

Let’s take the incumbent solution x̄ = [2, 4, 9, 5, 2, 8]:

LDS first explores exhaustively the search space at discrepancy value 1, where just 1 of the 6 variables can change at a time, then at discrepancy 2, 3 and so on;

SNS first fixes a certain number of variables to the incumbent solution value, then starts the real search:

For example, SNS could set x2 = 4, x3 = 9 and x6 = 8 and perform a standard tree search just on x1, x4 and x5;

SNS performs many randomized iterations, choosing each time a different set of variables and using small time limits for each iteration.
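One SNS restart on this example can be sketched as below. `sns_slice` is a hypothetical helper, not the thesis code: it only builds the index sets E (variables fixed to the incumbent value) and D (variables forced to change) that the posted equality and difference constraints would then enforce.

```python
import random

# Sketch of slice selection in one SNS restart on x̄ = [2, 4, 9, 5, 2, 8]:
# E → equality constraints x_i = x̄_i, D → difference constraints x_i ≠ x̄_i;
# the remaining variables are left completely free for the tree search.
def sns_slice(n, n_fixed, n_changed, rng):
    idx = list(range(n))
    rng.shuffle(idx)                                 # randomized slice choice
    E = sorted(idx[:n_fixed])                        # keep the incumbent value
    D = sorted(idx[n_fixed:n_fixed + n_changed])     # must take a new value
    return E, D

rng = random.Random(7)
E, D = sns_slice(n=6, n_fixed=3, n_changed=2, rng=rng)
print(E, D)   # disjoint index sets; one of the 6 variables stays free
```

Repeating this selection across restarts, each with a small time limit, is what lets SNS sample many transversal slices of a large neighborhood.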

Application of SNS

Usage of SNS as a stand-alone search strategy:

SNS is given an initial solution x̄. If x̄ is distant from the optimal solution, SNS can quickly improve over it by performing large neighborhood search.

SNS within a heuristic framework:

SNS allows the partial exploration of very large neighborhoods; it includes randomization elements and both intensification and diversification behaviors;

By conveniently setting at-most k and at-least k bounds it is possible to constrain the solution x to be found to have the desired minimum and maximum discrepancy with respect to x̄, performing either intensification or diversification.

Experimental results

Problem of choice: the Asymmetric Travelling Salesman Problem with Time Windows (ATSPTW): finding a minimum-cost path visiting a set of cities exactly once, where each city must be visited within a specific time window.

Two main components coexist, a Travelling Salesman Problem (TSP) and a scheduling problem:

In TSPs, optimization usually turns out to be the most difficult issue;

Scheduling problems with release dates and due dates usually pose serious feasibility issues.


SNS performance elements 1/3

Search effectiveness in the sub-trees: after discrepancy bounds are enforced by posting equality and difference constraints, search in the sub-trees takes place. Search effectiveness in the sub-trees strongly depends on the propagation performed by such constraints. While equality constraints enable strong propagation, we expect difference constraints to be less effective.

SNS performance elements 2/3

Solution density in the selected discrepancy range: regardless of how efficiently the selected discrepancy range is explored, the success of SNS depends on the actual presence of improving solutions in such range, and on their number. This in turn depends on the problem structure, and on the selected at-least and at-most discrepancy bounds.

SNS performance elements 3/3

Effectiveness of the sample space exploration: SNS is basically sampling the LDS search space; the sampling effectiveness is measured by the number of collected samples (i.e. SNS iterations) over the size of the sample space.

Let ∆least be the at-least discrepancy bound and ∆most the at-most discrepancy bound; then the size of the sample space (i.e. the overall number of third-level sub-trees) is given by:

C(n, n − ∆most) · C(∆most, ∆least)
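The count above can be checked numerically; a small sketch using Python's `math.comb` (the instance size n = 10 and bounds below are illustrative values, not from the slides):

```python
from math import comb

# Sample-space size from the slide: choose the n − Δmost variables that stay
# fixed, then the Δleast variables (among the Δmost freed ones) forced to change.
def sample_space_size(n, d_least, d_most):
    return comb(n, n - d_most) * comb(d_most, d_least)

print(sample_space_size(10, 2, 4))   # comb(10, 6) * comb(4, 2) = 210 * 6 = 1260
```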

SNS stand alone 1/2

Table: Big instances, more than 50 cities, 300 CPU seconds time limit

Instance    basic LDS                       ∆max = 40%
            Cost   Sol Disc   % Impr        Cost   Sol Disc   % Impr
rbg050a      430      -        0.00          424      5       37.50
rbg050b      570      -        0.00          570      -        0.00
rbg050c      563      -        0.00          545      7       66.67
rbg055a      814      -        0.00          814      -        0.00
rbg067a     1051      4       62.50         1051      7       62.50
rbg092a     1208      3       24.34         1150      8       62.50
rbg125a     1706      3       15.86         1632     20       36.83
rbg132      1882      3        9.06         1815     13       20.73
rbg132.2    2152      3        1.11         2040     31       11.47
rbg152      2371      3        6.96         2336     23       12.5
rbg152.3    2570      -        0.00         2548     42        2.13
rbg172a     2942      3        4.11         2873     31        9.9
rbg193      3360      3        4.15         3266     31       13.68
rbg193.2    3290      3        3.23         3233     45        7.84
rbg201a     3694      3        3.46         3650     24        6.29
rbg233.2    4103      3        2.12         4059     49        4.52

SNS stand alone 2/2

Table: Big instances, more than 50 cities, 300 CPU seconds time limit

Instance    ∆(0%, 40%)        ∆(10%, 50%)       ∆(10%, 40%)       ∆(25%, 55%)
            Cost   % Impr     Cost   % Impr     Cost   % Impr     Cost   % Impr
rbg050a      424   37.50       429    6.25       429    6.25       430    0.00
rbg050b      570    0.00       570    0.00       570    0.00       570    0.00
rbg050c      545   66.67       563    0.00       563    0.00       563    0.00
rbg055a      814    -          814    -          814    -          814    -
rbg067a     1051   62.50      1048  100         1051   62.50      1051   62.50
rbg092a     1150   62.50      1203   27.63      1203   27.63      1217   18.42
rbg125a     1632   36.83      1682   22.66      1721   11.61      1762    0.00
rbg132      1815   20.73      1925    1.57      1916    3.14      1934    0.00
rbg132.2    2040   11.47      2063    9.34      2035   11.93      2108    5.18
rbg152      2336   12.5       2377    6.01      2365    7.91      2410    0.79
rbg152.3    2548    2.13      2555    1.45      2499    6.89      2528    4.07
rbg172a     2873    9.90      2918    6.12      2932    4.95      2991    0.00
rbg193      3266   13.68      3278   12.46      3401    0.00      3401    0.00
rbg193.2    3233    7.84      3330    0.00      3330    0.00      3330    0.00
rbg201a     3650    6.29      3748    0.00      3748    0.00      3748    0.00
rbg233.2    4059    4.52      4142    0.00      4142    0.00      4142    0.00

SNS within CP-based local branching

Table: Big instances, 7,200 CPU seconds time limit, SNS used for neighborhood exploration and diversification.

Instance    Ref value    LB Conf              SNS Conf
                         Value   % Impr      Value   % Impr
rbg125a       2346        1762   62.33        1484   91.99
rbg132.2      2276        1883   32.94        1310   80.97
rbg132        2122        1934   24.67        1410   93.44
rbg152.3      2675        2397   24.47        1981   61.09
rbg152        2771        2281   49.59        1889   89.27
rbg172a       3010        2748   21.63        2198   67.05
rbg193.2      3365        3143   17.45        2739   49.21
rbg193        3440        3217   21.73        2876   54.97
rbg201a       3780        3562   13.70        3039   46.57
rbg233.2      4219        3886   17.39        3768   23.55

Conclusions and future work

SNS is a general and effective search technique to heuristically explore the neighborhood of an incumbent solution x̄ up to high discrepancy values, incorporating elements coming from LDS and LNS;

Experimental results support the idea that the best SNS configurations obtain consistently better results than LDS;

SNS can be used both as a stand-alone search strategy and as an intensification and diversification method in a heuristic framework.

Further development:

Use of sampling techniques to derive promising slices to explore;

Introduction of learning processes, to automatically tune the SNS parameters during the search process;

Adoption of SNS as a general neighborhood exploration tool within metaheuristic frameworks.

List of publications I

Z. Kiziltan, A. Lodi, M. Milano, and F. Parisini. CP-based local branching. In Proc. of CP 2007, LNCS 4741, pages 847–855, 2007.

Z. Kiziltan, A. Lodi, M. Milano, and F. Parisini. Bounding, filtering and diversification in CP-based local branching. Technical Report OR/10/20, DEIS, Università di Bologna, 2010.

F. Parisini. Bi-dimensional domains for the non-overlapping rectangles constraint. In ICLP, pages 811–812, 2008.

F. Parisini. Local branching in a constraint programming framework. In ICLP (Technical Communications), pages 286–288, 2010.

List of publications II

F. Parisini, M. Lombardi, and M. Milano. Discrepancy-based sliced neighborhood search. In AIMSA, pages 91–100, 2010.

F. Parisini and M. Milano. Improving CP-based local branching via sliced neighborhood search. Accepted for publication in SAC ’11: Proceedings of the 2011 ACM Symposium on Applied Computing.

F. Parisini and M. Milano. Sliced neighborhood search. Submitted for publication to Expert Systems with Applications.