Week 7 – Exact Solving Techniques
Slides: Linear Programming (transcript)
Solution Techniques
Exact Optimization Methods
Exact Optimization Methods
We will consider exact optimization methods for solving transport
optimization problems in this part of our course.
What are the advantages and disadvantages of exact
optimization methods?
Branch and Cut
Branch and Price
Travelling Salesman Problem
Vehicle Routing Problem and Variants
...
Linear Program – Definition
Linear Program
Introduction to the Simplex Algorithm
The simplex algorithm was developed in 1947 by George Dantzig.
Today it is possible to solve linear programs involving several
million variables and constraints.
There exist worst-case examples that cannot be solved by the
simplex algorithm in polynomial time.
Nevertheless, the simplex algorithm is the one most commonly used in
practice.
There also exist polynomial-time approaches for solving linear
programs, e.g. the ellipsoid method.
Introduction to the Simplex Algorithm
[Figure: graphical example of a linear program, showing the constraint lines −x1 + 3x2 = 12, x1 + x2 = 8, and 2x1 − x2 = 10 in the (x1, x2) plane, together with the objective line 3x1 + 2x2 = 22 through the optimal vertex.]
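The graphical example above can be checked numerically. The sketch below assumes the three lines act as ≤-constraints of the LP max 3x1 + 2x2 with x1, x2 ≥ 0 (an assumption consistent with the objective line 3x1 + 2x2 = 22): it enumerates the intersection points of the constraint lines, keeps the feasible ones, and picks the best vertex — the kind of vertex-by-vertex view the simplex algorithm exploits.

```python
from itertools import combinations

# Constraint lines a1*x1 + a2*x2 = b, taken (assumption) as <=-constraints,
# plus the axes x1 = 0 and x2 = 0.
lines = [
    (-1.0, 3.0, 12.0),  # -x1 + 3x2 <= 12
    (1.0, 1.0, 8.0),    #  x1 +  x2 <= 8
    (2.0, -1.0, 10.0),  # 2x1 -  x2 <= 10
    (1.0, 0.0, 0.0),    #  x1  = 0 axis
    (0.0, 1.0, 0.0),    #  x2  = 0 axis
]

def intersect(l1, l2):
    """Solve the 2x2 system given by two lines; None if (near-)parallel."""
    (a, b, e), (c, d, f) = l1, l2
    det = a * d - b * c
    if abs(det) < 1e-9:
        return None
    return ((e * d - b * f) / det, (a * f - e * c) / det)

def feasible(p):
    x1, x2 = p
    return (x1 >= -1e-9 and x2 >= -1e-9
            and -x1 + 3 * x2 <= 12 + 1e-9
            and x1 + x2 <= 8 + 1e-9
            and 2 * x1 - x2 <= 10 + 1e-9)

# An optimal solution of an LP is attained at a vertex, i.e. at an
# intersection of constraint lines.
vertices = [p for l1, l2 in combinations(lines, 2)
            if (p := intersect(l1, l2)) is not None and feasible(p)]
best = max(vertices, key=lambda p: 3 * p[0] + 2 * p[1])
print(best, 3 * best[0] + 2 * best[1])  # vertex (6, 2), value 22
```

The optimal vertex (6, 2) lies on the lines x1 + x2 = 8 and 2x1 − x2 = 10, matching the objective line 3x1 + 2x2 = 22 shown in the figure.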
Introduction to the Simplex Algorithm
More details: Algorithmics, Mathematical Programming
R. Vanderbei.
Linear Programming: Foundations and Extensions. Kluwer. 1998.
Integer Program
Integer Program
Total Enumeration
• Class exercise: C1
• Exponential growth!!
LP Relaxation
Relaxation (omitting) of the integrality constraints
Linear programs can be solved efficiently
Yields a bound for the integer program: a lower bound for
minimization models, an upper bound for maximization models
Solution values of the relaxation can hint at solutions of the
integer problem
Branch and Bound
Enumeration method: the search tree is generated by depth-first search
Computing lower and upper bounds in order to cut off parts of
the search tree.
Lower bound: LP relaxation
Upper bound: feasible solution
Branch and Bound
Solve the LP relaxation in the root node.
Let xi be a variable with fractional value x̄i in the LP solution.
The search tree is then branched as follows:
xi ≤ ⌊x̄i⌋ and xi ≥ ⌈x̄i⌉
Branch and Bound for Integer Optimization
Example – Branch-and-Bound
We consider the following integer program:
max −7x1 − 3x2 − 4x3
s.t. x1 + 2x2 + 3x3 − x4 = 8
     3x1 + x2 + x3 − x5 = 5
     x1, x2, x3, x4, x5 ≥ 0
     x1, x2, x3, x4, x5 ∈ Z
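As a quick sanity check (a sketch, not part of the slides), the example's constraints can be encoded as a feasibility test; it confirms that the solution found at the end of the branch-and-bound run, x = (0, 5, 0, 2, 0), is feasible with objective value −15.

```python
def objective(x):
    """Objective of the example IP: max -7*x1 - 3*x2 - 4*x3."""
    x1, x2, x3, x4, x5 = x
    return -7 * x1 - 3 * x2 - 4 * x3

def feasible(x):
    """Check the two equality constraints, nonnegativity, and integrality."""
    x1, x2, x3, x4, x5 = x
    return (x1 + 2 * x2 + 3 * x3 - x4 == 8
            and 3 * x1 + x2 + x3 - x5 == 5
            and all(isinstance(v, int) and v >= 0 for v in x))

# The solution found later in the branch-and-bound example:
x_opt = (0, 5, 0, 2, 0)
print(feasible(x_opt), objective(x_opt))  # True -15
```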
Branch and Bound for Integer Optimization
Example – Branch-and-Bound
The LP relaxation provides the solution:
x3 = x4 = x5 = 0 , x1 = 2/5 , x2 = 19/5
with objective value c∗ = −71/5 (= −14.2). Since all objective coefficients are integer, rounding down gives the upper bound −15.
Branch and Bound for Integer Optimization
[Search tree: root node P0 with U = −∞, c∗ = −15.]
Branch and Bound for Integer Optimization
Example – Branch-and-Bound
Branch on variable x2:
P1 = P0 ∩ {x | x2 ≤ 3}
P2 = P0 ∩ {x | x2 ≥ 4}
P1: Subsequent problem.
An optimal solution for the LP relaxation LP1 is
x4 = x5 = 0 , x1 = 1/2 , x2 = 3 , x3 = 1/2
and c∗ = −29/2 (with upper bound −15).
Branch and Bound for Integer Optimization
[Search tree: P0 (U = −∞, c∗ = −15) branched via x2 ≤ 3 into P1 (U = −∞, c∗ = −15) and via x2 ≥ 4 into P2.]
Branch and Bound for Integer Optimization
Example – Branch-and-Bound
P1 has to be further decomposed:
P3 = P1 ∩ {x | x1 ≤ 0}
P4 = P1 ∩ {x | x1 ≥ 1}
The active problems are: K = {P2, P3, P4}.
Solving LP3 gives
x1 = x5 = 0 , x2 = 3 , x3 = 2 , x4 = 4
and c∗ = −17.
P3 is solved ⇒ the currently best solution has value −17.
Branch and Bound for Integer Optimization
[Search tree: P0 (U = −∞, c∗ = −15); x2 ≤ 3 → P1 (U = −∞, c∗ = −15), x2 ≥ 4 → P2; P1 branched via x1 ≤ 0 into P3 (U = −17) and via x1 ≥ 1 into P4.]
Branch and Bound for Integer Optimization
Example – Branch-and-Bound
Solving LP4 gives:
x4 = 0 , x1 = 1 , x2 = 3 , x3 = 1/3 , x5 = 4/3
and c∗ = −52/3 = −17 1/3.
The resulting upper bound (−18) is worse than the best known solution (−17), so P4 is solved (pruned).
Branch and Bound for Integer Optimization
[Search tree: P0 (U = −∞, c∗ = −15); x2 ≤ 3 → P1 (U = −∞, c∗ = −15), x2 ≥ 4 → P2; P1: x1 ≤ 0 → P3 (U = −17), x1 ≥ 1 → P4 (U = −17, c∗ = −18).]
Branch and Bound for Integer Optimization
• Example – Branch-and-Bound
• Solving LP2 gives c∗ = −43/3 with fractional x1 = 1/3.
P2 is not yet solved; we have to branch:
P5 = P2 ∩ {x | x1 ≤ 0}
P6 = P2 ∩ {x | x1 ≥ 1}
Branch and Bound for Integer Optimization
[Search tree: P0 (U = −∞, c∗ = −15); x2 ≤ 3 → P1 (U = −∞, c∗ = −15), x2 ≥ 4 → P2 (U = −17, c∗ = −15); P1: x1 ≤ 0 → P3 (U = −17), x1 ≥ 1 → P4 (U = −17, c∗ = −18); P2 branched via x1 ≤ 0 into P5 and via x1 ≥ 1 into P6.]
Branch and Bound for Integer Optimization
Example – Branch-and-Bound
Solving LP5 gives
x1 = x3 = x5 = 0 , x2 = 5 , x4 = 2
and c∗ = −15.
This is a new best solution with value −15. P5 is thereby solved.
P6 does not need to be considered anymore, since LP0 shows that no solution better than −15 is possible.
Branch and Bound for Integer Optimization
[Search tree (final): P0 (U = −∞, c∗ = −15); x2 ≤ 3 → P1 (U = −∞, c∗ = −15), x2 ≥ 4 → P2 (U = −17, c∗ = −15); P1: x1 ≤ 0 → P3 (U = −17), x1 ≥ 1 → P4 (U = −17, c∗ = −18); P2: x1 ≤ 0 → P5 (U = −15, c∗ = −15), x1 ≥ 1 → P6 (pruned).]
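The outcome of the worked example can be cross-checked by total enumeration (the earlier warning about exponential growth notwithstanding, this instance is tiny). A sketch, assuming a search box of 0–6 per variable; this safely contains the optimum, since any solution with some xi ≥ 7 has objective value below −15.

```python
from itertools import product

def objective(x1, x2, x3):
    return -7 * x1 - 3 * x2 - 4 * x3

best_val, best_x = None, None
# x4 and x5 are determined by the equality constraints, so it suffices
# to enumerate x1, x2, x3 over a small box.
for x1, x2, x3 in product(range(7), repeat=3):
    x4 = x1 + 2 * x2 + 3 * x3 - 8
    x5 = 3 * x1 + x2 + x3 - 5
    if x4 >= 0 and x5 >= 0:  # both slacks must be nonnegative integers
        val = objective(x1, x2, x3)
        if best_val is None or val > best_val:
            best_val, best_x = val, (x1, x2, x3, x4, x5)

print(best_val, best_x)  # -15 (0, 5, 0, 2, 0)
```

This reproduces exactly the optimum found in node P5 of the branch-and-bound tree.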
Branch and Bound
Multiple variants:
Node selection
Variable selection
Branching decisions
Possible problems: long run-times, large memory requirements
More details: Algorithmics, Mathematical Programming
L. Wolsey. Integer Programming. Wiley. 1998.
Relaxations
1. Constraint relaxations: new feasible solutions may be allowed, but none may be lost!
C2 exercises: Are the following valid constraint relaxations?
2. Continuous relaxations (LP relaxations)
Relaxations
• If an optimal solution to a relaxation is also feasible in the model it relaxes, the solution is optimal in that original model.
• C3 exercises:
Compute by inspection an optimal solution to each of the following relaxations and determine whether we can conclude that the relaxation optimum is optimal in the original model.
Relaxations
• If an optimal solution to a relaxation is also feasible in the model it relaxes, the solution is optimal in that original model.
• More commonly, things are not that simple. BUT:
1) We have the bound value: the optimal value of any relaxation of a maximization model yields an upper bound on the optimal value of the full model; the optimal value of any relaxation of a minimization model yields a lower bound.
2) It may be a good starting point for constructing a good heuristic solution.
3) Proving infeasibility: if a constraint relaxation is infeasible, so is the full model it relaxes.
Rounded solutions from relaxations
• Many relaxations produce optimal solutions that are easily rounded to good feasible solutions.
• C4: Exercises: Round the LP relaxation optimum to an approximate solution for the original model. State the best lower and upper bounds on the optimal integer solution.
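As a tiny illustration of rounding (a hypothetical 0–1 knapsack instance, not from the slides): the LP relaxation of a knapsack is solved greedily by value density, with at most one fractional item; rounding that item down gives a feasible solution, and the two values bracket the integer optimum.

```python
# Hypothetical instance: item values, weights, capacity (illustrative).
values = [10, 7, 4]
weights = [6, 5, 3]
capacity = 10

# LP relaxation of 0-1 knapsack: fill by value/weight density; only the
# last item taken may be fractional (Dantzig's greedy rule).
order = sorted(range(len(values)),
               key=lambda i: values[i] / weights[i], reverse=True)
remaining, lp_value, rounded_value = capacity, 0.0, 0
for i in order:
    take = min(1.0, remaining / weights[i])
    lp_value += take * values[i]
    rounded_value += values[i] if take == 1.0 else 0  # round fraction down
    remaining -= take * weights[i]
    if remaining <= 0:
        break

print(rounded_value, lp_value)  # lower bound 10, upper bound 15.6
```

Here the integer optimum (items 1 and 3, value 14) indeed lies between the rounded lower bound 10 and the LP upper bound 15.6.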
Stronger LP relaxations
HOW?
1. Different formulations of the same problem
Example: E1
2. Choosing the smallest Big-M’s.
C5: Exercise
3. Adding new valid inequality constraints.
Valid inequalities
• A linear inequality is a valid inequality if it holds for all integer feasible solutions to the model.
• To strengthen the relaxation, a valid inequality must cut off some feasible solutions of the current LP relaxation that are not feasible for the full ILP model.
• Also called cutting planes
• Class exercise : C6
Cutting Plane Method
Exponentially many constraints! In mathematical optimization, the cutting-plane method is an umbrella term for optimization methods which iteratively refine a feasible set or objective function by means of linear inequalities, termed cuts.
Cutting Plane Method – Basic Idea
– Start with a small subset of the constraints and solve the resulting LP
– Check whether the obtained solution is feasible for all constraints
– If yes: optimal solution found
– Else: find a violated constraint and add it to the LP
– Iterate until no more constraints need to be added
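The basic loop can be sketched on a deliberately simple made-up instance where the relaxed problem is solvable in closed form: minimize x subject to x ≥ g(i) for 10,000 constraints. Over any subset of constraints the "LP" optimum is just the largest right-hand side in the subset, so each iteration is trivial and only violated constraints are ever added.

```python
# Hypothetical instance: minimize x subject to x >= g(i), i = 0..9999.
def g(i):
    return (i * 37) % 1000  # arbitrary right-hand sides

N = 10_000
active = []          # indices of constraints currently in the relaxation
while True:
    # Solve the relaxed problem over the active subset (closed form here).
    x = max((g(i) for i in active), default=0)
    # Separation step: look for any violated constraint.
    violated = next((i for i in range(N) if g(i) > x), None)
    if violated is None:
        break        # feasible for all constraints => optimal
    active.append(violated)

print(x, len(active))  # optimum 999 reached with only 27 constraints added
```

Only 27 of the 10,000 constraints ever enter the relaxation, which is exactly why cutting-plane methods cope with exponentially many constraints.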
Branch-and-Cut
Truncated cutting plane method returns a solution for an LP
relaxation
Branch-and-bound is combined with the cutting plane method:
Branch-and-Cut
The subproblems are solved using a truncated cutting plane
method
If no further cuts are found and the solution is not integer
feasible, branching is performed
Every new subproblem is again solved using the truncated
cutting plane method
Branch-and-Cut
Compared to classical branch-and-bound, the search tree in a
branch-and-cut approach is usually significantly smaller.
The success of branch-and-cut approaches relies on:
1. the use of strong LP relaxations
2. fast separation algorithms (generating cutting planes)
3. a multitude of algorithmic tricks
VRPTW - Bounds
Lower Bounds
• The network lower bound can be obtained by removing the
capacity and time window constraints.
• LP relaxation
• Better bounds can be obtained using more complex algorithms
such as column generation.
Upper bounds
• Route construction
• Route improvement
• Metaheuristics
Column Generation
The basic idea of column generation originates in the simplex
algorithm.
Only variables with negative reduced cost enter the basis.
It is sufficient to start with a small number of columns.
As long as variables with negative reduced cost can be
determined, they are added to the problem which is
subsequently solved again.
Column Generation
Subproblem: search for variables with negative reduced cost
(pricing problem).
If such a variable/column has been found, it is added to the
master problem, which is re-solved.
This process is repeated as long as new columns with negative
reduced cost are found; if no such column exists, an optimal LP
solution has been determined.
Reduced Cost
The reduced costs are the costs of the variables in the current simplex
tableau.
Only variables with negative reduced cost can therefore improve
the objective value.
The reduced cost of a variable can be computed from the values
of the dual variables.
Given a linear program {min cx | Ax ≤ b, x ≥ 0},
the reduced costs are c − yA, where y are the dual variables of
the linear program.
More details can be found in: R. Vanderbei. Linear Programming:
Foundations and Extensions. Kluwer. 1998.
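On a small made-up instance, the formula c − yA is just a matrix–vector computation (all numbers below are illustrative assumptions, not from the slides):

```python
# Small made-up data: cost row c, constraint matrix A, dual values y.
c = [3.0, 2.0, 5.0]
A = [
    [1.0, 1.0, 2.0],
    [0.0, 2.0, 1.0],
]
y = [1.0, 2.0]  # one dual variable per constraint (assumed values)

# Reduced cost of column j: c[j] - sum_i y[i] * A[i][j]
reduced = [c[j] - sum(y[i] * A[i][j] for i in range(len(A)))
           for j in range(len(c))]
print(reduced)  # [2.0, -3.0, 1.0]

# Columns with negative reduced cost are candidates to enter the basis
# (for a minimization problem).
candidates = [j for j, r in enumerate(reduced) if r < 0]
print(candidates)  # [1]
```

In a column generation scheme, the pricing problem searches for such a negative-reduced-cost column without enumerating all columns explicitly.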
Branch-and-Price
LPs with exponentially many variables can be solved this way, but this
does not yet give us a solution for the integer problem.
An optimal solution for integer problems is obtained by
combining branch and bound with column generation:
Branch-and-Price
Branch-and-Price
Columns are generated until an optimal LP solution is reached.
If this solution is not integral, the problem is divided (branching).
For each subproblem columns are generated again.
This process is repeated until an optimal solution is reached.
Lagrangian Relaxation
Lagrangian relaxation is a technique well suited for problems where the constraints can be divided into two sets:
• “good” constraints, with which the problem is solvable very easily
• “bad” constraints that make it very hard to solve.
The main idea is to relax the problem by removing the “bad” constraints and putting them into the objective function, weighted by the Lagrangian multipliers. Each weight represents a penalty which is added to a solution that does not satisfy the particular constraint.
Lagrangian Relaxation
We assume that optimizing over the set X can be done very easily, whereas adding the “bad” constraints Ax ≥ b makes the problem intractable.
Lagrangian Relaxation
• Therefore, we introduce a dual variable for every constraint of Ax ≥ b. The vector λ ≥ 0 is the vector of dual variables (the Lagrangian multipliers) that has the same dimension as vector b. For a fixed λ ≥ 0, consider the relaxed problem
Lagrangian Relaxation
• By assumption, we can efficiently compute the optimal value for the relaxed problem with a fixed vector λ.
• Lemma (Weak duality). Z(λ) provides a lower bound on the optimal value of the original (minimization) problem.
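The lemma can be checked numerically on a toy minimization problem (all data below are made up; X = {0,1}^3 plays the role of the "easy" set, and a·x ≥ b is the single "bad" constraint):

```python
from itertools import product

# Toy minimization problem (illustrative assumption):
#   min c.x over x in X = {0,1}^3, subject to a.x >= b ("bad" constraint)
c = [4, 3, 5]
a = [2, 1, 3]
b = 4

X = list(product([0, 1], repeat=3))
dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))

# True optimum by enumeration over the feasible set.
z_star = min(dot(c, x) for x in X if dot(a, x) >= b)

# Lagrangian: Z(lam) = min over x in X of c.x + lam*(b - a.x), lam >= 0.
def Z(lam):
    return min(dot(c, x) + lam * (b - dot(a, x)) for x in X)

for lam in [0.0, 0.5, 1.0, 1.5, 2.0]:
    assert Z(lam) <= z_star  # weak duality: every Z(lam) is a lower bound

print(z_star, max(Z(lam) for lam in [0.0, 0.5, 1.0, 1.5, 2.0]))
```

Here z∗ = 8 while the best tested multiplier (λ = 2) already gives the bound Z(2) = 7; maximizing Z(λ) over λ ≥ 0 is the Lagrangian dual problem treated next.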
Solving the Lagrangian Dual
Most-used techniques:
– Subgradient optimization
– Multiplier adjustment
![Page 51: Week 7 - Exact Solving Techniques](https://reader031.vdocuments.net/reader031/viewer/2022020111/563db790550346aa9a8c3db2/html5/thumbnails/51.jpg)
Solving the Lagrangian Dual
Sign restrictions on multipliers:
– If the relaxed constraint has form ≥, the multiplier is ≤ 0 for a maximization model and ≥ 0 for a minimization model.
– If the relaxed constraint has form ≤, the multiplier is ≥ 0 for a maximization model and ≤ 0 for a minimization model.
– Class exercise: C7
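The subgradient optimization method can be sketched for a minimization model with a single relaxed ≥-constraint (so λ ≥ 0, matching the sign rules above). All data and the diminishing step size 1/k are illustrative assumptions: the multiplier is moved along the violation b − a·x of the relaxed constraint, and the best lower bound seen so far is kept.

```python
from itertools import product

# Toy data (illustrative): min c.x over x in {0,1}^3 with a.x >= b relaxed.
c = [4, 3, 5]
a = [2, 1, 3]
b = 4
X = list(product([0, 1], repeat=3))
dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))

def lagrangian_argmin(lam):
    """x minimizing c.x + lam*(b - a.x) over the 'easy' set X."""
    return min(X, key=lambda x: dot(c, x) + lam * (b - dot(a, x)))

lam, best_bound = 0.0, float("-inf")
for k in range(1, 101):
    x = lagrangian_argmin(lam)
    best_bound = max(best_bound, dot(c, x) + lam * (b - dot(a, x)))
    subgrad = b - dot(a, x)            # violation of the relaxed constraint
    lam = max(0.0, lam + subgrad / k)  # diminishing step; keep lam >= 0

print(best_bound)  # lower bound approaching the dual optimum
```

On this instance the bound climbs toward the dual optimum 7 (the integer optimum is 8), illustrating both the method and the duality gap that Lagrangian bounds can leave.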
Wrap-up
• Suppose that we have some problem instance of a combinatorial optimisation problem and further suppose that it is a minimisation problem.
• We draw a vertical line representing value (the higher up this line the higher the value) then somewhere on this line is the optimal solution to the problem we are considering.
Wrap-up
• Exactly where on this line this optimal solution lies we do not know, but it must be somewhere!
• Conceptually therefore this optimal solution value divides our value line into two:
• above the optimal solution value are upper bounds, values which are above the (unknown) optimal solution value
• below the optimal solution value are lower bounds, values which are below the (unknown) optimal solution value.
Wrap-up
• In order to discover the optimal solution value, any algorithm that we develop must address both these issues, i.e. it must concern itself both with upper bounds and with lower bounds.
• In particular the quality of these bounds is important to the computational success of any algorithm:
• we like upper bounds that are as close as possible to the optimal solution, i.e. as small as possible
• we like lower bounds that are as close as possible to the optimal solution, i.e. as large as possible.
Wrap-up: Upper bounds
Typically upper bounds are found by searching for feasible solutions to the problem, that is, solutions which satisfy the constraints of the problem.
A number of well-known general techniques are available to find feasible solutions to combinatorial optimisation problems, for example:
• interchange
• metaheuristics:
• tabu search
• simulated annealing
• variable neighbourhood search
• genetic algorithms (population heuristics).
In addition, for any particular problem, we may well have techniques which are specific to the problem being solved.
Wrap-up: Lower bounds
• One well-known general technique which is available to find lower bounds is linear programming relaxation. In linear programming (LP) relaxation we take an integer (or mixed-integer) programming formulation of the problem and relax the integrality requirement on the variables.
• This gives a linear program which can be solved optimally using a standard algorithm (simplex or interior point)
• The solution value obtained for this linear program gives a lower bound on the optimal solution to the original problem.
• Another well-known (and well-used) technique which is available to find lower bounds is Lagrangian relaxation.