
Linear Programming Projection Theory: Part 2

Chapter 2 (57-80)

University of Chicago Booth School of Business

Kipp Martin

September 26, 2017


Outline

Duality Theory

Dirty Variables

From Here to Infinity

Complementary Slackness

Sensitivity Analysis

Degeneracy

Consistency Testing

Optimal Value Function

Summary: Key Results


Duality Theory

We are solving the linear programming problem using projection.

min{c^T x | Ax ≥ b}

Rewrite as the system:

z0 − c^T x ≥ 0

Ax ≥ b

Project out the x variables and get

z0 ≥ d_k, k = 1, . . . , q

0 ≥ d_k, k = q + 1, . . . , r


Duality Theory

Since

z0 ≥ d_k, k = 1, . . . , q

0 ≥ d_k, k = q + 1, . . . , r

the optimal value of the objective function is

z0 = max{d_k | k = 1, . . . , q}

Key Idea: Any of the d_k provides a lower bound on the optimal objective function value, since the optimal value is the maximum over all of them.
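For instance, with hypothetical values: if projection produced z0 ≥ 3, z0 ≥ 7, and z0 ≥ −2, then each of 3, 7, and −2 is a valid lower bound, and the optimal value is the largest of them, z0 = 7.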


Duality Theory

Observation: We projected out the x variables from the system

z0 − c^T x ≥ 0

Ax ≥ b

and got

z0 ≥ d_k, k = 1, . . . , q

0 ≥ d_k, k = q + 1, . . . , r

If we let u be the dual multipliers on Ax ≥ b and u_0 the multiplier on z0 − c^T x ≥ 0, we have (u^k)^T b = d_k and

(u^k)^T A − u_0^k c = 0

We take, without loss of generality, u_0^k = 1.


Duality Theory

Let me say this again because it is so important. Each of the

z0 ≥ d_k, k = 1, . . . , q

constraints in the projection results from aggregating c^T x with Ax ≥ b, where u^k is a vector of multipliers on Ax ≥ b such that

(u^k)^T A = c


Duality Theory

Thus any nonnegative u^k with A^T u^k = c provides a lower bound of

d_k = b^T u^k

on the optimal objective function value. This is known as weak duality.

Lemma 2.28 (Weak Duality) If x is a solution to the system Ax ≥ b and u ≥ 0 is a solution to A^T u = c, then c^T x ≥ b^T u.
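The one-line argument behind the lemma: u ≥ 0 and Ax ≥ b give u^T(Ax) ≥ u^T b, and A^T u = c gives c^T x = (A^T u)^T x = u^T(Ax) ≥ b^T u.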


Duality Theory

Since the minimum value of the linear program is given by z0 = max{d_k | k = 1, . . . , q}, we need to find a nonnegative u such that A^T u = c and b^T u is as large as possible.

Finding the largest value of b^T u is also a linear program:

max{b^T u | A^T u = c, u ≥ 0}.

The linear program min{c^T x | Ax ≥ b} is the primal linear program.

The linear program max{b^T u | A^T u = c, u ≥ 0} that generates the multipliers for the lower bounds is the dual linear program.

The x variables are the primal variables, and the u variables are the dual variables or dual multipliers.


Duality Theory

Theorem 2.29 (Strong Duality) If there is an optimal solution to the primal linear program

min{c^T x | Ax ≥ b},

then there is an optimal solution to the dual linear program max{b^T u | A^T u = c, u ≥ 0} and the optimal objective function values are equal.

Corollary 2.30 If the primal linear program min{c^T x | Ax ≥ b} is unbounded, then the dual linear program max{b^T u | A^T u = c, u ≥ 0} is infeasible.
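As a numerical illustration (my own sketch, not part of the text), the primal and dual can both be handed to scipy.optimize.linprog. Here the small LP min{x1 + x2 | x1 + x2 ≥ 1, x1 ≥ 0, x2 ≥ 0} that appears later in these slides is used, treating all three constraints as rows of A. linprog minimizes subject to ≤ constraints, so the primal rows are negated and the dual maximum is obtained by minimizing −b^T u.

import numpy as np
from scipy.optimize import linprog

A = np.array([[1.0, 1.0],     # x1 + x2 >= 1
              [1.0, 0.0],     # x1 >= 0
              [0.0, 1.0]])    # x2 >= 0
b = np.array([1.0, 0.0, 0.0])
c = np.array([1.0, 1.0])

# Primal: min c^T x  s.t.  Ax >= b   (negate rows to fit linprog's A_ub x <= b_ub form)
primal = linprog(c, A_ub=-A, b_ub=-b, bounds=(None, None), method="highs")

# Dual:   max b^T u  s.t.  A^T u = c, u >= 0   (maximize by minimizing -b^T u)
dual = linprog(-b, A_eq=A.T, b_eq=c, bounds=(0, None), method="highs")

print(primal.fun, -dual.fun)   # both print 1.0, as strong duality predicts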


Duality Theory

If the primal problem is

min c^T x

Ax ≥ b

the dual problem is

max b^T u

A^T u = c

u ≥ 0

What is the dual of the dual?


Duality Theory

If the primal is

min c^T x

Ax ≥ b

x ≥ 0

What is the dual?

What is the dual of the dual?


Duality Theory

If the primal is

min c^T x

Ax = b

x ≥ 0

What is the dual?

What is the dual of the dual?


Duality Theory

DO NOT MEMORIZE PRIMAL DUAL PAIRS!

DO NOT MEMORIZE PRIMAL DUAL PAIRS!

DO NOT MEMORIZE PRIMAL DUAL PAIRS!

DO NOT MEMORIZE PRIMAL DUAL PAIRS!

DO NOT MEMORIZE PRIMAL DUAL PAIRS!

DO NOT MEMORIZE PRIMAL DUAL PAIRS!

DO NOT MEMORIZE PRIMAL DUAL PAIRS!

You get the idea.


Duality Theory

Table: Primal-Dual Relationships

Primal                               Dual

inequality primal constraint         nonnegative dual variable
equality primal constraint           unrestricted dual variable
nonnegative primal variable          inequality dual constraint
unrestricted primal variable         equality dual constraint
min primal objective function        max dual objective function
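As an illustration of how the table is read (standard LP duality, stated here only for reference): for the primal min{c^T x | Ax = b, x ≥ 0}, the equality constraints give unrestricted dual variables and the nonnegative primal variables give inequality dual constraints, so the dual is max{b^T u | A^T u ≤ c} with u unrestricted. Reading the table in the other direction turns this dual back into the original primal.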


Duality Theory

Proposition 2.31 If the linear program min{c^T x | Ax ≥ b} has an optimal solution, then applying projection to the system z0 − c^T x ≥ 0, Ax ≥ b gives all of the extreme points of the dual polyhedron {u | A^T u = c, u ≥ 0} and all of the extreme rays of the dual recession cone {u | A^T u = 0, u ≥ 0}.

You are going to basically prove this for homework.


Where We Are Headed

We want to solve problems with special structure! Real problems have special structure! One such structure is

min c^T x + f^T y

s.t. Ax + By ≥ b

Later we will take advantage of special structure in the A matrix, project out the x variables, and solve a problem in only the y variables.


Where We Are Headed

z0 − c^T x − f^T y ≥ 0

Ax + By ≥ b

Rewrite this as

z0 − c^T x ≥ f^T y

Ax ≥ b − By

Now project out x

z0 ≥ f^T y + (u^k)^T (b − By), k = 1, . . . , q

0 ≥ (u^k)^T (b − By), k = q + 1, . . . , r


Where We Are Headed

Here is the new formulation

z0 ≥ f^T y + (u^k)^T (b − By), k = 1, . . . , q

0 ≥ (u^k)^T (b − By), k = q + 1, . . . , r

Any problems with this?

What is the fix?

See the homework for a premonition.


Where We Are Headed

We will also use duality to generate bounds in enumerationalgorithms.


Dirty Variables (See pages 43-46 in the text.)

Consider the system

x1 − x2 ≥ 1

x1 + x2 ≥ 1

• Can we eliminate x1 using Fourier-Motzkin elimination?

• What is the projection of the polyhedron onto the x2 space? (A small computational sketch follows below.)
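A minimal computational sketch (my own illustration, not code from the text): classify a variable using the index sets H+(k), H−(k), H0(k) defined on the next slide, and perform one standard Fourier-Motzkin step when the variable is clean. The function names and the row-wise storage of Ax ≥ b are my assumptions.

import numpy as np

def index_sets(A, k):
    """H+(k), H-(k), H0(k): rows of A with positive, negative, zero coefficient on variable k."""
    col = A[:, k]
    return np.where(col > 0)[0], np.where(col < 0)[0], np.where(col == 0)[0]

def fm_eliminate(A, b, k):
    """One Fourier-Motzkin step on Ax >= b, eliminating variable k (assumed clean)."""
    Hp, Hm, H0 = index_sets(A, k)
    rows = [(A[i].copy(), b[i]) for i in H0]
    for i in Hp:                                   # pair every 'positive' row with every 'negative' row
        for j in Hm:
            ui, uj = 1.0 / A[i, k], -1.0 / A[j, k]  # nonnegative multipliers that cancel variable k
            rows.append((ui * A[i] + uj * A[j], ui * b[i] + uj * b[j]))
    A_new = np.array([r for r, _ in rows])
    b_new = np.array([rhs for _, rhs in rows])
    return np.delete(A_new, k, axis=1), b_new

# The example above: x1 - x2 >= 1, x1 + x2 >= 1.
A = np.array([[1.0, -1.0], [1.0, 1.0]])
b = np.array([1.0, 1.0])

Hp, Hm, H0 = index_sets(A, 0)
print(len(Hm) == 0)            # True: x1 has no negative coefficients, so x1 is dirty
print(fm_eliminate(A, b, 1))   # eliminating the clean variable x2 gives 2 x1 >= 2, i.e. x1 >= 1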


Dirty Variables (See pages 43-46 in the text.)

Recall

H+(k) := {i ∈ I | a_k(i) > 0}
H−(k) := {i ∈ I | a_k(i) < 0}
H0(k) := {i ∈ I | a_k(i) = 0}     (1)

Variable k is a dirty variable if H+(k) is empty and H−(k) is not empty, or H+(k) is not empty and H−(k) is empty.


Dirty Variables

In general, what is the projection of the system

x1 + a_i2 x2 + · · · + a_in xn ≥ b_i, i = 1, . . . , m,

into R^{n−1}?

What is the projection of the system

x1 + a_i2 x2 + · · · + a_in xn ≥ b_i, i = 1, . . . , m_1

a_i2 x2 + · · · + a_in xn ≥ b_i, i = m_1 + 1, . . . , m

into R^{n−1}?


Dirty Variables

If, at some point:

x1 + a_i2 x2 + · · · + a_in xn ≥ b_i, i = 1, . . . , m,

Two implications:

• The projection into R^{n−1} is all of R^{n−1} (the entire subspace)

• The only solution to the corresponding u^T A = 0 with u ≥ 0 is u = 0

The above is true if and only if there is a strictly positive (negative) coefficient in every row at some point in the projection process.


Dirty Variables

If there is a finite number of constraints, dirty variables are not too bad: we just drop the constraints with the dirty variables.

However, if there is an infinite number of constraints, dirty variables hide dirty secrets and can be obscene.


From Here to Infinity

Theorem: If

Γ = {x : a_1(i)x1 + a_2(i)x2 + · · · + a_n(i)xn ≥ b(i) for i ∈ I = {1, 2, . . . , m}}

then Fourier-Motzkin elimination gives

P(Γ; x1) := {(x2, x3, . . . , xn) ∈ R^{n−1} : ∃ x1 ∈ R s.t. (x1, x2, . . . , xn) ∈ Γ}.   (2)

even when x1 is dirty.

What happens when I is not a finite index set? That is, when Γ is a closed convex set, but not necessarily a polyhedron.


From Here to Infinity


From Here to Infinity

Consider the system

i·x1 + x2/i ≥ 1, i = 1, . . . , N

x1 ≥ −1
x2 ≥ −1

(3)

Project out x2 (it is a dirty variable) and get

x1 ≥ −1


From Here to Infinity


The Road to Infinity

Now what about (note the ∞)

i·x1 + x2/i ≥ 1, i = 1, . . . ,∞

x1 ≥ −1
x2 ≥ 0

(4)

Project out x2 (it is a dirty variable) and get

x1 ≥ −1

Is this correct?


From Here to Infinity

[Figure: the system i·x1 + x2/i ≥ 1 (i = 1, . . . ,∞), x1 ≥ −1, x2 ≥ 0; dropping the dirty variable x2 suggests the projection x1 ≥ −1, but the figure shows the true projection is x1 ≥ 0.]

Typo: the true projection is x1 > 0, not x1 ≥ 0.


From Here to Infinity


From Here to Infinity

Brief Review: Solve the finite linear programming problem using projection.

min{c^T x | Ax ≥ b}

Rewrite as the system:

z0 − c^T x ≥ 0

Ax ≥ b

Project out the x variables and get

z0 ≥ d_k, k = 1, . . . , q

0 ≥ d_k, k = q + 1, . . . , r

The optimal value of the objective function is

v(LP) = v(LPD) = max{d_k | k = 1, . . . , q}


From Here to Infinity

Fourier-Motzkin algorithm modification.

Do Not drop dirty variables.

Pass over them and eliminate only the clean variables

For a formal statement see pages 6-7 in:

http://www.ams.jhu.edu/~abasu9/papers/fm-paper.pdf


From Here to Infinity


From Here to Infinity

Consider

0 ≥ b(h), h ∈ I_1

a_ℓ(h)x_ℓ + · · · + a_n(h)x_n ≥ b(h), h ∈ I_2

z ≥ b(h), h ∈ I_3

z + a_ℓ(h)x_ℓ + · · · + a_n(h)x_n ≥ b(h), h ∈ I_4

(5)

Assume (z, x_ℓ, . . . , x_n) is a feasible solution to the projected system above.


From Here to Infinity

Observation: For the feasible (z, x_ℓ, . . . , x_n), we must have

z ≥ sup{b(h) − a_ℓ(h)x_ℓ − · · · − a_n(h)x_n : h ∈ I_4}

and

z ≥ sup{b(h) : h ∈ I_3}


From Here to Infinity

Lemma: If z is an optimal solution value to the linear program, then

z ≥ lim_{δ→∞} sup{ b(h) − δ Σ_{k=ℓ}^{n} |a_k(h)| : h ∈ I_4 }

Proof: the key idea is that if (x_ℓ, . . . , x_n) is a feasible point in the projected space, then the point defined by

x_k = δ if x_k > 0, and x_k = −δ if x_k < 0, for k = ℓ, . . . , n,

where

δ = max{|x_k| : k = ℓ, . . . , n},

is also a feasible point in the projected space.


From Here to Infinity

Theorem: If (SILP) is feasible, then the optimal solution value is

v(SILP) = max{S, L}

where

S = sup{b(h) : h ∈ I_3}

and

L = lim_{δ→∞} sup{ b(h) − δ Σ_{k=ℓ}^{n} |a_k(h)| : h ∈ I_4 }

Corollary: When the cardinality of I is finite, L = −∞ and v(SILP) = S.


From Here to Infinity


Complementary Slackness

Motivation: Most algorithms work by moving from point to point until a set of optimality conditions is satisfied.

Generic Algorithm:

Initialization: Start with a point that satisfies a subset of the optimality conditions.

Iterative Step: Move to a better point.

Termination: Stop when you have satisfied (to numerical tolerances) the optimality conditions.


Complementary Slackness

Theorem 2.33 If x is a feasible solution to the primal problem min{c^T x | Ax ≥ b} and u is a feasible solution to the dual problem max{b^T u | A^T u = c, u ≥ 0}, then x, u are primal-dual optimal if and only if

(b − Ax)^T u = 0.   (primal complementary slackness)

Corollary 2.34 If

Ax ≥ b   (6)

A^T u = c, u ≥ 0   (7)

(b − Ax)^T u = 0   (8)

then x is optimal to the primal problem min{c^T x | Ax ≥ b} and u is optimal to the dual problem max{b^T u | A^T u = c, u ≥ 0}.
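A quick numerical check of (6)-(8) (my own sketch; the LP and the candidate points are those of the small example a few slides below, min{x1 + x2 | x1 + x2 ≥ 1, x ≥ 0}):

import numpy as np

A = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])   # x1 + x2 >= 1, x1 >= 0, x2 >= 0
b = np.array([1.0, 0.0, 0.0])
c = np.array([1.0, 1.0])

x = np.array([1.0, 0.0])        # candidate primal point
u = np.array([1.0, 0.0, 0.0])   # candidate dual point

print(np.all(A @ x >= b - 1e-9))                 # (6) primal feasibility
print(np.all(u >= 0), np.allclose(A.T @ u, c))   # (7) dual feasibility
print(abs((b - A @ x) @ u) <= 1e-9)              # (8) complementary slackness
# All three print True, so by Corollary 2.34 x and u are optimal; indeed c^T x = b^T u = 1.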


Complementary Slackness

Condition (8) is

(b − Ax)^T u = 0

What is the economic interpretation?

What is the interpretation in terms of projection?


Complementary Slackness

Theorem 2.35 (Strict Complementary Slackness) If there is an optimal solution to the linear program

min{c^T x | (a^i)^T x ≥ b_i, i = 1, . . . , m},

then there is an optimal primal-dual pair (x, u) such that

(b_i − (a^i)^T x) < 0 ⇒ u_i = 0 and (b_i − (a^i)^T x) = 0 ⇒ u_i > 0, i = 1, . . . , m.   (9)

Corollary 2.37 If x, u is an optimal primal-dual pair for the linear program min{c^T x | Ax ≥ b}, but is not strictly complementary, then there are either alternative primal optima or alternative dual optima.

This is not good. More on this later.


Complementary Slackness

The linear program is min{x1 + x2 | x1 + x2 ≥ 1, x1, x2 ≥ 0}. Solve the linear program using projection.

z0 − x1 − x2 ≥ 0   (E0)

x1 + x2 ≥ 1   (E1)
x1 ≥ 0   (E2)
x2 ≥ 0   (E3)

⇒

z0 ≥ 1   (E0) + (E1)
z0 ≥ 0   (E0) + (E2) + (E3)
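Reading off the dual information: the bound z0 ≥ 1 comes from the multipliers u = (1, 0, 0) on (E1), (E2), (E3), and the bound z0 ≥ 0 from u = (0, 1, 1). So z0 = 1, the optimal dual solution is u = (1, 0, 0), and complementary slackness forces any optimal primal x to satisfy x1 + x2 = 1.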


Complementary Slackness

Proof Motivation:

Idea One: It is sufficient to show that strict complementary slackness must hold for the first constraint.

Why is showing the result for one constraint sufficient?

Idea Two: Apply Corollary 2.21 on page 52 of the text.


Complementary Slackness

Corollary (2.21)

If there is no solution to the system

A_1 x > b^1, A_2 x ≥ b^2

then there are nonnegative u^1, u^2 such that

A_1^T u^1 + A_2^T u^2 = 0, (b^1)^T u^1 + (b^2)^T u^2 ≥ 0, u^1 ≠ 0   (10)

has a solution, or

A_1^T u^1 + A_2^T u^2 = 0, (b^1)^T u^1 + (b^2)^T u^2 > 0,   (11)

has a solution. Conversely, if there is no solution to either (10) or (11), there is a solution to A_1 x > b^1, A_2 x ≥ b^2.


Complementary Slackness

The complementary slackness results of this section are stated in terms of the symmetric primal-dual pair.

If

Ax ≥ b, x ≥ 0   (12)

A^T u ≤ c, u ≥ 0   (13)

(b − Ax)^T u = 0   (14)

(c − A^T u)^T x = 0   (15)

then x is optimal to the primal problem min{c^T x | Ax ≥ b, x ≥ 0} and u is optimal to the dual problem max{b^T u | A^T u ≤ c, u ≥ 0}.


Complementary Slackness

Conditions (12) - (15) are called optimality conditions.

• Condition (12) – primal feasibility

• Condition (13) – dual feasibility

• Conditions (14)-(15) – complementary slackness

Simplex Algorithm: Enforce conditions (12), (14), and (15) and iterate to satisfy (13), then stop.

Nonlinear Programming: The Karush-Kuhn-Tucker conditions generalize this to nonlinear programming.

Constructing a primal-dual pair with equal objective function values is a very useful proof technique.


Sensitivity Analysis

People are interested not only in primal solutions, but also in dual information.

What is the practical significance (or utility) of knowing the optimal values of the dual variables?

Solvers not only give the optimal dual information, but they also provide range analysis.

Let’s look at some actual solver output.


Sensitivity Analysis

Consider the linear program we have been working with

MIN 2 X1 - 3 X2

SUBJECT TO

2) X2 >= 2

3) 0.5 X1 - X2 >= - 8

4) - 0.5 X1 + X2 >= - 3

5) X1 - X2 >= - 6

END


Sensitivity Analysis (LINDO/LINGO)

OBJECTIVE FUNCTION VALUE

1) -22.00000

VARIABLE VALUE REDUCED COST

X1 4.000000 0.000000

X2 10.000000 0.000000

ROW SLACK OR SURPLUS DUAL PRICES

2) 8.000000 0.000000

3) 0.000000 -2.000000

4) 11.000000 0.000000

5) 0.000000 -1.000000
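The same numbers can be reproduced with other tools; here is a sketch using scipy.optimize.linprog (my code, not LINDO/LINGO output). LINDO treats variables as nonnegative by default, so the same bounds are used, and scipy reports dual information through res.ineqlin.marginals with its own ≤-constraint sign convention.

import numpy as np
from scipy.optimize import linprog

c = np.array([2.0, -3.0])
A = np.array([[ 0.0,  1.0],   # row 2:        x2 >=  2
              [ 0.5, -1.0],   # row 3:  0.5 x1 - x2 >= -8
              [-0.5,  1.0],   # row 4: -0.5 x1 + x2 >= -3
              [ 1.0, -1.0]])  # row 5:        x1 - x2 >= -6
b = np.array([2.0, -8.0, -3.0, -6.0])

res = linprog(c, A_ub=-A, b_ub=-b, bounds=(0, None), method="highs")
print(res.fun, res.x)            # -22.0  [ 4. 10.]
print(res.ineqlin.marginals)     # expected to line up with the DUAL PRICES column above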


Sensitivity Analysis (LINDO/LINGO)

Terminology: You will see

• Dual price

• Dual variable/value

• Shadow price


Sensitivity Analysis (LINDO/LINGO)

RANGES IN WHICH THE BASIS IS UNCHANGED:

OBJ COEFFICIENT RANGES

VARIABLE CURRENT ALLOWABLE ALLOWABLE

COEF INCREASE DECREASE

X1 2.000000 1.000000 0.500000

X2 -3.000000 1.000000 1.000000

RIGHTHAND SIDE RANGES

ROW CURRENT ALLOWABLE ALLOWABLE

RHS INCREASE DECREASE

2 2.000000 8.000000 INFINITY

3 -8.000000 2.000000 INFINITY

4 -3.000000 11.000000 INFINITY

5 -6.000000 INFINITY 2.000000


Sensitivity Analysis (Excel Solver)


Sensitivity Analysis: Allowable Increase/Decrease

Given an optimal dual solution u, the allowable increase (decrease) on the right hand side of the constraint

(a^k)^T x ≥ b_k

is the maximum increase (decrease) of the right hand side b_k such that u is still an optimal dual solution (or the primal becomes infeasible).

The dual values and the allowable increase/decrease are a natural by-product of projection.


Sensitivity Analysis

The result of projection on the linear program is

z0 ≥ −22 (E0) + 2(E2) + (E4) (16)

z0 ≥ −24 (E0) + 3(E2) + 0.5(E5) (17)

z0 ≥ −44 (E0) + 4(E2) + 2(E3) + (E4) (18)

z0 ≥ −35 (E0) + 4(E2) + (E3) + 0.5(E5) (19)

z0 ≥ −30 (E0) + (E1) + 4(E2) (20)

z0 ≥ −32 (E0) + 4(E2) + (E6) (21)

0 ≥ −11 (E2) + (E3) (22)


Sensitivity Analysis

Questions:

• Why is the dual price on row (E2) equal to −2?

• Why is the allowable increase on row (E1) equal to 8?

• Why is the allowable decrease on row (E4) equal to 2?


Sensitivity Analysis – 100% Rule

In calculating the allowable increase/decrease for the linear programming right hand sides, we assume that only one right hand side changes while the others are held constant.

What if we change more than one b?

If the sum, over all right hand sides in the linear program min{c^T x | Ax ≥ b}, of the percentage of the allowable increase (decrease) that is used does not exceed 100%, then the current optimal dual solution remains optimal.
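For example, with hypothetical numbers: if the allowable increase on b_1 is 8 and on b_2 is 2, then raising b_1 by 4 (50% of its allowable increase) and b_2 by 0.5 (25%) uses 75% in total, so the current optimal dual solution is still optimal.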

Proof: Homework!


Degeneracy

People are interested in knowing if there are alternative optima. Why?

If an optimal solution is NOT strictly complementary, then we have alternative primal or dual solutions. Why?

If an optimal solution is NOT strictly complementary, then the range analysis may not be valid. Argle Bargle!

Can we tell from a solver output if we have primal or dual alternative optima?

It depends. Degenerate solutions mess things up.


Degeneracy

Closely related to alternative dual optima is the concept of primal degeneracy.

A solution x of the primal linear program min{c^T x | Ax ≥ b}, where A is an m × n matrix, is primal degenerate if the submatrix of A defined by the binding constraints ((a^i)^T x = b_i, where a^i is row i of A) has rank strictly less than the number of binding constraints.
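A small sketch of this definition in code (my own illustration); the constraints and the point are those of the example developed over the next few slides, with the right hand side of (E3) changed to −4:

import numpy as np

def is_primal_degenerate(A, b, x, tol=1e-9):
    binding = np.where(np.abs(A @ x - b) <= tol)[0]           # rows with (a^i)^T x = b_i
    return len(binding) > np.linalg.matrix_rank(A[binding])   # more binding rows than their rank?

A = np.array([[1, -1], [-1, -1], [0, -1], [-2, 1], [1, 0], [0, 1]], dtype=float)  # (E1)-(E6)
b = np.array([-3, -7, -4, -2, 0, 0], dtype=float)                                 # RHS of (E3) set to -4
print(is_primal_degenerate(A, b, np.array([3.0, 4.0])))   # True: three binding rows, rank 2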

Primal degeneracy can also lead to incorrect values of the allowable increase/decrease. Argh!!!!

Some authors define primal degeneracy to be alternative dual optima, but this is not correct.


Degeneracy

Consider the linear program

min −x2           (E0)
x1 − x2 ≥ −3      (E1)
−x1 − x2 ≥ −7     (E2)
−x2 ≥ −2          (E3)
−2x1 + x2 ≥ −2    (E4)
x1 ≥ 0            (E5)
x2 ≥ 0            (E6)


Degeneracy

Applying projection to this linear program and eliminating the x variables gives

z0 ≥ −2     (E0) + (E3)
z0 ≥ −5     (E0) + (1/2)(E1) + (1/2)(E2)
z0 ≥ −8     (E0) + 2(E1) + (E4)
z0 ≥ −7     (E0) + (E2) + (E5)
0 ≥ −2      (E3) + (E6)
0 ≥ −8      2(E1) + (E4) + (E6)
0 ≥ −7      (E2) + (E5) + (E6)
0 ≥ −10     (E1) + (E2) + 2(E6)
0 ≥ −4      (E3) + (E4) + 2(E5)
0 ≥ −4      (E1) + (E2) + 2(E4) + 2(E5)
0 ≥ −10     2(E1) + 2(E4) + 2(E5)
0 ≥ −9      (E2) + (E4) + 3(E5)


Degeneracy

Question: What is the allowable decrease on constraint (E3)?

Answer: ??????????

Let’s look at what we get from an LP code.


Degeneracy

The LINGO solution (RHS of E3 is -2)


Degeneracy

The LINGO Range analysis (RHS of E3 is -2)


Degeneracy

What is the allowable decrease on (E3) according to LINGO?

What is the allowable decrease on (E3) according to projection?

Argle bargle!!!!!!!!! What happened? Do we have strict complementary slackness?

Let's look at a picture. In the picture we have decreased the RHS to −4 from −2, that is, the constraint is x2 ≤ 4.


Degeneracy

[Figure: the feasible region defined by (E1)-(E4) in the (x1, x2) plane, with the point (3.0, 4.0) marked.]


Degeneracy

The LINGO solution (RHS of E3 is -4)


Degeneracy

The LINGO Range analysis (RHS of E3 is -4)


Degeneracy: Range Analysis (RHS of E3 is −4)

The primal system (tight constraints):

−x1 − x2 = −7 (E2)

−x2 = −4 (E3)

−2x1 + x2 = −2 (E4)

The dual system:

−u2 − 2u4 = 0

−u2 − u3 + u4 = −1

u2, u3, u4 ≥ 0

A unique dual solution, but alternative primal optima.

u2 = 0 u3 = 1 u4 = 0


Degeneracy

The LINGO solution (RHS of E3 is -5)


Degeneracy

The LINGO Range analysis (RHS of E3 is -5)


Degeneracy: Range Analysis (RHS of E3 is −5)

The primal system (tight constraints):

x1 − x2 = −3 (E1)

−x1 − x2 = −7 (E2)

−x2 = −5 (E3)

The dual system:

u1 − u2 = 0

−u1 − u2 − u3 = −1

u1, u2, u3 ≥ 0

Alternative dual solutions, but a unique primal optimum (x1 = 2, x2 = 5).

Substituting u2 = u1 into the second equation gives −2u1 − u3 = −1, so there is a one-parameter family of optimal dual solutions.


Range Analysis Summary

[Figure: the feasible region defined by (E1)-(E4) in the (x1, x2) plane, with the point (3.0, 4.0) marked.]


Range Analysis Summary

Constraint      Solution            Degenerate   Alternate Primal   Alternate Dual
−x2 ≥ −2        x1 = 0, x2 = 2      No           Yes                No
−x2 ≥ −3        x1 = 0, x2 = 3      Yes          Yes                No
−x2 ≥ −4        x1 = 3, x2 = 4      Yes          Yes                No
−x2 ≥ −5        x1 = 2, x2 = 5      Yes          No                 Yes


Range Analysis Summary

Here is something to think about while lying in bed at night.

Assume there is an optimal solution to the linear program such that

• the solution is not strictly complementary

• the solution is not degenerate

What can we conclude about alternative optimal primal and alternative optimal dual solutions?

How can Simplex spot a degenerate solution?


Degeneracy – Key Take Aways

Here is the problem: simplex codes calculate the allowable increase/decrease as the amount a right hand side can change before the set of positive primal variables and constraints with positive slack changes.

Stated another way: simplex codes calculate the allowable increase/decrease as the amount a right hand side can change before there is a change in the basis.

This may not equal the true allowable increase (i.e., the amount before the values of the optimal dual variables change).

There may be several basis changes before the dual solution changes.


Degeneracy – Key Take Aways

• If there is NOT strict complementary slackness, we have alternative primal or dual optima.

• If there is NOT strict complementary slackness, range analysis may be misleading.

• If there is primal degeneracy, knowing which kind of alternative optima (primal or dual) we have is difficult.


Consistency Testing

Objective: It is desirable to characterize for which b ∈ R^m the set {x ∈ R^n | Ax ≥ b} is nonempty, or consistent.

Let A be an m by n matrix.

We call V : R^m → R an LP-consistency tester for A if for any b ∈ R^m, V(b) ≤ 0 if and only if {x ∈ R^n : Ax ≥ b} is nonempty.


Consistency Testing

The beauty of projection is that it provides an LP-consistency tester.

Projecting out the x variables in the system Ax ≥ b gives

0 ≥ b^T u^k, k = 1, . . . , q

Define V(b) by

V(b) := max{b^T u^k, k = 1, . . . , q}

We have seen that Ax ≥ b is consistent if and only if V(b) ≤ 0.
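A tiny sketch of this construction (my illustration, with hypothetical multiplier vectors standing in for the u^k produced by projection):

import numpy as np

def make_tester(U):
    """U: rows are the multiplier vectors u^k with A^T u^k = 0, u^k >= 0 from projection."""
    return lambda b: float(np.max(U @ b))

V = make_tester(np.array([[1.0, 1.0, 0.0],     # hypothetical u^1
                          [0.0, 1.0, 2.0]]))   # hypothetical u^2
print(V(np.array([-1.0, 0.0, -2.0])) <= 0)     # True, so Ax >= b would be consistent for this b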


Optimal Value Function

What is the efficient frontier in finance?


Optimal Value Function

Increasing the right hand side b_k of constraint k in the linear program

min{c^T x | Ax ≥ b}

will result in infeasibility or a new optimal dual solution. If a new optimal dual solution results, then the new dual solution will have a strictly greater value for component k.

Hurting hurts more and more.


Optimal Value Function

If u is an optimal dual solution to min{c^T x | Ax ≥ b} and θ_k is the allowable increase (decrease) on the right hand side b_k, then increasing (decreasing) b_k by more than θ_k will either result in an infeasible primal or in a new optimal dual solution u′ with u′_k > u_k (u′_k < u_k).

Why does this follow from projection?


Optimal Value Function

Proposition 2.48: If the projection of the system z0 − c^T x ≥ 0, Ax ≥ b into the z0 variable space is not R^1 and is not null, then there exists a set of vectors u^1, . . . , u^q such that

z0(b) = min{c^T x | Ax ≥ b} = max{b^T u^k | k = 1, . . . , q}

for all b such that {x | Ax ≥ b} ≠ ∅.

Corollary 2.49: The optimal value function z0(b) = min{c^T x | Ax ≥ b} is a piecewise linear convex function over the domain for which the linear program is feasible.
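A short numeric sketch of Corollary 2.49 (my illustration, with hypothetical u^k): holding the dual vectors fixed and varying one right hand side shows that z0(b) = max_k b^T u^k is a maximum of affine functions of b, hence piecewise linear and convex.

import numpy as np

U = np.array([[1.0, 0.0],     # hypothetical u^1
              [0.5, 0.5],     # hypothetical u^2
              [0.0, 2.0]])    # hypothetical u^3
b2 = 1.0                      # hold the second right hand side fixed

for b1 in np.linspace(-2.0, 4.0, 7):
    z0 = np.max(U @ np.array([b1, b2]))   # max of affine functions of b1
    print(f"b1 = {b1:5.2f}   z0(b) = {z0:5.2f}")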


Key Results

• Projection

• Projection does not create or destroy solutions

• Projection yields dual multipliers

• Farkas' Lemma – Theorems of the Alternative

• Weyl's Theorem

• Solve a linear program with projection

• Weak and Strong Duality

• Complementary Slackness

• Sensitivity Analysis

• Optimal value function
