

Chapter 11
Optimization with Equality Constraints

Harold William Kuhn (1925-2014), Albert William Tucker (1905-1995)

Joseph-Louis (Giuseppe Lodovico), comte de Lagrange (1736-1813)

11.1 General Problem

• Now, a constraint is added to the optimization problem:

$\max_{x,y} u(x, y)$  s.t.  $x\,p_x + y\,p_y = I$, where $p_x, p_y$ are exogenous prices and $I$ is exogenous income.

• Different methods to solve this problem:

– Substitution

– Total differential approach

– Lagrange Multiplier


11.1 Substitution Approach

• Easy to use in simple 2x2 systems. Using the constraint, substitute into objective function and optimize as usual.

• Example: $\max_{x_1, x_2} U(x_1, x_2) = x_1 x_2 + 2 x_1$  s.t.  $60 = 4 x_1 + 2 x_2$.

1) Solve the constraint for $x_2$: $x_2 = 30 - 2 x_1$.

2) Substitute into $U(x_1, x_2)$: $U = x_1 (30 - 2 x_1) + 2 x_1 = 32 x_1 - 2 x_1^2$.

3) F.o.c.: $dU/dx_1 = 32 - 4 x_1 = 0 \;\Rightarrow\; x_1^* = 8$, and then $x_2^* = 30 - 2(8) = 14$.
   Check s.o.c.: $d^2U/dx_1^2 = -4 < 0 \;\Rightarrow\;$ maximum.

4) Calculate maximum value for $U(\cdot)$: $U^* = 8 \cdot 14 + 2 \cdot 8 = 128$.
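These steps can be verified with a computer algebra system. The following is a minimal sketch, assuming Python with sympy (not part of the original slides):

```python
# Substitution approach for U = x1*x2 + 2*x1 s.t. 60 = 4*x1 + 2*x2
import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)
U = x1*x2 + 2*x1                                     # objective
x2_sub = sp.solve(sp.Eq(60, 4*x1 + 2*x2), x2)[0]     # step 1: x2 = 30 - 2*x1
U_sub = sp.expand(U.subs(x2, x2_sub))                # step 2: U = 32*x1 - 2*x1**2
x1_star = sp.solve(sp.diff(U_sub, x1), x1)[0]        # step 3: f.o.c. -> x1* = 8
assert sp.diff(U_sub, x1, 2) < 0                     # s.o.c.: -4 < 0 -> maximum
x2_star = x2_sub.subs(x1, x1_star)                   # x2* = 14
U_star = U_sub.subs(x1, x1_star)                     # step 4: U* = 128
print(x1_star, x2_star, U_star)                      # 8 14 128
```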

11.2 Total Differential Approach

• Total differentiation of the objective function and constraints:

(1)-(2)  $\max\; U = f(x, y)$  s.t.  $B = g(x, y)$

(3)-(4)  $dU = f_x\,dx + f_y\,dy$;   $dB = g_x\,dx + g_y\,dy$

(5)-(6)  At an optimum $dU = 0$; along the constraint $dB = 0$

(7)-(8)  $\left.\dfrac{dy}{dx}\right|_U = -\dfrac{f_x}{f_y}$;   $\left.\dfrac{dy}{dx}\right|_B = -\dfrac{g_x}{g_y}$

(9)-(10)  $-\dfrac{f_x}{f_y} = -\dfrac{g_x}{g_y} \;\Rightarrow\; \dfrac{f_x}{g_x} = \dfrac{f_y}{g_y}$

• Equation (10), along with the restriction (2), forms the basis to solve this optimization problem.


11.2 Total Differential Approach

• Example: $U(x_1, x_2) = x_1 x_2 + 2 x_1$  s.t.  $60 = 4 x_1 + 2 x_2$

Taking first-order differentials of U and the budget constraint (B):

$dU = (x_2 + 2)\,dx_1 + x_1\,dx_2 = 0 \;\Rightarrow\; \dfrac{dx_2}{dx_1} = -\dfrac{x_2 + 2}{x_1}$

$dB = 4\,dx_1 + 2\,dx_2 = 0 \;\Rightarrow\; \dfrac{dx_2}{dx_1} = -2$

Equating the two slopes: $\dfrac{x_2 + 2}{x_1} = 2 \;\Rightarrow\; x_2 = 2 x_1 - 2$.

Substituting into the budget constraint: $60 = 4 x_1 + 2(2 x_1 - 2) \;\Rightarrow\; x_1^* = 8,\; x_2^* = 14$.

$U^*(8, 14) = 8 \cdot 14 + 2 \cdot 8 = 128$.
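The tangency condition obtained from the two differentials can also be imposed directly; a minimal sketch, assuming sympy:

```python
# Total differential approach: equate the slope of the indifference curve
# (from dU = 0) to the slope of the budget line (from dB = 0).
import sympy as sp

x1, x2 = sp.symbols('x1 x2', positive=True)
U = x1*x2 + 2*x1
B = 60 - 4*x1 - 2*x2
tangency = sp.Eq(sp.diff(U, x1)/sp.diff(U, x2), sp.Rational(4, 2))  # (x2+2)/x1 = 4/2
sol = sp.solve([tangency, sp.Eq(B, 0)], [x1, x2], dict=True)[0]
print(sol, U.subs(sol))   # {x1: 8, x2: 14} 128
```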

11.2 Total Differential Approach

• Graph for the utility function and the budget constraint: the optimum is at the tangency of an indifference curve with the budget line.


11.3 Lagrange-multiplier Approach

• To avoid working with (possibly) zero denominators, let λ denote the common value in (10). Rewriting (10) and adding the budget constraint, we are left with a 3x3 system of equations:

(10')  $\lambda = \dfrac{f_x}{g_x}$     (10'')  $\lambda = \dfrac{f_y}{g_y}$     (2)  $B = g(x, y)$

• There is a convenient function that produces (10'), (10'') and (2) as a set of f.o.c.: the Lagrangian function, which includes the objective function and the constraint:

$L = f(x_1, x_2) + \lambda\,[B - g(x_1, x_2)]$

• The constraint is multiplied by a variable, λ, called the Lagrange multiplier (LM).

11.3 Lagrange-multiplier Approach

(1)  $L = f(x_1, x_2) + \lambda\,[B - g(x_1, x_2)]$

F.o.c. (a 3x3 system in $\lambda, x_1, x_2$):

(2)  $L_\lambda = B - g(x_1, x_2) = 0$

(3)  $L_{x_1} = f_1 - \lambda g_1 = 0$

(4)  $L_{x_2} = f_2 - \lambda g_2 = 0$

(5)  $\bar H = \begin{vmatrix} L_{\lambda\lambda} & L_{\lambda x_1} & L_{\lambda x_2} \\ L_{x_1 \lambda} & L_{x_1 x_1} & L_{x_1 x_2} \\ L_{x_2 \lambda} & L_{x_2 x_1} & L_{x_2 x_2} \end{vmatrix}$

• Once we form the Lagrangian function, the Lagrangian becomes the new objective function.


11.3 LM Approach

• Note that $L_{\lambda\lambda} = 0$, $L_{\lambda x_1} = g_{x_1}$, and $L_{\lambda x_2} = g_{x_2}$.

• Then:

(5)  $\bar H = \begin{vmatrix} 0 & g_{x_1} & g_{x_2} \\ g_{x_1} & L_{x_1 x_1} & L_{x_1 x_2} \\ g_{x_2} & L_{x_2 x_1} & L_{x_2 x_2} \end{vmatrix}$

• If the constraints are linear, the Hessian of the Lagrangian can be seen as the Hessian of the original objective function, bordered by the first derivatives of the constraints. This new Hessian is called the bordered Hessian.

11.3 LM Approach: Example

Maximize utility $U = U(x, y)$, where $U_x, U_y > 0$,

subject to the budget constraint: $B - x P_x - y P_y = 0$.

(1)  $L = U(x, y) + \lambda\,(B - x P_x - y P_y)$

(2)  $L_\lambda = B - x P_x - y P_y = 0$

(3)  $L_x = U_x - \lambda P_x = 0$

(4)  $L_y = U_y - \lambda P_y = 0$

S.o.c.: the bordered Hessian

$$\bar H = \begin{vmatrix} L_{\lambda\lambda} & L_{\lambda x} & L_{\lambda y} \\ L_{x\lambda} & L_{xx} & L_{xy} \\ L_{y\lambda} & L_{yx} & L_{yy} \end{vmatrix} = \begin{vmatrix} 0 & P_x & P_y \\ P_x & U_{xx} & U_{xy} \\ P_y & U_{yx} & U_{yy} \end{vmatrix} = 2 P_x P_y U_{xy} - P_y^2 U_{xx} - P_x^2 U_{yy}$$

For a maximum we need $|\bar H| > 0$, i.e., $2 P_x P_y U_{xy} - P_y^2 U_{xx} - P_x^2 U_{yy} > 0$.


11.3 LM Approach: Interpretation

From the f.o.c.:

(1)  $L = U(x, y) + \lambda\,(B - x P_x - y P_y)$

(2)  $\dfrac{\partial L}{\partial \lambda} = B - x P_x - y P_y = 0$

(3)  $\dfrac{\partial L}{\partial x} = U_x - \lambda P_x = 0 \;\Rightarrow\; \lambda^* = \dfrac{U_x}{P_x}$

(4)  $\dfrac{\partial L}{\partial y} = U_y - \lambda P_y = 0 \;\Rightarrow\; \lambda^* = \dfrac{U_y}{P_y}$

$\Rightarrow\; \dfrac{U_x}{P_x} = \dfrac{U_y}{P_y}$  and  $\dfrac{U_x}{U_y} = \dfrac{P_x}{P_y}$

Note: From the f.o.c., at the optimal point:
- λ = marginal utility of good i, scaled by its price.
- Scaled marginal utilities are equal across goods.
- The ratio of marginal utilities (MRS) equals relative prices.


11.3 LM Approach: Second-order conditions

• λ has no effect on the value of L* because the constraint equals zero at the optimum, but ...

• A new set of second-order conditions is needed.

• The constraint changes the criterion for a relative max. or min. Consider the quadratic form

(1)  $q = a x^2 + 2 h x y + b y^2$   s.t.   $\alpha x + \beta y = 0$

(2)  Solve the constraint for $y$: $y = -\dfrac{\alpha}{\beta}\,x$

(3)  Substituting: $q = a x^2 - 2 h \dfrac{\alpha}{\beta}\,x^2 + b \dfrac{\alpha^2}{\beta^2}\,x^2$

(4)  $q = \dfrac{(a \beta^2 - 2 h \alpha \beta + b \alpha^2)\,x^2}{\beta^2}$

(5)  $q > 0$ (positive definite subject to the constraint) iff $a \beta^2 - 2 h \alpha \beta + b \alpha^2 > 0$


11.3 LM Approach: Second-order conditions

(1)  $a \beta^2 - 2 h \alpha \beta + b \alpha^2 > 0$

(2)  $|\bar H| = \begin{vmatrix} 0 & \alpha & \beta \\ \alpha & a & h \\ \beta & h & b \end{vmatrix} = 2 h \alpha \beta - a \beta^2 - b \alpha^2 = -\left(a \beta^2 - 2 h \alpha \beta + b \alpha^2\right)$

(3)  $q$ is positive definite s.t. $\alpha x + \beta y = 0$ iff $|\bar H| < 0$ (min);  $q$ is negative definite s.t. the constraint iff $|\bar H| > 0$ (max).
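The identity used in step (2) can be verified symbolically; a minimal sketch, assuming sympy (not part of the original slides):

```python
# Verify that a*beta**2 - 2*h*alpha*beta + b*alpha**2 equals -|H_bar|
import sympy as sp

a, b, h, alpha, beta = sp.symbols('a b h alpha beta')
H_bar = sp.Matrix([[0, alpha, beta],
                   [alpha, a, h],
                   [beta, h, b]])
expr = a*beta**2 - 2*h*alpha*beta + b*alpha**2
print(sp.simplify(H_bar.det() + expr))   # 0, i.e. expr = -|H_bar|
```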

11.3 LM Approach: Example

1)-2)  $U = x_1 x_2 + 2 x_1$,  s.t.  $B$: $60 - 4 x_1 - 2 x_2 = 0$

3)  Form the Lagrangian function:  $L = x_1 x_2 + 2 x_1 + \lambda\,(60 - 4 x_1 - 2 x_2)$

FOC:

4)  $L_\lambda = 60 - 4 x_1 - 2 x_2 = 0$

5)-6)  $L_{x_1} = x_2 + 2 - 4\lambda = 0 \;\Rightarrow\; \lambda = \dfrac{x_2 + 2}{4}$

7)-8)  $L_{x_2} = x_1 - 2\lambda = 0 \;\Rightarrow\; \lambda = \dfrac{x_1}{2}$

9)-10)  $\dfrac{x_2 + 2}{4} = \dfrac{x_1}{2} \;\Rightarrow\; x_2 + 2 = 2 x_1$

11)-12)  $60 = 4 x_1 + 2(2 x_1 - 2) \;\Rightarrow\; x_1^* = 8$

13)-14)  $x_2^* = 2(8) - 2 = 14$

15)-17)  $\lambda^* = \dfrac{x_1^*}{2} = 4$;  $U^* = 8 \cdot 14 + 2 \cdot 8 = 128$
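A minimal sketch, assuming Python with sympy, that reproduces steps 3)-17) and the bordered-Hessian check:

```python
# Lagrange-multiplier approach: solve the f.o.c. and check the s.o.c.
import sympy as sp

x1, x2, lam = sp.symbols('x1 x2 lambda', real=True)
L = x1*x2 + 2*x1 + lam*(60 - 4*x1 - 2*x2)
foc = [sp.diff(L, v) for v in (lam, x1, x2)]
sol = sp.solve(foc, [lam, x1, x2], dict=True)[0]
print(sol)                                     # {lambda: 4, x1: 8, x2: 14}

H_bar = sp.Matrix([[0, 4, 2], [4, 0, 1], [2, 1, 0]])   # bordered Hessian
print(H_bar.det())                             # 16 > 0 -> maximum
```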


11.3 LM Approach: Example - Cramer's rule

1)  $U = x_1 x_2 + 2 x_1$   (utility function)

2)  $B$: $60 - 4 x_1 - 2 x_2 = 0$   (budget constraint)

3)  $L = x_1 x_2 + 2 x_1 + \lambda\,(60 - 4 x_1 - 2 x_2)$   (Lagrangian function)

First-order conditions:

4)  $L_\lambda = 60 - 4 x_1 - 2 x_2 = 0$

5)  $L_{x_1} = x_2 + 2 - 4\lambda = 0$

6)  $L_{x_2} = x_1 - 2\lambda = 0$

7)  In matrix form:

$$\begin{bmatrix} 0 & -4 & -2 \\ -4 & 0 & 1 \\ -2 & 1 & 0 \end{bmatrix} \begin{bmatrix} \lambda \\ x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} -60 \\ -2 \\ 0 \end{bmatrix}$$

8)  $|\bar H| = \begin{vmatrix} 0 & 4 & 2 \\ 4 & 0 & 1 \\ 2 & 1 & 0 \end{vmatrix} = 16 > 0$: $d^2L$ is negative definite subject to the constraint, so $L$ is a maximum.

11.3 LM Approach: Example - Cramer's rule

9)  $|J| = \begin{vmatrix} 0 & -4 & -2 \\ -4 & 0 & 1 \\ -2 & 1 & 0 \end{vmatrix} = 8 + 8 = 16$

10)  $|J_\lambda| = \begin{vmatrix} -60 & -4 & -2 \\ -2 & 0 & 1 \\ 0 & 1 & 0 \end{vmatrix} = 64$

11)  $|J_{x_1}| = \begin{vmatrix} 0 & -60 & -2 \\ -4 & -2 & 1 \\ -2 & 0 & 0 \end{vmatrix} = 128$

12)  $|J_{x_2}| = \begin{vmatrix} 0 & -4 & -60 \\ -4 & 0 & -2 \\ -2 & 1 & 0 \end{vmatrix} = 240 - 16 = 224$

13)-15)  $\lambda^* = \dfrac{|J_\lambda|}{|J|} = \dfrac{64}{16} = 4$;   $x_1^* = \dfrac{|J_{x_1}|}{|J|} = \dfrac{128}{16} = 8$;   $x_2^* = \dfrac{|J_{x_2}|}{|J|} = \dfrac{224}{16} = 14$

16)  $U^* = 8 \cdot 14 + 2 \cdot 8 = 128$
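The determinants above can be cross-checked numerically; a minimal sketch assuming numpy, with the sign conventions of the system in step 7):

```python
# Cramer's rule for the f.o.c. system of the Lagrangian example
import numpy as np

J = np.array([[0., -4., -2.],    # L_lambda: -4*x1 - 2*x2 = -60
              [-4., 0., 1.],     # L_x1:     -4*lam + x2  = -2
              [-2., 1., 0.]])    # L_x2:     -2*lam + x1  =  0
c = np.array([-60., -2., 0.])

detJ = np.linalg.det(J)                      # 16
sol = []
for k in range(3):
    Jk = J.copy()
    Jk[:, k] = c                             # replace the k-th column by the constants
    sol.append(np.linalg.det(Jk) / detJ)     # Cramer's rule
print(np.round(detJ), np.round(sol, 2))      # 16.0 [ 4.  8. 14.]
```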


11.3 LM Approach: Restricted Least Squares

• The Lagrangian approach: minimize, with respect to $b^*$ and $\lambda$,

$L(b^*, \lambda \mid x, y) = \sum_{t=1}^{T} (y_t - x_t' b^*)^2 + 2\lambda\,(r b^* - q)$

f.o.c.:

$\dfrac{\partial L}{\partial b^*} = -2 \sum_{t=1}^{T} x_t (y_t - x_t' b^*) + 2\lambda r' = 0$

$\dfrac{\partial L}{\partial \lambda} = 2\,(r b^* - q) = 0$

• Then, from the 1st equation:

$(x'x)\,b^* = x'y - \lambda r' \;\Rightarrow\; b^* = (x'x)^{-1} x'y - \lambda (x'x)^{-1} r' = b - \lambda (x'x)^{-1} r'$

where $b = (x'x)^{-1} x'y$ is the unrestricted OLS estimator.

11.3 LM Approach: Restricted Least Squares

• $b^* = b - \lambda (x'x)^{-1} r'$

• Premultiply both sides by $r$ and then subtract $q$:

$r b^* - q = r b - \lambda\, r (x'x)^{-1} r' - q$

$0 = -\lambda\, r (x'x)^{-1} r' + (r b - q)$

Solving for $\lambda$  ⟹  $\lambda = [\,r (x'x)^{-1} r'\,]^{-1} (r b - q)$

Substituting in $b^*$  ⟹  $b^* = b - (x'x)^{-1} r' [\,r (x'x)^{-1} r'\,]^{-1} (r b - q)$
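A minimal numerical sketch of the final formula, assuming numpy; the data, the restriction vector r, and the value q below are made up for illustration:

```python
# Restricted least squares: b* = b - (X'X)^{-1} r' [r (X'X)^{-1} r']^{-1} (r b - q)
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                          # made-up regressors
y = X @ np.array([1.0, 2.0, 3.0]) + rng.normal(size=100)
r = np.array([[1.0, 1.0, 1.0]])                        # restriction: b1 + b2 + b3 = q
q = np.array([6.0])

XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ y                                  # unrestricted OLS
lam = np.linalg.solve(r @ XtX_inv @ r.T, r @ b - q)    # Lagrange multiplier
b_star = b - XtX_inv @ r.T @ lam                       # restricted estimator
print(b_star, r @ b_star)                              # r @ b_star == [6.]
```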


11.3 LM Approach: N-variable Case

a)  No one-variable test, because there must be one more variable than constraints.

b)  Two-variable test of s.o.c.:
$|\bar H| > 0$: negative definite (max);   $|\bar H| < 0$: positive definite (min).

c)  Three-variable test of s.o.c.:
$|\bar H_2| > 0,\; |\bar H_3| < 0$: negative definite (max);   $|\bar H_2| < 0,\; |\bar H_3| < 0$: positive definite (min).

d)  n-variable case s.o.c. (p. 361):
$|\bar H_2| > 0,\; |\bar H_3| < 0,\; \ldots,\; (-1)^n |\bar H_n| > 0$: negative definite (max);
$|\bar H_2| < 0,\; |\bar H_3| < 0,\; \ldots,\; |\bar H_n| < 0$: positive definite (min).

Where

$\bar H_2 = \begin{vmatrix} 0 & g_1 & g_2 \\ g_1 & L_{11} & L_{12} \\ g_2 & L_{21} & L_{22} \end{vmatrix}$,   $\bar H_3 = \begin{vmatrix} 0 & g_1 & g_2 & g_3 \\ g_1 & L_{11} & L_{12} & L_{13} \\ g_2 & L_{21} & L_{22} & L_{23} \\ g_3 & L_{31} & L_{32} & L_{33} \end{vmatrix}$,  etc.
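A small helper, assuming numpy (the function name bordered_minors is ours, not from the slides), that builds the bordered Hessian for a single constraint and returns the leading principal minors |H̄2|, ..., |H̄n|:

```python
# Bordered leading principal minors for one equality constraint
import numpy as np

def bordered_minors(H, g):
    """H: n x n Hessian of L in the choice variables; g: gradient of the constraint."""
    n = len(g)
    H_bar = np.block([[np.zeros((1, 1)), g.reshape(1, n)],
                      [g.reshape(n, 1), H]])
    # |H_bar_k| uses the border plus the first k variables, k = 2, ..., n
    return [np.linalg.det(H_bar[:k + 1, :k + 1]) for k in range(2, n + 1)]

# two-variable example from this chapter: U = x1*x2 + 2*x1, 60 = 4*x1 + 2*x2
print(bordered_minors(np.array([[0., 1.], [1., 0.]]), np.array([4., 2.])))  # [16.0] > 0 -> max
```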

11.4 Optimality Conditions – Unconstrained Case

• Let x* be the point that we think is the minimum for f(x).

• Necessary condition (for optimality):

df(x*) = 0

• A point that satisfies the necessary condition is a stationary point

– It can be a minimum, maximum, or saddle point

• Q: How do we know that we have a minimum?

• Answer: Sufficiency Condition:

The sufficient conditions for x* to be a strict local minimum are:

df(x*) = 0

d²f(x*) is positive definite
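Both conditions can be checked symbolically for the objective f(x, y) = 4(x - 1)² + (y - 2)² used in the examples that follow; a minimal sketch assuming sympy:

```python
# Unconstrained necessary and sufficient conditions
import sympy as sp

x, y = sp.symbols('x y', real=True)
f = 4*(x - 1)**2 + (y - 2)**2
grad = [sp.diff(f, v) for v in (x, y)]
x_star = sp.solve(grad, [x, y], dict=True)[0]     # stationary point {x: 1, y: 2}
H = sp.hessian(f, (x, y))                         # [[8, 0], [0, 2]]
print(x_star, H.is_positive_definite)             # positive definite -> strict local min
```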


11.4 Constrained Case – KKT Conditions

• To prove a claim of optimality in constrained minimization (or maximization), we have to check the found point (x*) with respect to the Karush-Kuhn-Tucker (KKT) conditions.

• Kuhn and Tucker extended the Lagrangian theory to include the general classical single-objective nonlinear programming problem:

minimize f(x)

Subject to gj(x) ≤ 0 for j = 1, 2, ..., M

hk(x) = 0 for k = 1, 2, ..., K

x = (x1, x2, ..., xN)

Note: M inequality constraints, K equality constraints.

11.4 Interior versus Exterior Solutions

• Interior: If no constraints are active and (thus) the solution lies at the interior of the feasible space, then the necessary condition for optimality is same as for unconstrained case:

∇f(x*) = 0   (∇, "del", is the gradient operator)

Example: Minimize

f(x) = 4(x – 1)2 + (y – 2)2

with constraints:

x ≥ -1 & y ≥ -1.

• Exterior: If the solution lies at the exterior, the condition ∇f(x*) = 0 does not apply because some constraints will block movement to this minimum.


11.4 Interior versus Exterior Solutions

• If the solution lies at the exterior, the condition ∇f(x*) = 0 does not apply. Some constraints are active.

Example: Minimize

f(x) = 4(x – 1)2 + (y – 2)2

with constraints:

x + y ≤ 2; x ≥ - 1 & y ≥ - 1.

– We cannot get any more improvement if for x* there does not exist a vector d that is both a descent direction and a feasible direction.

– In other words: the possible feasible directions do not intersect the possible descent directions at all.

11.4 Mathematical Form

• A vector d that is both descending and feasible cannot exist if

-∇f = Σi λi ∇gi (with λi ≥ 0) for all active constraints i ∈ I.

– This can be rewritten as 0 = ∇f + Σi λi ∇gi.

– This condition is correct IF feasibility is defined as g(x) ≤ 0.

– If feasibility is defined as g(x) ≥ 0, then this becomes

-∇f = Σi λi (-∇gi)

• Again, this only applies for the I active constraints.

• Usually the inactive constraints are included as well, but the condition λj gj = 0 (with λj ≥ 0) is added for all inactive constraints j ∈ J.

– This is referred to as the complementary slackness condition.



11.4 Mathematical Form

• Note that the slackness condition is equivalent to stating that λj = 0 for inactive constraints, i.e., zero price for non-binding constraints!

• That is, each inequality constraint is either active, in which case it turns into an equality constraint of the Lagrange type, or inactive, in which case it is void and does not constrain the solution.

• Note that I + J = M, the total number of (inequality) constraints.

• Analysis of the constraints can help to rule out some combinations. However, in general, a 'brute force' approach to a problem with J inequality constraints must be divided into 2^J cases. Each case must be solved independently for a minimum, and the obtained solution (if any) must be checked to comply with the constraints. A lot of work!


11.4 Necessary KKT Conditions

For the problem:
Min f(x)  s.t.  g(x) ≤ 0   (n variables, M constraints)

The necessary conditions are:

∇f(x) + Σi λi ∇gi(x) = 0   (optimality)
gi(x) ≤ 0 for i = 1, 2, ..., M   (primal feasibility)
λi gi(x) = 0 for i = 1, 2, ..., M   (complementary slackness)
λi ≥ 0 for i = 1, 2, ..., M   (non-negativity, dual feasibility)

Note that the first condition gives n equations.
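A minimal sketch, assuming numpy (kkt_check is our own helper, not a library routine), that evaluates these four conditions at a candidate point; the candidate (0.8, 1.2) with λ = (1.6, 0, 0) anticipates the example solved below:

```python
# Check the necessary KKT conditions for Min f s.t. g_i(x) <= 0
import numpy as np

def kkt_check(grad_f, grads_g, g_vals, lam, tol=1e-8):
    stationarity = grad_f + grads_g.T @ lam          # grad f + sum_i lam_i * grad g_i
    return {
        "stationarity": np.allclose(stationarity, 0.0, atol=tol),
        "primal feasibility": np.all(g_vals <= tol),
        "complementary slackness": np.allclose(lam * g_vals, 0.0, atol=tol),
        "dual feasibility": np.all(lam >= -tol),
    }

# f = 4(x-1)^2 + (y-2)^2, g = (x + y - 2, -x - 1, -y - 1), candidate (0.8, 1.2)
x, y = 0.8, 1.2
print(kkt_check(np.array([8*(x - 1), 2*(y - 2)]),
                np.array([[1., 1.], [-1., 0.], [0., -1.]]),
                np.array([x + y - 2, -x - 1, -y - 1]),
                np.array([1.6, 0., 0.])))            # all True
```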


11.4 Necessary KKT Conditions - Example

Example: Let’s minimize

f(x) = 4(x – 1)2 + (y – 2)2

with constraints:

x + y ≤ 2; x ≥ - 1 & y ≥ - 1.

Form the Lagrangian:

There are 3 inequality constraints, each of which can be active or non-active: 8 possible combinations. But the 3 constraints together (x + y = 2 & x = -1 & y = -1) have no solution, and a combination of any two of them yields a single intersection point.

The general case is:

$L(x, \lambda, \beta) = f(x) + \sum_{k=1}^{m} \beta_k h_k(x) + \sum_{j=1}^{p} \lambda_j g_j(x)$

For this example:

$L(x, y, \lambda) = 4(x - 1)^2 + (y - 2)^2 + \lambda_1 (x + y - 2) + \lambda_2 (x + 1) + \lambda_3 (y + 1)$

We must consider all the combinations of active / non-active constraints:

(1)  only $x + y = 2$ active:  $L(x, y, \lambda_1) = 4(x - 1)^2 + (y - 2)^2 + \lambda_1 (x + y - 2)$

(2)  only $x = -1$ active:  $L(x, y, \lambda_2) = 4(x - 1)^2 + (y - 2)^2 + \lambda_2 (x + 1)$

(3)  only $y = -1$ active:  $L(x, y, \lambda_3) = 4(x - 1)^2 + (y - 2)^2 + \lambda_3 (y + 1)$

(4)  $x + y = 2$ and $x = -1$:  $(x, y) = (-1, 3)$

(5)  $x + y = 2$ and $y = -1$:  $(x, y) = (3, -1)$

(6)  $x = -1$ and $y = -1$:  $(x, y) = (-1, -1)$

(7)  $x + y = 2$ and $x = -1$ and $y = -1$:  no solution

(8)  Unconstrained:  $L(x, y) = 4(x - 1)^2 + (y - 2)^2$


11.4 Necessary KKT Conditions - Example

(1)  $L(x, y, \lambda_1) = 4(x - 1)^2 + (y - 2)^2 + \lambda_1 (x + y - 2)$

$L_x = 8(x - 1) + \lambda_1 = 0$;   $L_y = 2(y - 2) + \lambda_1 = 0$;   $L_{\lambda_1} = x + y - 2 = 0$

$\Rightarrow\; x = 0.8,\; y = 1.2,\; \lambda_1 = 1.6 \ge 0$;   $f(x, y) = 4(0.8 - 1)^2 + (1.2 - 2)^2 = 0.8$

(2)  $L(x, y, \lambda_2) = 4(x - 1)^2 + (y - 2)^2 + \lambda_2 (x + 1)$

$L_x = 8(x - 1) + \lambda_2 = 0$;   $L_y = 2(y - 2) = 0$;   $L_{\lambda_2} = x + 1 = 0$

$\Rightarrow\; x = -1,\; y = 2,\; \lambda_2 = 16 \ge 0$;   $f(x, y) = 4(-1 - 1)^2 + (2 - 2)^2 = 16$

(3)  $L(x, y, \lambda_3) = 4(x - 1)^2 + (y - 2)^2 + \lambda_3 (y + 1)$

$L_x = 8(x - 1) = 0$;   $L_y = 2(y - 2) + \lambda_3 = 0$;   $L_{\lambda_3} = y + 1 = 0$

$\Rightarrow\; (x, y) = (1, -1),\; \lambda_3 = 6 \ge 0$;   $f(x, y) = 9$

(4)  $(x, y) = (-1, 3)$;   $f(x, y) = 17$

(5)  $(x, y) = (3, -1)$;   $f(x, y) = 25$

(6)  $(x, y) = (-1, -1)$;   $f(x, y) = 25$

(7)  over-constrained: no solution

(8)  unconstrained: $(x, y) = (1, 2)$, $f(x, y) = 0$, but $x + y = 3 > 2$, which is beyond the feasible range.


Finally, we compare among the 8 cases we have studied: case (7) was over-constrained and had no solution, and case (8) violated the constraint x + y ≤ 2. Among the cases (1)-(6), case (1) yields the lowest value of f(x, y): $(x, y) = (0.8, 1.2)$, $f(x, y) = 0.8$.
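This conclusion can be cross-checked numerically; a minimal sketch assuming scipy (the solver choice SLSQP is ours, not from the slides):

```python
# Numerical cross-check of the constrained minimum (0.8, 1.2)
import numpy as np
from scipy.optimize import minimize

f = lambda z: 4*(z[0] - 1)**2 + (z[1] - 2)**2
cons = [{'type': 'ineq', 'fun': lambda z: 2 - z[0] - z[1]},   # x + y <= 2
        {'type': 'ineq', 'fun': lambda z: z[0] + 1},          # x >= -1
        {'type': 'ineq', 'fun': lambda z: z[1] + 1}]          # y >= -1
res = minimize(f, x0=np.zeros(2), method='SLSQP', constraints=cons)
print(res.x, res.fun)   # approximately [0.8, 1.2] and 0.8
```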

11.4 Necessary KKT Conditions: General Case

• For the general case (n variables, M inequalities, L equalities):

Min f(x)  s.t.

gi(x) ≤ 0 for i = 1, 2, ..., M

hj(x) = 0 for j = 1, 2, ..., L

• In all this, the assumption is that the gradients ∇gi(x*), for i belonging to the active constraints, and ∇hj(x*), for j = 1, ..., L, are linearly independent. This is referred to as the constraint qualification.

• The necessary conditions are:

∇f(x) + Σi λi ∇gi(x) + Σj βj ∇hj(x) = 0   (optimality)

gi(x) ≤ 0 for i = 1, 2, ..., M   (primal feasibility)

hj(x) = 0 for j = 1, 2, ..., L   (primal feasibility)

λi gi(x) = 0 for i = 1, 2, ..., M   (complementary slackness)

λi ≥ 0 for i = 1, 2, ..., M   (non-negativity, dual feasibility)

(Note: βj is unrestricted in sign.)


11.4 Necessary KKT Conditions (if g(x) ≥ 0)

• If the definition of feasibility changes, the optimality and feasibility conditions change.

• The necessary conditions become:

∇f(x) - Σi λi ∇gi(x) + Σj βj ∇hj(x) = 0   (optimality)

gi(x) ≥ 0 for i = 1, 2, ..., M   (feasibility)

hj(x) = 0 for j = 1, 2, ..., L   (feasibility)

λi gi(x) = 0 for i = 1, 2, ..., M   (complementary slackness)

λi ≥ 0 for i = 1, 2, ..., M   (non-negativity, dual feasibility)


11.4 Restating the Optimization Problem

• Kuhn-Tucker optimization problem: Find vectors x (N×1), λ (1×M) and β (1×K) that satisfy:

∇f(x) + Σi λi ∇gi(x) + Σk βk ∇hk(x) = 0   (optimality)

gi(x) ≤ 0 for i = 1, 2, ..., M   (feasibility)

hk(x) = 0 for k = 1, 2, ..., K   (feasibility)

λi gi(x) = 0 for i = 1, 2, ..., M   (complementary slackness condition)

λi ≥ 0 for i = 1, 2, ..., M   (non-negativity)

• If x* is an optimal solution to the NLP problem, then there exists a (λ*, β*) such that (x*, λ*, β*) solves the Kuhn–Tucker problem.

• The above equations not only give the necessary conditions for optimality, but also provide a way of finding the optimal point.



11.4 KKT Conditions: Limitations

• Necessity theorem helps identify points that are not optimal. A point is not optimal if it does not satisfy the Kuhn–Tucker conditions.

• On the other hand, not all points that satisfy the Kuhn-Tucker conditions are optimal points.

• The Kuhn–Tucker sufficiency theorem gives conditions under which a point becomes an optimal solution to a single-objective NLP.


11.4 KKT Conditions: Sufficiency Condition

• Sufficient conditions for a point x* to be a strict local minimum of the classical single-objective NLP problem, where f, gj, and hk are twice differentiable functions, are that:

1) The necessary KKT conditions are met.

2) The Hessian matrix ∇²L(x*) = ∇²f(x*) + Σj λj ∇²gj(x*) + Σk βk ∇²hk(x*) is positive definite on a subspace of R^n, as defined by the condition:

y^T ∇²L(x*) y > 0 for every vector y (1×N) satisfying:

∇gj(x*) y = 0 for j belonging to I1 = { j | gj(x*) = 0, λj* > 0 } (active constraints)

∇hk(x*) y = 0 for k = 1, ..., K

y ≠ 0


11.4 KKT Sufficiency Theorem (Special Case)

• Consider the classical single objective NLP problem.

minimize f(x)

Subject to gj(x) ≤ 0 for j = 1, 2, ..., J

hk(x) = 0 for k = 1, 2, ..., K

x = (x1, x2, ..., xN)

• Let the objective function f(x) be convex, the inequality constraints gj(x) be all convex functions for j = 1, ..., J, and the equality constraints hk(x) for k = 1, ..., K be linear.

• If this is true, then the necessary KKT conditions are also sufficient.

• Therefore, in this case, if there exists a solution x* that satisfies the KKT necessary conditions, then x* is an optimal solution to the NLP problem.

• In fact, it is a global optimum.
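For instance, the objective f(x, y) = 4(x - 1)² + (y - 2)² of the earlier example is convex and its constraints are linear, so the KKT point (0.8, 1.2) found there is a global minimum. A minimal check, assuming sympy:

```python
# Convexity check: Hessian of f is positive definite everywhere
import sympy as sp

x, y = sp.symbols('x y', real=True)
f = 4*(x - 1)**2 + (y - 2)**2
H = sp.hessian(f, (x, y))
print(H)                          # Matrix([[8, 0], [0, 2]])
print(H.is_positive_definite)     # True -> f is (strictly) convex
```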

11.4 KKT Conditions: Closing Remarks

• Kuhn-Tucker conditions are an extension of the Lagrangian function and method.

• They provide a powerful means to verify solutions.

• But there are limitations…

– Sufficiency conditions are difficult to verify.

– Practical problems often do not have the required nice properties.

– For example, you will have problems if you do not know the explicit constraint equations.

• If you have a multi-objective (lexicographic) formulation, then it is suggested to test each priority level separately.


11.4 KKT Conditions: Example

Minimize  $C = (x_1 - 4)^2 + (x_2 - 4)^2$
s.t.  $3 x_1 + 2 x_2 \ge 6$;   $12 - 2 x_1 - 3 x_2 \ge 0$;   $x_1, x_2 \ge 0$.

Form the Lagrangian:
$L = (x_1 - 4)^2 + (x_2 - 4)^2 + \lambda_1 (6 - 3 x_1 - 2 x_2) + \lambda_2 (2 x_1 + 3 x_2 - 12)$

F.o.c.:
1)  $L_{x_1} = 2(x_1 - 4) - 3 \lambda_1 + 2 \lambda_2 = 0$
2)  $L_{x_2} = 2(x_2 - 4) - 2 \lambda_1 + 3 \lambda_2 = 0$
3)  $\lambda_1 (6 - 3 x_1 - 2 x_2) = 0$,  $6 - 3 x_1 - 2 x_2 \le 0$
4)  $\lambda_2 (2 x_1 + 3 x_2 - 12) = 0$,  $2 x_1 + 3 x_2 - 12 \le 0$,  with $\lambda_1, \lambda_2 \ge 0$

Case 1: Let $\lambda_2 = 0$ (2nd constraint inactive), 1st constraint active:
From 1) and 2):  $2(x_1 - 4) = 3 \lambda_1$,  $2(x_2 - 4) = 2 \lambda_1$;  from 3):  $3 x_1 + 2 x_2 = 6$
$\Rightarrow\; x_1^* = 10/13,\; x_2^* = 24/13,\; \lambda_1^* = -28/13 < 0$   (violates the KKT conditions)

Case 2: Let $\lambda_1 = 0$ (1st constraint inactive), 2nd constraint active:
From 1) and 2):  $2(x_1 - 4) = -2 \lambda_2$,  $2(x_2 - 4) = -3 \lambda_2$;  from 4):  $2 x_1 + 3 x_2 = 12$
$\Rightarrow\; x_1^* = 36/13,\; x_2^* = 84/39 = 28/13,\; \lambda_2^* = 16/13 \ge 0$   (meets the KKT conditions)

11.4 KKT Conditions: Using Cramer's rule

Case 1: Assume $L_{x_1} = L_{x_2} = 0$ and $L_{\lambda_1} = 0$, with $\lambda_2 = 0$; therefore $x_1, x_2 > 0$ and the 1st constraint is active.

1)  $6 - 3 x_1 - 2 x_2 = 0$
2)  $2 x_2 - 8 - 2 \lambda_1 = 0$
3)  $2 x_1 - 8 - 3 \lambda_1 = 0$

Use eqs. (1), (2), and (3) to solve for $\lambda_1$, $x_1$, and $x_2$:

$$\begin{bmatrix} 0 & -3 & -2 \\ -2 & 0 & 2 \\ -3 & 2 & 0 \end{bmatrix} \begin{bmatrix} \lambda_1 \\ x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} -6 \\ 8 \\ 8 \end{bmatrix}$$

$|J| = 26$,   $|J_{\lambda_1}| = -56$,   $|J_{x_1}| = 20$,   $|J_{x_2}| = 48$

$\lambda_1^* = \dfrac{-56}{26} = -\dfrac{28}{13}$,   $x_1^* = \dfrac{20}{26} = \dfrac{10}{13}$,   $x_2^* = \dfrac{48}{26} = \dfrac{24}{13}$

Since $\lambda_1^* < 0$, $(x_1^*, x_2^*, \lambda_1^*, \lambda_2^*)$ do not meet the KKT conditions for a minimum.


11.4 KKT Conditions: Example

Case 2: Assume $L_{x_1} = L_{x_2} = 0$ and $L_{\lambda_2} = 0$, with $\lambda_1 = 0$; therefore $x_1, x_2 > 0$ and the 2nd constraint is active.

(1)  $12 - 2 x_1 - 3 x_2 = 0$
(2)  $2 x_1 - 8 + 2 \lambda_2 = 0$
(3)  $2 x_2 - 8 + 3 \lambda_2 = 0$

Use eqs. (1), (2), and (3) to solve for $\lambda_2$, $x_1$, and $x_2$:

$$\begin{bmatrix} 0 & -2 & -3 \\ 2 & 2 & 0 \\ 3 & 0 & 2 \end{bmatrix} \begin{bmatrix} \lambda_2 \\ x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} -12 \\ 8 \\ 8 \end{bmatrix}$$

$|J| = 26$,   $|J_{\lambda_2}| = 32$,   $|J_{x_1}| = 72$,   $|J_{x_2}| = 56$

$\lambda_2^* = \dfrac{32}{26} = \dfrac{16}{13}$,   $x_1^* = \dfrac{72}{26} = \dfrac{36}{13}$,   $x_2^* = \dfrac{56}{26} = \dfrac{28}{13}$

Since $\lambda_2^* > 0$ and both constraints are satisfied, $(x_1^*, x_2^*, \lambda_1^*, \lambda_2^*)$ meet the KKT conditions for a minimum.
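Case 2 can be cross-checked numerically; a minimal sketch assuming scipy, with the problem as stated above:

```python
# Numerical cross-check of the constrained minimum (36/13, 28/13)
import numpy as np
from scipy.optimize import minimize

C = lambda z: (z[0] - 4)**2 + (z[1] - 4)**2
cons = [{'type': 'ineq', 'fun': lambda z: 3*z[0] + 2*z[1] - 6},    # 3*x1 + 2*x2 >= 6
        {'type': 'ineq', 'fun': lambda z: 12 - 2*z[0] - 3*z[1]}]   # 2*x1 + 3*x2 <= 12
res = minimize(C, x0=np.array([3.0, 2.0]), method='SLSQP',
               constraints=cons, bounds=[(0, None), (0, None)])    # x1, x2 >= 0
print(res.x, 36/13, 28/13)   # approximately [2.769, 2.154]
```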

11.5 The Constraint Qualification

Example 1 - Irregularities at boundary points:

Maximize  $\pi = x_1$
subject to  $x_2 - (1 - x_1)^3 \le 0$  and  $x_1, x_2 \ge 0$.

The optimal solution is the boundary point $(x_1^*, x_2^*) = (1, 0)$. There, the KKT derivative $\dfrac{\partial L}{\partial x_1} = 1 - 3\lambda (1 - x_1)^2$ equals $1$ at a max; however, it should equal zero (since $x_1^* = 1 > 0$).

Reason: the feasible region has an inflection point or cusp at the boundary point, so the constraint qualification fails and the KKT conditions need not hold at the optimum.


11.5 The Constraint Qualification

Example 2 - Irregularities at the boundary points:

Maximize  $\pi = x_1$
subject to  $x_2 - (1 - x_1)^3 \le 0$,  $2 x_1 + x_2 \le 2$  (i.e., $x_2 \le 2 - 2 x_1$),  and  $x_1, x_2 \ge 0$.

The optimum is still the cusp point $(x_1^*, x_2^*) = (1, 0)$, but now the KKT conditions can be satisfied there:

$\dfrac{\partial L}{\partial x_1} = 1 - 3 \lambda_1 (1 - x_1)^2 - 2 \lambda_2 = 1 - 2 \lambda_2 = 0 \;\Rightarrow\; \lambda_2 = \dfrac{1}{2}$

$\dfrac{\partial L}{\partial x_2} = -\lambda_1 - \lambda_2 \le 0$

With $(x_1, x_2, \lambda_1, \lambda_2) = (1, 0, 0, 1/2)$ all the KKT conditions hold, even though the cusp is still present.

11.5 The Constraint Qualification

Example 3 - The feasible region of the problem contains no cusp:

Maximize $\pi$ subject to the stated constraints, where $x_1, x_2, \lambda \ge 0$. At the optimum, the relevant KKT derivative is again nonzero when it should equal zero: the constraint qualification can fail even though the feasible region contains no cusp.
