Karush-Kuhn-Tucker Slides

Upload: matchman6

Post on 03-Jun-2018


  • 8/12/2019 Karush Kuhn Tucker Slides

    1/45

    Lagrange Multipliers and the Karush-Kuhn-Tucker Conditions

    March 20, 2012

  • Slide 2/45

    Optimization

    Goal: We want to find the maximum or minimum of a function subject to some constraints.

    Formal statement of the problem: Given functions f, g_1, …, g_m and h_1, …, h_l defined on some domain Ω ⊆ ℝⁿ, the optimization problem has the form

    min_x f(x)  subject to  g_i(x) ≤ 0 ∀i  and  h_j(x) = 0 ∀j

  • Slide 3/45

  • Slide 4/45

    Unconstrained Optimization

  • Slide 5/45

    Unconstrained Minimization

    Assume: Let f : ℝⁿ → ℝ be a twice continuously differentiable function.

    Conditions for a local minimum: if x* is a local minimum of f(x) then

    1. f has zero gradient at x*:

       ∇_x f(x*) = 0

    2. and the Hessian of f at x* is positive semi-definite:

       vᵀ (∇² f(x*)) v ≥ 0,  ∀v ∈ ℝⁿ

    where

    ∇² f(x) = [ ∂²f(x)/∂x₁²      ⋯  ∂²f(x)/∂x₁∂x_n
                ⋮                 ⋱  ⋮
                ∂²f(x)/∂x_n∂x₁   ⋯  ∂²f(x)/∂x_n²  ]

    (Zero gradient together with a positive definite Hessian is sufficient for a strict local minimum; positive semi-definiteness alone is only necessary.)
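    The two conditions above can be checked numerically with finite differences. The sketch below is illustrative (the test function `f`, the point `x_star`, and the helper names are my own, not from the slides): it verifies a vanishing gradient and a positive semi-definite Hessian for a simple quadratic whose minimizer is known.

```python
import numpy as np

def grad_fd(f, x, eps=1e-6):
    """Central-difference estimate of the gradient of f at x."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2 * eps)
    return g

def hess_fd(f, x, eps=1e-4):
    """Central-difference estimate of the Hessian of f at x."""
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = eps
            ej = np.zeros(n); ej[j] = eps
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4 * eps * eps)
    return H

# Example function: f(x) = (x1 - 1)^2 + 2*x2^2, minimized at (1, 0)
f = lambda x: (x[0] - 1.0) ** 2 + 2.0 * x[1] ** 2
x_star = np.array([1.0, 0.0])

grad_is_zero = bool(np.allclose(grad_fd(f, x_star), 0.0, atol=1e-6))
hessian_psd = bool(np.linalg.eigvalsh(hess_fd(f, x_star)).min() >= -1e-6)
```

    For this quadratic the Hessian is diag(2, 4), so both eigenvalues are positive and the semi-definiteness test passes with room to spare.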

  • Slide 6/45

    Unconstrained Maximization

    Assume: Let f : ℝⁿ → ℝ be a twice continuously differentiable function.

    Conditions for a local maximum: if x* is a local maximum of f(x) then

    1. f has zero gradient at x*:

       ∇_x f(x*) = 0

    2. and the Hessian of f at x* is negative semi-definite:

       vᵀ (∇² f(x*)) v ≤ 0,  ∀v ∈ ℝⁿ

    where ∇² f(x) is the same matrix of second partial derivatives as on the previous slide.

  • Slide 7/45

  • Slide 8/45

    Tutorial Example

    Problem: This is the constrained optimization problem we want to solve

    min_{x ∈ ℝ²} f(x)  subject to  h(x) = 0

    where

    f(x) = x₁ + x₂  and  h(x) = x₁² + x₂² − 2

  • Slide 9/45

    Tutorial example - Cost function

    [Figure: iso-contours of f(x) in the (x₁, x₂) plane, the lines x₁ + x₂ = −2, −1, 0, 1, 2]

    f(x) = x₁ + x₂

  • Slide 10/45

    Tutorial example - Feasible region

    [Figure: iso-contours of f(x) overlaid with the feasible region, the circle h(x) = 0]

    h(x) = x₁² + x₂² − 2

  • Slide 11/45

    Given a point x_F on the constraint surface

    [Figure: a feasible point x_F on the circle h(x) = 0]

  • Slide 12/45

    Given a point x_F on the constraint surface

    [Figure: a small step Δx from x_F]

    Find Δx s.t. h(x_F + Δx) = 0 and f(x_F + Δx) < f(x_F)?

  • Slide 13/45

    Condition to decrease the cost function

    [Figure: the steepest-descent direction −∇_x f(x_F)]

    At any point x the direction of steepest descent of the cost function f(x) is given by −∇_x f(x).

  • Slide 14/45

    Condition to decrease the cost function

    [Figure: a step Δx with f(x_F + Δx) < f(x_F)]

    To move by Δx from x such that f(x + Δx) < f(x) we must have

    Δx · (−∇_x f(x)) > 0

  • Slide 15/45

    Condition to remain on the constraint surface

    [Figure: the normal ∇_x h(x_F)]

    Normals to the constraint surface are given by ∇_x h(x)

  • Slide 16/45

    Condition to remain on the constraint surface

    [Figure: the normal ∇_x h(x_F)]

    Note the direction of the normal is arbitrary, as the constraint can be imposed as either h(x) = 0 or −h(x) = 0

  • Slide 17/45

    Condition to remain on the constraint surface

    [Figure: the direction orthogonal to ∇_x h(x_F)]

    To move a small Δx from x_F and remain on the constraint surface we have to move in a direction orthogonal to ∇_x h(x).

  • Slide 18/45

    To summarize...

    If x_F lies on the constraint surface: setting Δx orthogonal to ∇_x h(x_F) ensures h(x_F + Δx) = 0. And f(x_F + Δx) < f(x_F) only if

    Δx · (−∇_x f(x_F)) > 0

  • Slide 19/45

    Condition for a local optimum

    Consider the case when

    ∇_x f(x_F) = λ ∇_x h(x_F)

    where λ is a scalar.

    When this occurs, if Δx is orthogonal to ∇_x h(x_F) then

    Δx · (∇_x f(x_F)) = λ Δx · (∇_x h(x_F)) = 0

    We cannot move from x_F so as to remain on the constraint surface and decrease (or increase) the cost function.

    This case corresponds to a constrained local optimum!

  • Slide 20/45

  • Slide 21/45

    Condition for a local optimum

    [Figure: the two critical points on the circle h(x) = 0]

    A constrained local optimum occurs at x* when ∇_x f(x*) and ∇_x h(x*) are parallel, that is

    ∇_x f(x*) = λ ∇_x h(x*)

  • Slide 22/45

  • Slide 23/45

    From this fact Lagrange Multipliers make sense

    Remember our constrained optimization problem is

    min_{x ∈ ℝ²} f(x)  subject to  h(x) = 0

    Define the Lagrangian as

    L(x, λ) = f(x) + λ h(x)

    (note L(x, λ) = f(x) whenever h(x) = 0)

    Then x* is a local minimum ⟺ there exists a unique λ* s.t.

    1. ∇_x L(x*, λ*) = 0 (encodes ∇_x f(x*) = −λ* ∇_x h(x*))
    2. ∇_λ L(x*, λ*) = 0 (encodes the equality constraint h(x*) = 0)
    3. yᵀ (∇²_{xx} L(x*, λ*)) y ≥ 0 for all y s.t. ∇_x h(x*)ᵀ y = 0

    A positive definite Hessian on this subspace tells us we have a local minimum.
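    For the tutorial problem (f(x) = x₁ + x₂ with h(x) = x₁² + x₂² − 2), stationarity ∇f + λ∇h = 0 forces x₁ = x₂ = −1/(2λ), and substituting into h(x) = 0 gives λ = ±1/2; the minimum takes λ = 1/2. A minimal sketch checking this solution (the variable names are mine, not from the slides):

```python
# Tutorial problem: min x1 + x2  subject to  h(x) = x1^2 + x2^2 - 2 = 0
f = lambda x: x[0] + x[1]
h = lambda x: x[0] ** 2 + x[1] ** 2 - 2.0
grad_f = lambda x: (1.0, 1.0)
grad_h = lambda x: (2.0 * x[0], 2.0 * x[1])

# Candidate from stationarity grad f + lam * grad h = 0:
# x1 = x2 = -1/(2*lam), and h(x) = 0 then gives lam = 1/2 for the minimum.
lam = 0.5
x_star = (-1.0, -1.0)

stationary = all(df + lam * dh == 0.0
                 for df, dh in zip(grad_f(x_star), grad_h(x_star)))
feasible = h(x_star) == 0.0
```

    The other root, λ = −1/2 with x = (1, 1), satisfies the same two equations but is the constrained maximum; the second-order condition on ∇²_{xx}L distinguishes the two.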


  • Slide 24/45

    The case of multiple equality constraints

    The constrained optimization problem is

    min_{x ∈ ℝ²} f(x)  subject to  h_i(x) = 0 for i = 1, …, l

    Construct the Lagrangian (introduce a multiplier λ_i for each constraint):

    L(x, λ) = f(x) + Σ_{i=1}^l λ_i h_i(x) = f(x) + λᵀ h(x)

    Then x* is a local minimum ⟺ there exists a unique λ* s.t.

    1. ∇_x L(x*, λ*) = 0
    2. ∇_λ L(x*, λ*) = 0
    3. yᵀ (∇²_{xx} L(x*, λ*)) y ≥ 0 for all y s.t. ∇_x h(x*)ᵀ y = 0

  • Slide 25/45

    Constrained Optimization: Inequality Constraints

  • Slide 26/45

    Tutorial Example - Case 1

    Problem: Consider this constrained optimization problem

    min_{x ∈ ℝ²} f(x)  subject to  g(x) ≤ 0

    where

    f(x) = x₁² + x₂²  and  g(x) = x₁² + x₂² − 1

  • Slide 27/45

    Tutorial example - Cost function

    [Figure: iso-contours of f(x), with the minimum of f(x) at the origin]

    f(x) = x₁² + x₂²

  • Slide 28/45

    Tutorial example - Feasible region

    [Figure: iso-contours of f(x) with the feasible region g(x) ≤ 0, the unit disc]

    g(x) = x₁² + x₂² − 1

  • Slide 29/45

    How do we recognize if x F is at a local optimum?

    [Figure: a feasible point x_F inside the disc]

    How can we recognize x_F is at a local minimum? Remember x_F denotes a feasible point.

  • Slide 30/45

    Easy in this case

    [Figure: x_F at the unconstrained minimum, inside the feasible region]

    How can we recognize x_F is at a local minimum? The unconstrained minimum of f(x) lies within the feasible region.

    Necessary and sufficient conditions for a constrained local minimum are the same as for an unconstrained local minimum:

    ∇_x f(x_F) = 0  and  ∇_{xx} f(x_F) is positive definite

  • Slide 31/45

    This Tutorial Example has an inactive constraint

    Problem: Our constrained optimization problem

    min_{x ∈ ℝ²} f(x)  subject to  g(x) ≤ 0

    where

    f(x) = x₁² + x₂²  and  g(x) = x₁² + x₂² − 1

    The constraint is not active at the local minimum (g(x*) < 0): therefore the local minimum is identified by the same conditions as in the unconstrained case.
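    The inactive-constraint case fits in a few lines of code. In this sketch (the names are mine, not from the slides) the unconstrained minimizer x* = (0, 0) is strictly feasible, so the inequality constraint plays no role:

```python
# Case 1: f(x) = x1^2 + x2^2 with g(x) = x1^2 + x2^2 - 1 <= 0
f = lambda x: x[0] ** 2 + x[1] ** 2
g = lambda x: x[0] ** 2 + x[1] ** 2 - 1.0

x_star = (0.0, 0.0)  # unconstrained minimum: grad f(x*) = 0

constraint_inactive = g(x_star) < 0.0   # strictly inside the feasible region
grad_f_zero = (2.0 * x_star[0], 2.0 * x_star[1]) == (0.0, 0.0)
```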


  • Slide 32/45

    Tutorial Example - Case 2

    Problem: This is the constrained optimization problem we want to solve

    min_{x ∈ ℝ²} f(x)  subject to  g(x) ≤ 0

    where

    f(x) = (x₁ − 1.1)² + (x₂ − 1.1)²  and  g(x) = x₁² + x₂² − 1

  • Slide 33/45

    Tutorial example - Cost function

    [Figure: iso-contours of f(x), with the unconstrained minimum of f(x) at (1.1, 1.1)]

    f(x) = (x₁ − 1.1)² + (x₂ − 1.1)²

  • Slide 34/45

    Tutorial example - Feasible region

    [Figure: iso-contours of f(x) with the feasible region g(x) ≤ 0, the unit disc]

    g(x) = x₁² + x₂² − 1

  • Slide 35/45

    How do we recognize if x_F is at a local optimum?

    [Figure: a feasible point x_F on the boundary of the disc]

    Is x_F at a local minimum? Remember x_F denotes a feasible point.

  • Slide 36/45

    How do we recognize if x_F is at a local optimum?

    [Figure: x_F on the boundary; the unconstrained minimum of f(x) lies outside the feasible region]

    How can we tell if x_F is at a local minimum? The unconstrained local minimum of f(x) lies outside of the feasible region, so the constrained local minimum occurs on the constraint surface.

  • Slide 37/45

    How do we recognize if x_F is at a local optimum?

    [Figure: x_F on the boundary; the unconstrained minimum of f(x) lies outside the feasible region]

    We effectively have an optimization problem with an equality constraint: g(x) = 0.

    Given an equality constraint

  • Slide 38/45

    [Figure: at the optimum, −∇_x f(x) and ∇_x g(x) point in the same direction]

    A local optimum occurs when ∇_x f(x) and ∇_x g(x) are parallel:

    −∇_x f(x) = μ ∇_x g(x)

  • Slide 39/45

    Want a constrained local minimum...

  • Slide 40/45

    [Figure: x_F is a constrained local minimum, as −∇_x f(x_F) points away from the feasible region]

    A constrained local minimum occurs when −∇_x f(x) and ∇_x g(x) point in the same direction:

    −∇_x f(x) = μ ∇_x g(x)  with  μ > 0
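    For this Case 2 problem the multiplier is available in closed form: stationarity ∇f + μ∇g = 0 gives x₁ = x₂ = 1.1/(1 + μ), and the active constraint x₁² + x₂² = 1 then forces 1 + μ = 1.1·√2. A short check of this solution (a sketch; the closed-form derivation is mine, applied to the slides' problem):

```python
import math

# Case 2: f(x) = (x1 - 1.1)^2 + (x2 - 1.1)^2, g(x) = x1^2 + x2^2 - 1 <= 0
s = 1.0 / math.sqrt(2.0)              # constrained minimizer: x1 = x2 = s
mu = 1.1 * math.sqrt(2.0) - 1.0       # multiplier, positive as required

grad_f = (2.0 * (s - 1.1), 2.0 * (s - 1.1))
grad_g = (2.0 * s, 2.0 * s)

# -grad f and grad g should point in the same direction: -grad f = mu * grad g
same_direction = all(abs(-df - mu * dg) < 1e-12
                     for df, dg in zip(grad_f, grad_g))
constraint_active = abs(s ** 2 + s ** 2 - 1.0) < 1e-12
```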

    Summary of optimization with one inequality constraint

  • Slide 41/45

    Given

    min_{x ∈ ℝ²} f(x)  subject to  g(x) ≤ 0

    If x* corresponds to a constrained local minimum then

    Case 1: The unconstrained local minimum occurs in the feasible region.

    1. g(x*) < 0
    2. ∇_x f(x*) = 0
    3. ∇_{xx} f(x*) is a positive semi-definite matrix.

    Case 2: The unconstrained local minimum lies outside the feasible region.

    1. g(x*) = 0
    2. −∇_x f(x*) = μ ∇_x g(x*) with μ > 0
    3. yᵀ ∇_{xx} L(x*) y ≥ 0 for all y orthogonal to ∇_x g(x*).

    Karush-Kuhn-Tucker conditions encode these conditions

  • Slide 42/45

    Given the optimization problem

    min_{x ∈ ℝ²} f(x)  subject to  g(x) ≤ 0

    Define the Lagrangian as

    L(x, μ) = f(x) + μ g(x)

    Then x* is a local minimum ⟺ there exists a unique μ* s.t.

    1. ∇_x L(x*, μ*) = 0
    2. μ* ≥ 0
    3. μ* g(x*) = 0
    4. g(x*) ≤ 0
    5. Plus positive definite constraints on ∇_{xx} L(x*, μ*).

    These are the KKT conditions.

    Let's check what the KKT conditions imply

  • Slide 43/45

    Case 1 - Inactive constraint: When μ* = 0 we have L(x*, μ*) = f(x*). Condition KKT 1 ⟹ ∇_x f(x*) = 0. Condition KKT 4 ⟹ x* is a feasible point.

    Case 2 - Active constraint: When μ* > 0 we have L(x*, μ*) = f(x*) + μ* g(x*). Condition KKT 1 ⟹ ∇_x f(x*) = −μ* ∇_x g(x*). Condition KKT 3 ⟹ g(x*) = 0. Condition KKT 3 also ⟹ L(x*, μ*) = f(x*).
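    The two cases fold into one first-order check. The helper below is my own illustration of the four KKT conditions for a single inequality constraint (its name and signature are not from the slides); it takes the gradients and values at a candidate point and is run on both tutorial cases:

```python
import math

def kkt_first_order(grad_f, grad_g, g_val, mu, tol=1e-9):
    """Check KKT 1-4 for: min f(x) subject to g(x) <= 0."""
    stationarity = all(abs(df + mu * dg) <= tol      # KKT 1: grad L = 0
                       for df, dg in zip(grad_f, grad_g))
    dual_feasible = mu >= 0.0                        # KKT 2: mu >= 0
    complementary = abs(mu * g_val) <= tol           # KKT 3: mu * g(x) = 0
    primal_feasible = g_val <= tol                   # KKT 4: g(x) <= 0
    return stationarity and dual_feasible and complementary and primal_feasible

# Case 1 (inactive): x* = (0, 0), mu* = 0, grad f(x*) = (0, 0), g(x*) = -1
case1_ok = kkt_first_order((0.0, 0.0), (0.0, 0.0), -1.0, 0.0)

# Case 2 (active): x* = (1/sqrt 2, 1/sqrt 2), mu* = 1.1*sqrt(2) - 1
s = 1.0 / math.sqrt(2.0)
mu = 1.1 * math.sqrt(2.0) - 1.0
case2_ok = kkt_first_order((2 * (s - 1.1),) * 2, (2 * s,) * 2, 0.0, mu)

# A non-optimal feasible point fails stationarity, e.g. x = (0.5, 0) in Case 2
bad = kkt_first_order((2 * (0.5 - 1.1), 2 * (0.0 - 1.1)), (1.0, 0.0), -0.75, 0.0)
```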

    KKT conditions for multiple inequality constraints

  • Slide 44/45

    Given the optimization problem

    min_{x ∈ ℝ²} f(x)  subject to  g_j(x) ≤ 0 for j = 1, …, m

    Define the Lagrangian as

    L(x, μ) = f(x) + Σ_{j=1}^m μ_j g_j(x) = f(x) + μᵀ g(x)

    Then x* is a local minimum ⟺ there exists a unique μ* s.t.

    1. ∇_x L(x*, μ*) = 0
    2. μ*_j ≥ 0 for j = 1, …, m
    3. μ*_j g_j(x*) = 0 for j = 1, …, m
    4. g_j(x*) ≤ 0 for j = 1, …, m
    5. Plus positive definite constraints on ∇_{xx} L(x*, μ*).

    KKT for multiple equality & inequality constraints

  • Slide 45/45

    Given the constrained optimization problem

    min_{x ∈ ℝ²} f(x)

    subject to

    h_i(x) = 0 for i = 1, …, l  and  g_j(x) ≤ 0 for j = 1, …, m

    Define the Lagrangian as

    L(x, λ, μ) = f(x) + λᵀ h(x) + μᵀ g(x)

    Then x* is a local minimum ⟺ there exist unique λ*, μ* s.t.

    1. ∇_x L(x*, λ*, μ*) = 0
    2. μ*_j ≥ 0 for j = 1, …, m
    3. μ*_j g_j(x*) = 0 for j = 1, …, m
    4. g_j(x*) ≤ 0 for j = 1, …, m
    5. h(x*) = 0
    6. Plus positive definite constraints on ∇_{xx} L(x*, λ*, μ*).
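    As a final illustration (my own, not from the slides), add a hypothetical equality constraint h(x) = x₁ − x₂ = 0 to the Case 2 problem. The earlier minimizer x₁ = x₂ = 1/√2 already satisfies it, so it stays optimal with λ* = 0 and μ* = 1.1·√2 − 1, and all of the mixed KKT conditions above can be checked at once:

```python
import math

# min (x1 - 1.1)^2 + (x2 - 1.1)^2
#   subject to h(x) = x1 - x2 = 0 (hypothetical) and g(x) = x1^2 + x2^2 - 1 <= 0
s = 1.0 / math.sqrt(2.0)
lam = 0.0                          # equality multiplier lam*
mu = 1.1 * math.sqrt(2.0) - 1.0    # inequality multiplier mu*

grad_f = (2.0 * (s - 1.1), 2.0 * (s - 1.1))
grad_h = (1.0, -1.0)
grad_g = (2.0 * s, 2.0 * s)

# KKT 1: grad_x L = grad f + lam * grad h + mu * grad g = 0
stationary = all(abs(df + lam * dh + mu * dg) < 1e-12
                 for df, dh, dg in zip(grad_f, grad_h, grad_g))
h_val = s - s                      # KKT 5: h(x*) = 0
g_val = s ** 2 + s ** 2 - 1.0      # KKT 3/4: constraint active, so mu*g = 0
```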