Introduction to Optimization (Part 1)

Daniel Kirschen

© 2011 D. Kirschen and University of Washington

Page 1: Introduction to Optimization (Part 1)

Daniel Kirschen

Page 2: Economic dispatch problem

• Several generating units serving the load
• What share of the load should each generating unit produce?
• Consider the limits of the generating units
• Ignore the limits of the network

[Figure: generating units A, B and C supplying a load L]

Page 3: Characteristics of the generating units

• Thermal generating units
• Consider the running costs only
• Input/output curve: fuel input vs. electric power output
• Fuel consumption measured by its energy content
• Upper and lower limit on the output of the generating unit

[Figure: boiler (B), turbine (T) and generator (G) in series, with fuel as input and electric power as output; input/output curve plotting fuel input (J/h) against electric power output (MW) between Pmin and Pmax]

Page 4: Cost Curve

• Multiply the fuel input by the fuel cost
• No-load cost: the cost of keeping the unit running if it could produce zero MW

[Figure: cost curve ($/h) versus output (MW) between Pmin and Pmax; the intercept at zero output is the no-load cost]

Page 5: Incremental Cost Curve

• Incremental cost curve: the derivative of the cost curve, in $/MWh
• It is the cost of the next MWh, i.e. the slope ∆F/∆P of the cost curve

[Figure: cost curve ($/h vs. MW) with the slope ∆F/∆P marked, and the corresponding incremental cost curve ($/MWh vs. MW)]

Page 6: Mathematical formulation

• Objective function: minimize the total cost

  min C_A(P_A) + C_B(P_B) + C_C(P_C)

• Constraints:
  – Load/generation balance: P_A + P_B + P_C = L
  – Unit constraints: P_i^min ≤ P_i ≤ P_i^max for i = A, B, C

[Figure: generating units A, B and C supplying a load L]

This is an optimization problem
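As a concrete illustration of this formulation, here is a minimal sketch in Python using scipy.optimize.minimize. The quadratic cost coefficients, unit limits and load value are hypothetical, chosen only to make the example runnable; the slides do not specify any numbers.

    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical quadratic cost curves C_i(P) = a*P^2 + b*P + c, in $/h
    costs = [(0.010, 20.0, 100.0),   # unit A
             (0.020, 25.0,  80.0),   # unit B
             (0.015, 22.0,  90.0)]   # unit C
    bounds = [(50, 300), (50, 250), (50, 200)]  # Pmin, Pmax of each unit (MW)
    load = 450.0                                # load L (MW)

    def total_cost(p):
        return sum(a*pi**2 + b*pi + c for (a, b, c), pi in zip(costs, p))

    # Load/generation balance: sum of outputs equals the load
    balance = {'type': 'eq', 'fun': lambda p: np.sum(p) - load}
    res = minimize(total_cost, x0=[150.0, 150.0, 150.0],
                   bounds=bounds, constraints=[balance])
    print(res.x, res.fun)   # optimal dispatch (MW) and total cost ($/h)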

Page 7: Introduction to Optimization

Page 8

“An engineer can do with one dollar which any bungler can do with two”

A. M. Wellington (1847-1895)

Page 9: Objective

• Most engineering activities have an objective:
  – Achieve the best possible design
  – Achieve the most economical operating conditions
• This objective is usually quantifiable
• Examples:
  – Minimize the cost of building a transformer
  – Minimize the cost of supplying power
  – Minimize the losses in a power system
  – Maximize the profit from a bidding strategy

Page 10: Decision Variables

• The value of the objective is a function of some decision variables: f(x1, x2, …, xn)
• Examples of decision variables:
  – Dimensions of the transformer
  – Output of the generating units, position of the taps
  – Parameters of bids for selling electrical energy

Page 11: Optimization Problem

What value should the decision variables take so that f(x1, x2, …, xn) is minimum or maximum?

Page 12: Example: function of one variable

[Figure: f(x) plotted against x, with its maximum f(x*) at x = x*]

f(x) is maximum for x = x*

Page 13: Minimization and Maximization

[Figure: f(x) and -f(x) plotted against x; f(x) reaches its maximum f(x*) at x = x*, where -f(x) reaches its minimum -f(x*)]

If x = x* maximizes f(x), then it minimizes -f(x)

Page 14: Minimization and Maximization

• Maximizing f(x) is thus the same thing as minimizing g(x) = -f(x)
• Minimization and maximization problems are thus interchangeable
• Depending on the problem, the optimum is either a maximum or a minimum

Page 15: Necessary Condition for Optimality

[Figure: f(x) with a maximum at x = x*; the tangent at x* is horizontal]

df/dx = 0 at x = x*

Page 16: Necessary Condition for Optimality

[Figure: f(x) with a minimum at x = x*; df/dx = 0 holds there as well]

Page 17: Example

[Figure: a function f(x) with several stationary points]

For what values of x is df/dx = 0? In other words, for what values of x is the necessary condition for optimality satisfied?

Page 18: Example

• A, B, C, D are stationary points
• A and D are maxima
• B is a minimum
• C is an inflexion point

[Figure: f(x) with the stationary points A, B, C and D marked]

Page 19: How can we distinguish minima and maxima?

[Figure: f(x) with stationary points A, B, C and D]

The objective function is concave around a maximum

Page 20: How can we distinguish minima and maxima?

[Figure: f(x) with stationary points A, B, C and D]

The objective function is convex around a minimum

Page 21: How can we distinguish minima and maxima?

[Figure: f(x) with stationary points A, B, C and D]

The objective function is flat around an inflexion point

Page 22: Necessary and Sufficient Conditions of Optimality

• Necessary condition: df/dx = 0
• Sufficient condition:
  – For a maximum: d²f/dx² < 0
  – For a minimum: d²f/dx² > 0
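These conditions translate directly into a few lines of Python. A minimal sketch with sympy, using a hypothetical quartic (not from the slides) whose stationary points are classified by the sign of the second derivative:

    import sympy as sp

    x = sp.symbols('x')
    f = x**4 - 4*x**3 + 4*x**2                # hypothetical objective
    for xs in sp.solve(sp.diff(f, x), x):     # necessary condition df/dx = 0
        curvature = sp.diff(f, x, 2).subs(x, xs)
        if curvature > 0:
            kind = 'minimum'
        elif curvature < 0:
            kind = 'maximum'
        else:
            kind = 'inconclusive (possible inflexion point)'
        print(xs, kind)   # 0 minimum, 1 maximum, 2 minimum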

Page 23: Isn’t all this obvious?

• Can’t we tell all this by looking at the objective function?
  – Yes, for a simple, one-dimensional case where we know the shape of the objective function
  – For complex, multi-dimensional cases (i.e. with many decision variables) we cannot visualize the shape of the objective function
  – We must then rely on mathematical techniques

Page 24: Feasible Set

• The values that the decision variables can take are usually limited
• Examples:
  – The physical dimensions of a transformer must be positive
  – The active power output of a generator may be limited to a certain range (e.g. 200 MW to 500 MW)
  – The reactive power output of a generator may be limited to a certain range (e.g. -100 MVAr to 150 MVAr)

Page 25: Feasible Set

[Figure: f(x) with the feasible set bounded by xMIN and xMAX; the stationary points A and D lie inside it]

The values of the objective function outside the feasible set do not matter

Page 26: Interior and Boundary Solutions

• A and D are interior maxima
• B and E are interior minima
• xMIN is a boundary minimum
• xMAX is a boundary maximum

[Figure: f(x) over the feasible set [xMIN, xMAX], with interior stationary points A, B, D and E]

Note that the boundary solutions do not satisfy the optimality conditions!

Page 27: Two-Dimensional Case

[Figure: surface f(x1, x2) with its minimum at (x1*, x2*)]

f(x1, x2) is minimum for x1 = x1*, x2 = x2*

Page 28: Necessary Conditions for Optimality

[Figure: surface f(x1, x2) with a horizontal tangent plane at (x1*, x2*)]

∂f/∂x1 = 0 and ∂f/∂x2 = 0 at (x1*, x2*)

Page 29: Multi-Dimensional Case

At a maximum or minimum value of f(x1, x2, …, xn) we must have:

∂f/∂x1 = 0
∂f/∂x2 = 0
…
∂f/∂xn = 0

A point where these conditions are satisfied is called a stationary point
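A minimal sketch of finding a stationary point in the multi-dimensional case, using sympy to solve the system ∂f/∂xi = 0 for a hypothetical two-variable objective (the function is an assumption for illustration, not from the slides):

    import sympy as sp

    x1, x2 = sp.symbols('x1 x2')
    f = x1**2 + 2*x2**2 - 2*x1*x2 - 2*x2        # hypothetical objective
    grad = [sp.diff(f, v) for v in (x1, x2)]    # all partial derivatives
    print(sp.solve(grad, [x1, x2], dict=True))  # [{x1: 1, x2: 1}]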

Page 30: Sufficient Conditions for Optimality

[Figure: two surfaces f(x1, x2), one with a minimum and one with a maximum]

Page 31: Sufficient Conditions for Optimality

[Figure: surface f(x1, x2) with a saddle point]

Page 32: Sufficient Conditions for Optimality

Calculate the Hessian matrix at the stationary point, i.e. the n × n matrix of second partial derivatives with elements ∂²f/∂xi∂xj.

Page 33: Sufficient Conditions for Optimality

• Calculate the eigenvalues of the Hessian matrix at the stationary point
• If all the eigenvalues are greater than or equal to zero:
  – The matrix is positive semi-definite
  – The stationary point is a minimum
• If all the eigenvalues are less than or equal to zero:
  – The matrix is negative semi-definite
  – The stationary point is a maximum
• If some of the eigenvalues are positive and others are negative:
  – The stationary point is a saddle point
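A minimal sketch of this eigenvalue test with numpy, applied to the Hessian of the hypothetical objective used in the sketch on page 29 (that Hessian is constant, so it is the same at the stationary point):

    import numpy as np

    # Hessian of f = x1^2 + 2*x2^2 - 2*x1*x2 - 2*x2 (hypothetical example)
    hessian = np.array([[ 2.0, -2.0],
                        [-2.0,  4.0]])
    eig = np.linalg.eigvalsh(hessian)   # eigenvalues of a symmetric matrix
    if np.all(eig >= 0):
        print('positive semi-definite: minimum', eig)
    elif np.all(eig <= 0):
        print('negative semi-definite: maximum', eig)
    else:
        print('indefinite: saddle point', eig)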

Page 34: Contours

[Figure: surface f(x1, x2) cut by horizontal planes at the levels F1 and F2; the intersections project onto the (x1, x2) plane as the contours F1 and F2]

Page 35: Contours

[Figure: nested contours in the (x1, x2) plane around a minimum or maximum]

A contour is the locus of all the points that give the same value to the objective function

Page 36: Example 1

f(x1, x2) = x1² + x2²

∂f/∂x1 = 2 x1 = 0 and ∂f/∂x2 = 2 x2 = 0

(x1, x2) = (0, 0) is a stationary point

Page 37: Example 1

Sufficient conditions for optimality: the Hessian matrix

[ 2  0 ]
[ 0  2 ]

must be positive definite (i.e. all eigenvalues must be positive). Here the eigenvalues are 2 and 2.

The stationary point is a minimum

Page 38: Example 1

[Figure: circular contours C = 1, C = 4 and C = 9 around the origin; the minimum, where C = 0, is at the centre]

Page 39: Example 2

f(x1, x2) = x1² - x2²

∂f/∂x1 = 2 x1 = 0 and ∂f/∂x2 = -2 x2 = 0

(x1, x2) = (0, 0) is a stationary point

Page 40: Example 2

Sufficient conditions for optimality: the Hessian matrix

[ 2  0 ]
[ 0 -2 ]

has one positive and one negative eigenvalue.

The stationary point is a saddle point

Page 41: Example 2

[Figure: hyperbolic contours C = 1, 4, 9 on two sides of the origin and C = -1, -4, -9 on the other two; the C = 0 contour is the pair of straight lines through the origin]

This stationary point is a saddle point
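Both contour pictures are easy to reproduce. A minimal sketch with matplotlib, using the two example functions above:

    import numpy as np
    import matplotlib.pyplot as plt

    x1, x2 = np.meshgrid(np.linspace(-4, 4, 200), np.linspace(-4, 4, 200))
    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
    bowl = ax1.contour(x1, x2, x1**2 + x2**2, levels=[1, 4, 9])
    ax1.clabel(bowl, fmt='C=%d')
    ax1.set_title('Example 1: minimum at the origin (C=0)')
    saddle = ax2.contour(x1, x2, x1**2 - x2**2, levels=[-9, -4, -1, 0, 1, 4, 9])
    ax2.clabel(saddle, fmt='C=%d')
    ax2.set_title('Example 2: saddle point at the origin')
    plt.show()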

Page 42: Optimization with Constraints

Page 43: Optimization with Equality Constraints

• There are usually restrictions on the values that the decision variables can take:

minimize f(x1, …, xn)   (objective function)
subject to ω1(x1, …, xn) = 0, …, ωm(x1, …, xn) = 0   (equality constraints)

Page 44: Number of Constraints

• N decision variables
• M equality constraints
• If M > N, the problem is over-constrained: there is usually no solution
• If M = N, the problem is determined: there may be a solution
• If M < N, the problem is under-constrained: there is usually room for optimization

Page 45: Example 1

[Figure: contours of f(x1, x2) and an equality constraint; the constrained minimum is the point where the constraint touches the lowest attainable contour]

Page 46: Example 2: Economic Dispatch

[Figure: two generating units G1 and G2, with outputs x1 and x2, supplying a load L]

Cost of running unit 1: C1(x1)
Cost of running unit 2: C2(x2)
Total cost: C1(x1) + C2(x2)

Optimization problem: minimize C1(x1) + C2(x2) subject to x1 + x2 = L

Page 47: Solution by substitution

Use the constraint to eliminate one of the variables: x2 = L - x1

Unconstrained minimization: minimize C1(x1) + C2(L - x1) with respect to x1 alone
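A minimal sketch of the substitution approach with sympy, using hypothetical quadratic cost curves (the slides give no numbers):

    import sympy as sp

    x1, L = sp.symbols('x1 L')
    C1 = 0.01*x1**2 + 20*x1      # hypothetical cost of unit 1, $/h
    x2 = L - x1                  # substitute the constraint x1 + x2 = L
    C2 = 0.02*x2**2 + 25*x2      # hypothetical cost of unit 2, $/h
    total = C1 + C2              # now an unconstrained function of x1
    best_x1 = sp.solve(sp.diff(total, x1), x1)[0]
    print(sp.simplify(best_x1))  # optimal x1 as a function of L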

Page 48: Solution by substitution

• Difficult
• Usually impossible when the constraints are non-linear
• Provides little or no insight into the solution
• Better approach: solution using Lagrange multipliers

Page 49: Gradient

The gradient of f(x1, …, xn) is the vector of its first partial derivatives:

∇f = (∂f/∂x1, ∂f/∂x2, …, ∂f/∂xn)

Page 50: Properties of the Gradient

• Each component of the gradient vector indicates the rate of change of the function in that direction
• The gradient indicates the direction in which a function of several variables increases most rapidly
• The magnitude and direction of the gradient usually depend on the point considered
• At each point, the gradient is perpendicular to the contour of the function
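A minimal sketch illustrating these properties on the bowl function of Example 1, f = x1² + x2²: its gradient points radially outward, i.e. perpendicular to the circular contours, and its magnitude grows with the distance from the minimum.

    import numpy as np

    def grad_f(x1, x2):
        return np.array([2*x1, 2*x2])       # gradient of f = x1^2 + x2^2

    for point in [(1.0, 0.0), (0.0, 2.0), (1.0, 1.0)]:
        g = grad_f(*point)
        print(point, g, np.linalg.norm(g))  # direction and magnitude vary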

Page 51: Example 3

[Figure: contours of a function of x and y, with the gradient vector drawn at the points A, B, C and D]

Page 52: Example 4

[Figure: contours of another function of x and y, with its gradient vectors]

Page 53: Lagrange multipliers

Minimize f(x1, x2) subject to ω(x1, x2) = 0

[Figure: contours of f(x1, x2) and the constraint ω(x1, x2) = 0]

Page 54: Lagrange multipliers

[Figure: contours of f(x1, x2) with the constraint ω(x1, x2) = 0 superimposed]

Page 55: Lagrange multipliers

[Figure: the gradient of f drawn at several points along the constraint ω(x1, x2) = 0]

Page 56: Lagrange multipliers

The solution must be on the constraint.

To reduce the value of f, we must move in a direction opposite to the gradient.

[Figure: starting from point A on the constraint, we move along the constraint through point B, against the gradient of f]

Page 57: Lagrange multipliers

We stop when the gradient of the function is perpendicular to the constraint, because moving further would increase the value of the function.

At the optimum, the gradient of the function is parallel to the gradient of the constraint.

[Figure: points A, B and C along the constraint; the optimum is at C]

Page 58: Lagrange multipliers

At the optimum, we must have: ∇f parallel to ∇ω

Which can be expressed as: ∇f + λ ∇ω = 0

λ is called the Lagrange multiplier.

The constraint must also be satisfied: ω(x1, x2) = 0

In terms of the co-ordinates:

∂f/∂x1 + λ ∂ω/∂x1 = 0
∂f/∂x2 + λ ∂ω/∂x2 = 0
ω(x1, x2) = 0

Page 59: Lagrangian function

To simplify the writing of the conditions for optimality, it is useful to define the Lagrangian function:

ℒ(x1, x2, λ) = f(x1, x2) + λ ω(x1, x2)

The necessary conditions for optimality are then given by the partial derivatives of the Lagrangian:

∂ℒ/∂x1 = ∂f/∂x1 + λ ∂ω/∂x1 = 0
∂ℒ/∂x2 = ∂f/∂x2 + λ ∂ω/∂x2 = 0
∂ℒ/∂λ = ω(x1, x2) = 0

Page 60: Example

Minimize f(x1, x2) = 0.25 x1² + x2² subject to ω(x1, x2) = 5 - x1 - x2 = 0

Page 61: Example

ℒ(x1, x2, λ) = 0.25 x1² + x2² + λ (5 - x1 - x2)

∂ℒ/∂x1 = 0.5 x1 - λ = 0
∂ℒ/∂x2 = 2 x2 - λ = 0
∂ℒ/∂λ = 5 - x1 - x2 = 0

Solving: x1 = 4, x2 = 1, λ = 2

Page 62: Example

[Figure: elliptical contours of f(x1, x2) and the constraint line x1 + x2 = 5; the constrained minimum is at (x1, x2) = (4, 1)]
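A minimal sketch checking this example with sympy, by solving the three stationarity conditions of the Lagrangian directly:

    import sympy as sp

    x1, x2, lam = sp.symbols('x1 x2 lambda')
    f = sp.Rational(1, 4)*x1**2 + x2**2
    Lg = f + lam*(5 - x1 - x2)                    # Lagrangian
    eqs = [sp.diff(Lg, v) for v in (x1, x2, lam)]
    print(sp.solve(eqs, [x1, x2, lam]))           # {x1: 4, x2: 1, lambda: 2}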

Page 63: Important Note!

If the constraint is of the form: ω(x1, x2) = K

It must be included in the Lagrangian as follows:

ℒ = f(x1, x2) + λ [K - ω(x1, x2)]

And not as follows:

ℒ = f(x1, x2) + λ ω(x1, x2)

In other words, the whole constraint, rewritten so that one side is zero, must multiply the Lagrange multiplier.

Page 64: Application to Economic Dispatch

[Figure: two generating units G1 and G2, with outputs x1 and x2, supplying a load L]

minimize C1(x1) + C2(x2) subject to x1 + x2 = L

ℒ(x1, x2, λ) = C1(x1) + C2(x2) + λ (L - x1 - x2)

∂ℒ/∂x1 = dC1/dx1 - λ = 0
∂ℒ/∂x2 = dC2/dx2 - λ = 0
∂ℒ/∂λ = L - x1 - x2 = 0

Hence dC1/dx1 = dC2/dx2 = λ: the equal incremental cost solution

Page 65: Equal incremental cost solution

[Figure: the cost curves C1(x1) and C2(x2), and below them the corresponding incremental cost curves; a horizontal line at λ crosses the incremental cost curves at the optimal outputs x1 and x2]
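A minimal sketch computing the equal incremental cost solution with sympy, reusing the hypothetical quadratic costs of the substitution sketch:

    import sympy as sp

    x1, x2, lam = sp.symbols('x1 x2 lambda')
    L = 400                              # hypothetical load, MW
    C1 = 0.01*x1**2 + 20*x1              # hypothetical $/h
    C2 = 0.02*x2**2 + 25*x2              # hypothetical $/h
    eqs = [sp.diff(C1, x1) - lam,        # dC1/dx1 = lambda
           sp.diff(C2, x2) - lam,        # dC2/dx2 = lambda
           x1 + x2 - L]                  # load/generation balance
    print(sp.solve(eqs, [x1, x2, lam]))  # x1 = 350, x2 = 50, lambda = 27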

Page 66: Interpretation of this solution

[Figure: block diagram: a value of λ sets the outputs x1 and x2 through the incremental cost curves; the sum x1 + x2 is subtracted from the load L]

If L - (x1 + x2) < 0, reduce λ
If L - (x1 + x2) > 0, increase λ
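This is the classic lambda iteration. A minimal sketch, again with the hypothetical quadratic costs (the update gain is arbitrary):

    def outputs(lam):
        # Invert the incremental cost curves dC1/dx1 = 0.02*x1 + 20 and
        # dC2/dx2 = 0.04*x2 + 25 (hypothetical coefficients)
        return (lam - 20.0) / 0.02, (lam - 25.0) / 0.04

    load = 400.0                  # MW
    lam = 26.0                    # initial guess, $/MWh
    for _ in range(100):
        x1, x2 = outputs(lam)
        mismatch = load - (x1 + x2)
        if abs(mismatch) < 1e-6:
            break
        lam += 0.005 * mismatch   # increase lambda if mismatch > 0, else reduce
    print(round(lam, 3), round(x1, 1), round(x2, 1))   # 27.0, 350.0, 50.0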

Page 67: Physical interpretation

[Figure: cost curve and incremental cost curve as functions of the output x]

The incremental cost is the cost of one additional MW for one hour. This cost depends on the output of the generator.

Page 68: Physical interpretation

Suppose the units are dispatched so that dC1/dx1 > dC2/dx2. Producing one MWh less with unit 1 then saves more than producing one MWh more with unit 2 costs.

Page 69: Physical interpretation

It pays to increase the output of unit 2 and decrease the output of unit 1 until we have:

dC1/dx1 = dC2/dx2 = λ

The Lagrange multiplier λ is thus the cost of one more MW at the optimal solution.

This is a very important result with many applications in economics.

Page 70: Generalization

minimize f(x1, …, xn) subject to ω1(x1, …, xn) = 0, …, ωm(x1, …, xn) = 0

Lagrangian:

ℒ(x1, …, xn, λ1, …, λm) = f + λ1 ω1 + … + λm ωm

• One Lagrange multiplier for each constraint
• n + m variables: x1, …, xn and λ1, …, λm

Page 71: Optimality conditions

∂ℒ/∂xi = 0 for i = 1, …, n   (n equations)
∂ℒ/∂λj = ωj = 0 for j = 1, …, m   (m equations)

n + m equations in n + m variables
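A minimal sketch of the general case with sympy: n = 3 hypothetical decision variables, m = 2 hypothetical constraints, and the n + m stationarity equations solved together:

    import sympy as sp

    x = sp.symbols('x1 x2 x3')
    lams = sp.symbols('lambda1 lambda2')
    f = x[0]**2 + x[1]**2 + x[2]**2               # hypothetical objective
    omegas = [x[0] + x[1] + x[2] - 6,             # hypothetical constraints
              x[0] - x[1]]
    Lg = f + sum(l*w for l, w in zip(lams, omegas))
    eqs = [sp.diff(Lg, v) for v in (*x, *lams)]   # n + m = 5 equations
    print(sp.solve(eqs, [*x, *lams], dict=True))  # x1 = x2 = x3 = 2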