Chapter 1 - Fundamentals of Optimization


Page 1: Chapter 1 - Fundamental of Optimization


Introduction to Optimization Methods

Introduction to Non-Linear Optimization

Page 2: Chapter 1 - Fundamental of Optimization


Optimization in Process Plants

Page 3: Chapter 1 - Fundamental of Optimization


Optimization Tree

Figure 1: Optimization tree.

Page 4: Chapter 1 - Fundamental of Optimization


What is Optimization? 

Optimization is an iterative process by which a desired solution (max/min) of a problem is found while satisfying all of its constraints or bound conditions.

An optimization problem can be linear or non-linear.

Non-linear optimization is accomplished by numerical 'search methods'.

Search methods are applied iteratively until a solution is reached.

The search procedure is termed an algorithm.

Figure 2: Optimum solution is found while satisfying its constraints (the derivative must be zero at the optimum).

Page 5: Chapter 1 - Fundamental of Optimization


Linear problems are solved by the Simplex or graphical methods. The solution of a linear problem lies on the boundary of the feasible region.

The solution of a non-linear problem can lie within the feasible region as well as on its boundary.

Figure 3: Solution of a linear problem.
Figure 4: Three-dimensional solution of a non-linear problem.

What is Optimization? (Cont.)

Page 6: Chapter 1 - Fundamental of Optimization


Fundamentals of Non-Linear Optimization

Single objective function f(x)
• Maximization
• Minimization

Design variables, xi, i = 0, 1, 2, 3, …

Constraints
• Inequality
• Equality

Figure 5: Example of design variables and constraints used in non-linear optimization.

Maximize X1 + 1.5 X2

Subject to:
X1 + X2 ≤ 150
0.25 X1 + 0.5 X2 ≤ 50
X1 ≥ 50
X2 ≥ 25
X1 ≥ 0, X2 ≥ 0
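
This small LP can be checked in MATLAB; a minimal sketch, assuming the Optimization Toolbox's linprog is available (linprog minimizes, so the objective is negated):

f  = -[1; 1.5];           % negated objective: maximize X1 + 1.5*X2
A  = [1 1; 0.25 0.5];     % inequality constraints A*x <= b
b  = [150; 50];
lb = [50; 25];            % X1 >= 50 and X2 >= 25 (these imply X >= 0)
[x, fval] = linprog(f, A, b, [], [], lb, []);
x, -fval                  % expect roughly X1 = 100, X2 = 50, objective 175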

Optimal points
• Local minimum/maximum: a point or solution x* is a local optimum if no other x in its neighborhood yields a better function value than x*.
• Global minimum/maximum: a point or solution x** is a global optimum if no other x in the entire search space yields a better function value than x**.
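
Stated in symbols for a minimization problem:

$$f(x^*) \le f(x) \quad \text{for all } x \text{ with } \|x - x^*\| < \varepsilon \qquad \text{(local minimum)}$$

$$f(x^{**}) \le f(x) \quad \text{for all } x \text{ in the search space} \qquad \text{(global minimum)}$$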

Page 7: Chapter 1 - Fundamental of Optimization


Figure 6: Global versus local optimization.
Figure 7: A local point is equal to the global point if the function is convex.

Fundamentals of Non-Linear Optimization (Cont.)

Page 8: Chapter 1 - Fundamental of Optimization


A function f is convex if f(Xa), at any point Xa between X1 and X2, lies below the corresponding point on the chord joining f(X1) and f(X2).
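
Equivalently, in the standard form of the convexity inequality:

$$f(\lambda X_1 + (1 - \lambda) X_2) \le \lambda f(X_1) + (1 - \lambda) f(X_2), \qquad 0 \le \lambda \le 1$$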

Convexity condition – the Hessian (2nd-order derivative) matrix of the function f must be positive semi-definite (eigenvalues positive or zero).

Fundamentals of Non-Linear Optimization (Cont.)

Figure 8: Convex and nonconvex sets.
Figure 9: Convex function.

Page 9: Chapter 1 - Fundamental of Optimization


Mathematical Background

Slope or gradient of the objective function f – represents the direction in which the function decreases or increases most rapidly.

Derivative (slope) of f:

$$\frac{df}{dx} = \lim_{\Delta x \to 0} \frac{f(x + \Delta x) - f(x)}{\Delta x}$$

Taylor series expansion about a point x_p:

$$f(x_p + \Delta x) = f(x_p) + \left.\frac{df}{dx}\right|_{x_p} \Delta x + \frac{1}{2!} \left.\frac{d^2 f}{dx^2}\right|_{x_p} (\Delta x)^2 + \cdots$$

Jacobian – matrix of gradients of f (and g) with respect to several variables:

$$J = \begin{bmatrix} \dfrac{\partial f}{\partial x} & \dfrac{\partial f}{\partial y} & \dfrac{\partial f}{\partial z} \\[2mm] \dfrac{\partial g}{\partial x} & \dfrac{\partial g}{\partial y} & \dfrac{\partial g}{\partial z} \end{bmatrix}$$
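
The Jacobian can be formed symbolically in MATLAB; a short sketch, assuming the Symbolic Math Toolbox and two illustrative functions f and g (not from the slides):

syms x y z
f = x^2 + y*z;                    % hypothetical f, for illustration only
g = x + y^2 - z;                  % hypothetical g
J = jacobian([f; g], [x, y, z])   % 2-by-3 matrix of partial derivatives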

Page 10: Chapter 1 - Fundamental of Optimization


Hessian – matrix of second derivatives of a function of several variables; its definiteness indicates a minimum (positive definite) or a maximum (negative definite).

Second-order condition (SOC) for a minimum:
• Eigenvalues of H(X*) are all positive
• Determinants of all lower-order minors of H(X*) are positive

$$H = \begin{bmatrix} \dfrac{\partial^2 f}{\partial x^2} & \dfrac{\partial^2 f}{\partial x \, \partial y} \\[2mm] \dfrac{\partial^2 f}{\partial y \, \partial x} & \dfrac{\partial^2 f}{\partial y^2} \end{bmatrix}$$

Slope – first-order condition (FOC) – provides the function's slope information:

$$\nabla f(X^*) = 0$$
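
The FOC and SOC can be verified symbolically; a sketch assuming the Symbolic Math Toolbox, using the Example 6.1 function that appears later in these slides:

syms x1 x2
f = 3 + (x1 - 1.5*x2)^2 + (x2 - 2)^2;
g = gradient(f, [x1, x2]);             % FOC: solve grad f = 0
crit = solve(g == 0, [x1, x2]);        % critical point (x1 = 3, x2 = 2)
H = hessian(f, [x1, x2]);              % Hessian for the SOC
eig(double(subs(H, [x1, x2], [crit.x1, crit.x2])))   % both > 0, so a minimum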

Mathematical Background (Cont.)

Page 11: Chapter 1 - Fundamental of Optimization


Deterministic – specific rules are used to move from one iteration to the next (gradient, Hessian).

Stochastic – probabilistic rules are used for subsequent iterations.

Optimal Design – engineering design based on an optimization algorithm.

Lagrangian method – the sum of the objective function and a linear combination of the constraints.
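
For constraints g_i(x) and multipliers λ_i, this is the familiar form:

$$\mathcal{L}(x, \lambda) = f(x) + \sum_i \lambda_i \, g_i(x)$$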

Optimization Algorithm

Page 12: Chapter 1 - Fundamental of Optimization


Multivariable techniques (make use of single-variable techniques, especially Golden Section).

Deterministic
• Direct search – uses objective function values to locate the minimum.
• Gradient based – uses first or second derivatives of the objective function.
• For a maximization problem, the minimization objective is used with a negative sign, i.e., minimize -f(x).

Single variable
• Newton-Raphson – gradient-based technique (FOC).
• Golden Search – step-size-reducing iterative method (a minimal sketch follows this list).
• Unconstrained optimization
a.) Powell Method – fits a quadratic (degree-2) polynomial to the objective function; non-gradient based.
b.) Gradient based – Steepest Descent (FOC) or Least Mean Squares (LMS).
c.) Hessian based – Conjugate Gradient (FOC) and BFGS (SOC).
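
A minimal sketch of that golden-section idea (the function name and interface here are illustrative, not a toolbox routine):

function xmin = goldsect(f, xl, xu, tol)
% GOLDSECT shrinks the bracket [xl, xu] by the golden ratio each pass.
phi = (sqrt(5) - 1)/2;                 % golden ratio factor, ~0.618
x1 = xu - phi*(xu - xl);
x2 = xl + phi*(xu - xl);
while (xu - xl) > tol
    if f(x1) < f(x2)                   % minimum lies in [xl, x2]
        xu = x2; x2 = x1;
        x1 = xu - phi*(xu - xl);
    else                               % minimum lies in [x1, xu]
        xl = x1; x1 = x2;
        x2 = xl + phi*(xu - xl);
    end
end
xmin = (xl + xu)/2;

For example, goldsect(@(x) (x - 2)^2, 0, 5, 1e-6) should return a value close to 2.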

Optimization Methods

Page 13: Chapter 1 - Fundamental of Optimization


• Constrained optimization
a.) Indirect approach – transforms the problem into an unconstrained one.
b.) Exterior Penalty Function (EPF) and Augmented Lagrange Multiplier methods.
c.) Direct methods – Sequential Linear Programming (SLP), Sequential Quadratic Programming (SQP) and the Generalized Reduced Gradient (GRG) method.

Figure 10: Gradient descent (LMS).
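
A sketch of a direct constrained solve, assuming the Optimization Toolbox's fmincon and a small illustrative problem (not from the slides):

obj = @(x) (x(1) - 2)^2 + (x(2) - 1)^2;   % minimize squared distance to (2, 1)
A = [1 1]; b = 2;                          % subject to x1 + x2 <= 2
x0 = [0 0];
options = optimset('LargeScale', 'off', 'Display', 'iter');
[x, fval] = fmincon(obj, x0, A, b, [], [], [], [], [], options)

The unconstrained optimum (2, 1) is infeasible here, so the solver should land on the constraint boundary near x = [1.5 0.5].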

Optimization Methods - Constrained

Page 14: Chapter 1 - Fundamental of Optimization


Global optimization – stochastic techniques

• Simulated Annealing (SA) – based on the minimum-energy principle of a cooling metal's crystalline structure.

• Genetic Algorithm (GA) – survival-of-the-fittest principle based on evolutionary theory.

Optimization Methods (Cont.)

Page 15: Chapter 1 - Fundamental of Optimization


Multivariable gradient-based optimization

J is the cost function to be minimized in two dimensions. The contours of the J paraboloid shrink as J decreases.

function retval = Example6_1(x)
% example 6.1
retval = 3 + (x(1) - 1.5*x(2))^2 + (x(2) - 2)^2;

>> SteepestDescent('Example6_1', [0.5 0.5], 20, 0.0001, 0, 1, 20)

where
[0.5 0.5]  - initial guess
20         - number of iterations
0.0001     - golden search tolerance
0          - initial step size
1          - step interval
20         - scanning steps

ans =
    2.7585    1.8960

(The analytic minimum is at x = [3 2], where both squared terms vanish and f = 3; twenty steepest-descent iterations approach it.)

Figure 11: Multivariable gradient-based optimization.

Figure 12: Steepest descent.

Optimization Methods (Examples)

Page 16: Chapter 1 - Fundamental of Optimization


Numerical Optimization

Newton-Raphson Method
1. Root solver – system of nonlinear equations (MATLAB Optimization Toolbox – fsolve)

$$x_{i+1} = x_i - \frac{f(x_i)}{f'(x_i)}$$

2. One-dimensional solver (MATLAB Optimization Toolbox)

$$x_{i+1} = x_i - \frac{f'(x_i)}{f''(x_i)}$$
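
A sketch of that one-dimensional update in MATLAB, applied to an assumed test function f(x) = x^2 - 4 sin(x) (illustrative only):

fp  = @(x) 2*x - 4*cos(x);      % f'(x)
fpp = @(x) 2 + 4*sin(x);        % f''(x)
x = 1;                           % initial guess
for i = 1:20
    xnew = x - fp(x)/fpp(x);    % Newton step on the gradient
    if abs(xnew - x) < 1e-8, break; end
    x = xnew;
end
x                                % converges near 1.03, where f' = 0 and f'' > 0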

Steepest Gradient Ascent/Descent Methods

$$x_{i+1} = x_i + \lambda_i d_i, \qquad f(x_{i+1}) = f(x_i + \lambda_i d_i)$$

∇f(x_i)·d_i is the magnitude along the descent direction.

d_i = -∇f(x_i) achieves the steepest descent; d_i = +∇f(x_i) achieves the steepest ascent.

Page 17: Chapter 1 - Fundamental of Optimization


Steepest Gradient

Solve the following for two steps of steepest ascent:

$$f(x, y) = 2.25xy + 1.75y - 1.5x^2 - 2y^2$$

The partial derivatives can be evaluated at the initial guesses, x = 1 and y = 1:

$$\frac{\partial f}{\partial x} = -3x + 2.25y = -3(1) + 2.25(1) = -0.75$$

$$\frac{\partial f}{\partial y} = 2.25x - 4y + 1.75 = 2.25(1) - 4(1) + 1.75 = 0$$

Therefore, the search direction is -0.75i, and

$$f(1 - 0.75h, 1) = 0.5 + 0.5625h - 0.84375h^2$$

This can be differentiated, set equal to zero, and solved for h* = 0.33333. Therefore, the result for the first iteration is x = 1 - 0.75(0.33333) = 0.75 and y = 1 + 0(0.33333) = 1.

For the second iteration, the partial derivatives are evaluated at (0.75, 1):

$$\frac{\partial f}{\partial x} = -3(0.75) + 2.25(1) = 0$$

$$\frac{\partial f}{\partial y} = 2.25(0.75) - 4(1) + 1.75 = -0.5625$$

Therefore, the search direction is -0.5625j, and

$$f(0.75, 1 - 0.5625h) = 0.59375 + 0.31640625h - 0.6328125h^2$$

This can be differentiated, set equal to zero, and solved for h* = 0.25. Therefore, the result for the second iteration is x = 0.75 + 0(0.25) = 0.75 and y = 1 + (-0.5625)(0.25) = 0.859375.

[Figure: contour plot of f(x, y) on [0, 1.2] x [0, 1.2], showing the steepest-ascent steps climbing toward the labeled maximum.]

Page 18: Chapter 1 - Fundamental of Optimization


%chapra14.5 ... Contd
clear
clc
clf
ww1 = 0:0.01:1.2;
ww2 = ww1;
[w1, w2] = meshgrid(ww1, ww2);
J = -1.5*w1.^2 + 2.25*w2.*w1 - 2*w2.^2 + 1.75*w2;
cs = contour(w1, w2, J, 70);
%clabel(cs);
hold
grid
w1 = 1; w2 = 1; h = 0;
for i = 1:10
    syms h
    dfw1 = -3*w1(i) + 2.25*w2(i);            % partial df/dw1
    dfw2 = 2.25*w1(i) - 4*w2(i) + 1.75;      % partial df/dw2
    fw1 = -1.5*(w1(i)+dfw1*h).^2 + 2.25*(w2(i)+dfw2*h).*(w1(i)+ ...
        dfw1*h) - 2*(w2(i)+dfw2*h).^2 + 1.75*(w2(i)+dfw2*h);
    J = -1.5*w1(i)^2 + 2.25*w2(i)*w1(i) - 2*w2(i)^2 + 1.75*w2(i)
    g = solve(fw1);        % roots of the quadratic in h;
    h = sum(g)/2;          % their midpoint is the 1-D optimum
    w1(i+1) = w1(i) + dfw1*h;
    w2(i+1) = w2(i) + dfw2*h;
    plot(w1, w2)
    pause(0.05)
end

MATLAB OPTIMIZATION TOOLBOX 

function J = chaprafun(x)
w1 = x(1); w2 = x(2);
J = -(-1.5*w1^2 + 2.25*w2*w1 - 2*w2^2 + 1.75*w2);   % negated, to maximize via fminunc

%startchapra.m
clc
clear
x0 = [1 1];
options = optimset('LargeScale','off','Display','iter','MaxIter', ...
    20,'MaxFunEvals',100,'TolX',1e-3,'TolFun',1e-3);
[x, fval] = fminunc(@chaprafun, x0, options)

Page 19: Chapter 1 - Fundamental of Optimization


Newton-Raphson – Four-Bar Mechanism

Sine and cosine angle components – all angles are referenced from the global x-axis.
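
The loop-closure equations referred to here (they match the residuals coded in fouropt.m later in this chapter) are:

$$r_2\cos\theta_2 + r_3\cos\theta_3 - r_1\cos\theta_1 - r_4\cos\theta_4 = 0$$

$$r_2\sin\theta_2 + r_3\sin\theta_3 - r_1\sin\theta_1 - r_4\sin\theta_4 = 0$$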

In the above equations θ1 = 0, as it lies along the x-axis; the other three angles are time-varying.

The 1st time derivative gives the angular velocities.

Page 20: Chapter 1 - Fundamental of Optimization


• If the input is applied to link 2 (a DC motor), then ω2 is the input to the system.

In matrix form:
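
The original slide shows this matrix as an image; a reconstruction, assuming it follows from differentiating the loop-closure equations above with θ1 fixed:

$$\begin{bmatrix} -r_3\sin\theta_3 & r_4\sin\theta_4 \\ r_3\cos\theta_3 & -r_4\cos\theta_4 \end{bmatrix} \begin{bmatrix} \omega_3 \\ \omega_4 \end{bmatrix} = \begin{bmatrix} r_2\sin\theta_2 \\ -r_2\cos\theta_2 \end{bmatrix} \omega_2$$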

2. Numerical Solution of Nonlinear Algebraic Equations

$$\bar{x}_{i+1} = \bar{x}_i - \frac{f(\bar{x}_i)}{f'(\bar{x}_i)}$$

Page 21: Chapter 1 - Fundamental of Optimization


$$\bar{q}_{i+1} = \bar{q}_i - \frac{f(\bar{q}_i)}{f'(\bar{q}_i)}$$

Page 22: Chapter 1 - Fundamental of Optimization


$$\bar{\theta}_{i+1} = \bar{\theta}_i - \frac{f(\bar{\theta}_i)}{f'(\bar{\theta}_i)}$$

Newton-Raphson

%fouropt.m
function f = fouropt(x)
the = 0;
r1 = 12; r2 = 4; r3 = 10; r4 = 7;
f = -[r2*cos(the) + r3*cos(x(1)) - r1*cos(0) - r4*cos(x(2));
     r2*sin(the) + r3*sin(x(1)) - r1*sin(0) - r4*sin(x(2))];

%startfouropt.m
clc
clear
x0 = [0.1 0.1];
options = optimset('LargeScale','off','Display','iter','MaxIter', ...
    200,'MaxFunEvals',100,'TolX',1e-8,'TolFun',1e-8);
[x, fval] = fsolve(@fouropt, x0, options);
theta3 = x(1)*57.3    % 57.3 ~ 180/pi, radians to degrees
theta4 = x(2)*57.3

Related files: Foursimmechm.m, foursimmech.mdl and possol4.m

Page 23: Chapter 1 - Fundamental of Optimization


References:

1) Steven C. Chapra and Raymond P. Canale, Numerical Methods for Engineers, McGraw-Hill, Singapore, 2006.

2) Kalyanmoy Deb, Optimization for Engineering Design, Prentice Hall, New Delhi, 1996.

3) Optimization Toolbox for Use with MATLAB, User's Guide Ver. 3, MathWorks, Natick, MA, USA, 2006.