
  • Optimization with COMSOL Multiphysics, COMSOL Tokyo Conference 2014

    Walter Frei, PhD, Applications Engineer

  • Product Suite COMSOL 5.0

  • Agenda

    An introduction to optimization: a lot of concepts, and a little bit of math
    What do these options mean?

    Demo

    Overview of Examples

  • A quick conceptual introduction, and some terminology

    Constrained design variables γ: dimensions, material properties, operating conditions, etc.
    The "black box" finite element model K(γ)u = b(γ) takes these and returns the solution u(γ).
    From the solution we evaluate the objective f(u(γ)) and the constraints g(u(γ)): performance, failure criteria, etc.
    Optimization is the loop that adjusts the design variables based on these outputs.

  • More formally, optimization is

        min over γ of   f(u(γ))           f: the objective function
        such that:      γ_L ≤ γ ≤ γ_U     simple bounds on the design variables
                        p(γ) ≤ 0          pointwise constraints on the design variables
                        g(u(γ)) = 0       general equality constraints
                        h(u(γ)) ≤ 0       general inequality constraints
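    The same general form can be sketched with a generic third-party optimizer; the objective, bounds, and inequality constraint below are made-up placeholders rather than anything from the slides:

        from scipy.optimize import minimize

        # Placeholder objective f(gamma); in practice this would evaluate the FE model
        def f(gamma):
            return (gamma[0] - 1.0)**2 + (gamma[1] - 2.0)**2

        bounds = [(0.0, 3.0), (0.0, 3.0)]       # simple bounds gamma_L <= gamma <= gamma_U

        # One general inequality constraint written as h(gamma) <= 0;
        # SLSQP expects fun(gamma) >= 0, so the sign is flipped here
        constraints = [{'type': 'ineq', 'fun': lambda gamma: -(gamma[0] + gamma[1] - 2.5)}]

        res = minimize(f, x0=[0.5, 0.5], bounds=bounds,
                       constraints=constraints, method='SLSQP')
        print(res.x, res.fun)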

  • The design variables:

    (Figure: the design space in the (γ1, γ2) plane)
    Upper and lower bounds: γ1,L ≤ γ1 ≤ γ1,U and γ2,L ≤ γ2 ≤ γ2,U
    Pointwise constraints: p(γ) ≤ 0

  • The design space must be continuous

    (Figure: a design space in the (γ1, γ2) plane split into two disconnected regions, Design Space 1 and Design Space 2)
    Break this up into two separate optimization problems

  • Why no equality constraints p(γ) = 0 on the design variables?

    (Figure: the curve p(γ) = 0 inside the bounded (γ1, γ2) design space)
    An equality constraint is equivalent to a different optimization problem with one less design variable

  • Introduce a new design variable instead

    (Figure: the same curve parameterized by a new variable A, with bounds A,L ≤ A ≤ A,U)
    Both original design variables become functions of A: γ1 = f1(A) and γ2 = f2(A)
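    A minimal sketch of the idea, with an invented objective and an invented equality constraint γ1 + γ2 = 2 that is absorbed by the parameterization:

        from scipy.optimize import minimize_scalar

        def f(g1, g2):                       # placeholder objective f(gamma_1, gamma_2)
            return (g1 - 0.3)**2 + (g2 - 1.2)**2

        # gamma_1 = A and gamma_2 = 2 - A satisfy the equality constraint by construction,
        # so the problem reduces to a bounded search over A alone
        res = minimize_scalar(lambda A: f(A, 2.0 - A), bounds=(0.0, 2.0), method='bounded')
        print(res.x, 2.0 - res.x)            # the recovered gamma_1 and gamma_2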

  • The design space must be continuous in the real number space

    (Figure: a grid of discrete candidate designs in the (γ1, γ2) plane)
    Optimizing over a set of discrete values is an integer programming problem
    LiveLink for MATLAB and LiveLink for Excel can be used to interface to third-party optimizers

  • It is helpful if the design space is convex

    A non-convex design space is usually more difficult; in a convex design space, every design point can "see" every other design point

  • Now let's look at the objective function: f(u(γ)) or f(γ)

    (Figure: the objective function over the (γ1, γ2) design space)

  • We are always starting somewhere: the initial design, which we want to improve

    (Figure: the objective function f over the (γ1, γ2) design space, with the initial design marked)
    Tip: always start optimizing from a feasible design

  • Let's first assume a smooth and differentiable objective function with a single minimum

    (Figure: the objective function f over the (γ1, γ2) design space)
    1) Find the gradient
    2) Search along that line
    3) Find the minimum
    4) Repeat

  • Start from a point, find the direction of steepest descent (the negative of the gradient), and search along that direction for a minimum

    Repeat

    Once the gradient is zero, or the boundary of the design space is reached, stop

    Repeat until converged
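    A minimal sketch of steps 1 through 4 on a made-up smooth objective, using steepest descent with a backtracking line search:

        import numpy as np

        def f(x):                                  # placeholder smooth objective
            return x[0]**2 + 10.0*x[1]**2

        def grad_f(x):                             # its gradient
            return np.array([2.0*x[0], 20.0*x[1]])

        x = np.array([3.0, 1.0])                   # initial (feasible) design
        for _ in range(200):
            g = grad_f(x)                          # 1) find the gradient
            if np.linalg.norm(g) < 1e-8:           # gradient ~ 0: stop
                break
            t = 1.0
            while f(x - t*g) > f(x) - 1e-4*t*g.dot(g):
                t *= 0.5                           # 2)-3) search along -g for a sufficient decrease
            x = x - t*g                            # 4) move there and repeat
        print(x, f(x))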

  • Adjoint method is used to compute derivatives

    Finite element equations:                 K(γ) u(γ) - b(γ) = 0
    Differentiate w.r.t. γ:                   d/dγ [ K(γ) u(γ) - b(γ) ] = 0
    Expand:                                   (∂K/∂γ) u + K (∂u/∂γ) - ∂b/∂γ = 0
    Re-arrange:                               ∂u/∂γ = K⁻¹ ( ∂b/∂γ - (∂K/∂γ) u )
    Assuming f is differentiable w.r.t. u:    df/dγ = (∂f/∂u) (∂u/∂γ)

    Computing this derivative only doubles the computational requirements, regardless of how many design variables there are
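    A small numerical sketch of the same bookkeeping; the matrices and the objective f = 0.5 uᵀu are arbitrary choices, not from the slides. One extra solve with Kᵀ gives df/dγ, whatever the number of design variables:

        import numpy as np

        def K(g):  return np.array([[2.0 + g, 1.0], [1.0, 3.0]])   # stiffness-like matrix K(gamma)
        def b(g):  return np.array([1.0, 2.0*g])                   # load vector b(gamma)
        dK = np.array([[1.0, 0.0], [0.0, 0.0]])                    # dK/dgamma
        db = np.array([0.0, 2.0])                                  # db/dgamma

        gamma = 0.7
        u = np.linalg.solve(K(gamma), b(gamma))          # solve the finite element equations
        dfdu = u                                         # f = 0.5*u.u, so df/du = u
        lam = np.linalg.solve(K(gamma).T, dfdu)          # the one extra (adjoint) solve
        dfdg = lam @ (db - dK @ u)                       # df/dgamma = lam^T (db/dgamma - dK/dgamma u)

        eps = 1e-6                                       # finite-difference check
        u2 = np.linalg.solve(K(gamma + eps), b(gamma + eps))
        print(dfdg, (0.5*u2 @ u2 - 0.5*u @ u) / eps)     # the two values should agree closely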

  • If we hit a constraint, follow it

    Start from a point, find the direction of steepest descent (the negative of the gradient), and search along that direction for a minimum

    Repeat

    Once the gradient is zero, or the boundary of the design space is reached, stop

  • What if we have multiple minima?

    Depends on where you start!

    You are never guaranteed to find the global minimum, but you can find a local minimum

  • What if the objective function is not smooth or differentiable?

    Approximate its shape by evaluating the objective function repeatedly
    For example, Nelder-Mead evaluates n+1 points when optimizing n design variables

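    For illustration, a Nelder-Mead solver from a generic optimization library copes with a slightly noisy, non-smooth objective (the function below is made up):

        import numpy as np
        from scipy.optimize import minimize

        def noisy_f(x):   # a smooth bowl plus a small non-smooth term
            return (x[0] - 1.0)**2 + (x[1] + 0.5)**2 + 1e-3*abs(np.sin(50.0*x[0]))

        res = minimize(noisy_f, x0=[0.0, 0.0], method='Nelder-Mead')
        print(res.x)      # close to (1, -0.5) despite the non-smooth term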

  • What if we want to include general equality and inequality constraints?

    (Figures: the (γ1, γ2) design space with an equality constraint g(u(γ)) = 0 and with an inequality constraint h(u(γ)) ≤ 0)
    Strong dependence on initial conditions; highly constrained design space

  • Summary

    Design variables must be continuous and real-valued
    Design space:
      Simple (Cartesian) bounds
      Pointwise inequality constraints
      If you want to set up an equality constraint, get rid of one design variable
      A convex design space is better
    Objective function:
      If it is smooth and differentiable, you can use the adjoint method and a gradient-based optimization technique
      If it is non-smooth or non-differentiable, use the gradient-free approach
    General equality and inequality constraints:
      Equality constraints can severely complicate the optimization problem
      Inequality constraints can make the design space non-continuous

  • Demo: A bracket with a hole

    (Figure: bracket geometry, with the loaded boundary and the fixed boundary indicated)

  • First, minimize the mass by changing the hole radius R

    The radius must be greater than zero, and not so large as to cut the bracket in half

  • Next, add a constraint on the maximum stress within the part

    But the location of the peak stress is not known, so we use a maximum coupling operator and a constraint: σ < σ_max

  • What about moving & resizing the hole?

  • Let's look at the constraints...

    How can we express these mathematically?

  • Add one more design variable

    (Figure: the hole radius R and the new hole-position variable A)

  • Let's take these constraints a few at a time

    (Figure: the hole of radius R at position A within the bracket)
    With a bit of (behind the scenes) trigonometry: B = (1-0.25*A)/(1+sqrt(4.25)/2)
    Which leads to the constraint: B-R > 0
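    A quick numerical check of the derived bound; the sample values of A and R are arbitrary:

        from math import sqrt

        A = 0.5                                    # sample value of the position variable
        R = 0.3                                    # sample hole radius
        B = (1 - 0.25*A) / (1 + sqrt(4.25)/2)      # the bound derived above
        print(B, B - R > 0)                        # the constraint B - R > 0 holds for these values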

  • What about the other limits?

    (Figure: the hole of radius R at position A)
    A-R > 0.2

  • Sometimes we can just ignore a constraint

    Based upon the simulations so far, it's likely this constraint will never be an issue

  • The available optimization solvers in the Optimization Module

    Gradient-free methods: Monte Carlo, Coordinate Search, Nelder-Mead, BOBYQA, COBYLA
    Gradient-based methods: MMA, Levenberg-Marquardt, SNOPT

  • When to use gradient-free methods?

    Non-differentiable objective function and/or constraints
    Few design variables: optimization time increases exponentially with the number of variables; aim for fewer than 10 design variables
    Whenever re-meshing will occur: re-meshing results in a non-smooth objective function

  • The gradient-free solvers

    COBYLA: similar to BOBYQA, but uses a linear approximation; can consider constraints
    BOBYQA: constructs a quadratic approximant to the objective function; probably the fastest, but needs a reasonably smooth objective function
    Nelder-Mead: constructs a simplex and improves the worst point; probably the best if the objective function is relatively noisy; can consider constraints
    Coordinate Search: searches along one design variable at a time; estimates the gradient along that line, moves on to the next variable, and repeats
    Monte Carlo: random choices of design variables are evaluated; only a very dense statistical sampling can find the global optimum
    (Roughly ordered from faster to slower; the slower methods are usually the most robust)

  • When to use gradient-based methods?

    Differentiable objective function and/or constraints
    Many design variables: optimization speed does not depend strongly on the number of variables; 100,000+ design variables are not unreasonable
    Topology optimization

  • The gradient-based solvers

    SNOPT: Sequential Quadratic Programming algorithm
    MMA: linear convergence rate near the optimum; popular in the topology optimization community
    Levenberg-Marquardt: only for unconstrained least-squares minimization problems; very fast
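    Outside COMSOL, the same kind of unconstrained least-squares fit can be sketched with a generic Levenberg-Marquardt implementation; the decaying-exponential model and the synthetic data are assumptions for illustration:

        import numpy as np
        from scipy.optimize import least_squares

        t = np.linspace(0.0, 5.0, 50)
        y_meas = 2.0*np.exp(-0.8*t) + 0.01*np.random.default_rng(1).normal(size=t.size)

        def residuals(p):                     # p = [amplitude, decay rate]
            return p[0]*np.exp(-p[1]*t) - y_meas

        res = least_squares(residuals, x0=[1.0, 1.0], method='lm')   # unconstrained LM fit
        print(res.x)                          # roughly [2.0, 0.8]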

  • Scaling and Tolerances

    Specify scales for all control variables:
      in the Optimization study step for global parameters
      in Optimization interface features for fields
    All solvers work with rescaled variables; solver tolerances are relative to these scales
    Keep objectives and constraints close to 1; solvers may use the scaled gradient for termination

  • Comparison of Algorithms

    Objective function:
      Gradient-free: any scalar output
      Gradient-based: must be both smooth and differentiable
    Design variables:
      Gradient-free: anything, including geometric dimensions
      Gradient-based: anything, but cannot result in remeshing of the geometry
    Allows remeshing:
      Gradient-free: yes
      Gradient-based: no
    Constraints:
      Gradient-free: can only constrain scalar outputs
      Gradient-based: constraints must be smooth and differentiable, but can be applied at each point in space
    Relative performance:
      Gradient-free: cost increases exponentially with the number of design variables
      Gradient-based: performance is not very sensitive to the number of design variables

  • So what else can you do?

    Parameter Estimation & Curve Fitting

    Shape & Dimension

    Topology

  • Structural Sizing

    Optimization of joint positions in a truck-mounted crane
    Reduces the force on the boom lift cylinder over a range of operating conditions
    Uses the Multibody Dynamics Module

  • Multi-study Structural Sizing

    Weight minimization of a mounting bracket
    Multi-study constraints:
      Maximum stress under static load
      Lowest eigenfrequency

  • Estimating the material properties based upon experimental data

    http://www.comsol.com/model/transient-optimization-fitting-material-properties-of-a-wall-10905

  • Examp