Concepts and Applications
Engineering Optimization
  • Fred van Keulen
  • Matthijs Langelaar
  • CLA H21.1
  • A.vanKeulen@tudelft.nl
Summary single variable methods
  • Bracketing +
    • 0th order:
      • Dichotomous sectioning
      • Fibonacci sectioning
      • Golden ratio sectioning (see the sketch after this list)
      • Quadratic interpolation
      • Cubic interpolation
    • 1st order:
      • Bisection method
      • Secant method
    • 2nd order:
      • Newton method
  • And many, many more!
  • In practice: additional “tricks” are needed to deal with:
    • Multimodality
    • Strong fluctuations
    • Round-off errors
    • Divergence
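Of these, golden ratio sectioning is the easiest to sketch. Below is a minimal Python version, a sketch rather than a definitive implementation; the test function, bracket, and tolerance in the example are arbitrary choices.

```python
import math

def golden_section(f, a, b, tol=1e-8):
    """Minimize a unimodal f on the bracket [a, b] by golden ratio
    sectioning (0th order: uses function values only)."""
    invphi = (math.sqrt(5) - 1) / 2   # 1/phi, approx. 0.618
    x1 = b - invphi * (b - a)
    x2 = a + invphi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:                   # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - invphi * (b - a)
            f1 = f(x1)
        else:                         # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + invphi * (b - a)
            f2 = f(x2)
    return (a + b) / 2

# Example: minimum of (x - 2)^2 on [0, 5]
print(golden_section(lambda x: (x - 2)**2, 0.0, 5.0))  # ~2.0
```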
Unconstrained optimization algorithms
  • Single-variable methods
  • Multiple variable methods
    • 0th order (direct search methods)
    • 1st order (descent methods)
    • 2nd order (descent methods)

Test functions
  • Comparison of performance of algorithms:
    • Mathematical convergence proofs
    • Performance on benchmark problems (test functions)
  • Examples of test functions:
    • Rosenbrock’s function (“banana function”): f(x1, x2) = 100(x2 − x1²)² + (1 − x1)², with optimum (1, 1) (restated in the sketch below)
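The formulas themselves appeared graphically on the slides; the sketch below restates them in Python. Rosenbrock’s function is the standard form; the quadratic function (from the next slide) is a hypothetical stand-in, chosen only so that its minimum lies at (1, 3).

```python
def rosenbrock(x1, x2):
    """Rosenbrock's "banana" function; minimum f = 0 at (1, 1)."""
    return 100.0 * (x2 - x1**2)**2 + (1.0 - x1)**2

def quadratic(x1, x2):
    """A quadratic test function; coefficients are assumed here,
    chosen only so the minimum sits at (1, 3)."""
    return (x1 - 1.0)**2 + (x2 - 3.0)**2
```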
Test functions (2)
  • Quadratic function: optimum (1, 3)
  • A function with many local optima: optimum (0, 0)
Random methods
  • Random jumping method (random search):
    • Generate random points, keep the best
  • Random walk method:
    • Generate random unit direction vectors
    • Walk to the new point if it is better
    • Decrease the stepsize after N steps
  • (a sketch of both methods follows below)
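A minimal sketch of both methods, assuming a uniform sampling box for random jumping and a geometric step-size reduction for the random walk (both of these schedules are assumptions, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_jumping(f, lower, upper, n_points=10_000):
    """Random jumping (random search): sample points uniformly, keep the best."""
    best_x, best_f = None, np.inf
    for _ in range(n_points):
        x = rng.uniform(lower, upper)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

def random_walk(f, x, step=1.0, n_cycles=20, steps_per_cycle=100, shrink=0.5):
    """Random walk: step along random unit directions, keep improvements,
    and decrease the step size after every N (= steps_per_cycle) steps."""
    fx = f(x)
    for _ in range(n_cycles):
        for _ in range(steps_per_cycle):
            d = rng.standard_normal(x.size)
            d /= np.linalg.norm(d)        # random unit direction vector
            y = x + step * d
            fy = f(y)
            if fy < fx:                   # walk to the new point if better
                x, fx = y, fy
        step *= shrink                    # decrease stepsize after N steps
    return x, fx
```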
Simulated annealing (Metropolis algorithm)
  • Random method inspired by a natural process: annealing
    • Heating of metal/glass to relieve stresses
    • Controlled cooling to a state of stable equilibrium with minimal internal stresses
    • Probability of an internal energy change ΔE: Boltzmann’s probability distribution function, P(ΔE) ∝ e^(−ΔE/(kT))
    • Note: some chance of an energy increase exists!
  • S.A. is based on this probability concept
Simulated annealing algorithm
  1. Set a starting “temperature” T, pick a starting design x, and obtain f(x)
  2. Randomly generate a new design y close to x
  3. Obtain f(y). Accept the new design if it is better. If it is worse, generate a random number r between 0 and 1, and accept the new design when r < e^(−(f(y) − f(x))/T) (the Metropolis acceptance criterion). Note: this keeps some chance of accepting a worse design
  4. Stop if the design has not changed in several steps. Otherwise, update the temperature (e.g. T ← cT with cooling factor c < 1) and repeat from step 2
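A minimal sketch of this loop in Python; the cooling factor, neighborhood size, and loop counts are assumed tuning values, not taken from the slides:

```python
import numpy as np

def simulated_annealing(f, x, T=1.0, cooling=0.9, step=0.5,
                        n_inner=100, n_stall=5,
                        rng=np.random.default_rng(0)):
    """Sketch of the simulated annealing loop above."""
    fx = f(x)
    stall = 0
    while stall < n_stall:
        accepted = 0
        for _ in range(n_inner):
            y = x + step * rng.standard_normal(x.size)   # new design close to x
            fy = f(y)
            # Accept better designs; accept worse ones with Metropolis probability
            if fy < fx or rng.uniform() < np.exp(-(fy - fx) / T):
                x, fx = y, fy
                accepted += 1
        stall = 0 if accepted else stall + 1   # stop if design no longer changes
        T *= cooling                           # update the temperature
    return x, fx
```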
Simulated annealing (3)
  • As the temperature reduces, the probability of accepting a bad step reduces as well: for a bad step the exponent −(f(y) − f(x))/T is negative, and it becomes increasingly negative as T reduces
  • Accepting bad steps (energy increase) is likely in the initial phase, but less likely at the end
  • Temperature zero: the basic random jumping method
  • Variants: several steps before the test, cooling schemes, …
Random methods properties
  • Very robust: work also for discontinuous / nondifferentiable functions
  • Can find global minimum
  • Last resort: when all else fails
  • S.A. known to perform well on several hard problems (traveling salesman)
  • Quite inefficient, but can be used in initial stage to determine promising starting point
  • Drawback: results not repeatable
Cyclic coordinate search
  • Search alternatingly in each coordinate direction
  • Perform a single-variable optimization along each direction (line search); see the sketch below
  • Directions fixed: can lead to slow convergence
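A sketch of the method, using SciPy’s minimize_scalar as the single-variable line search; the convergence test on cycle progress is an implementation choice:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def cyclic_coordinate_search(f, x, n_cycles=100, tol=1e-8):
    """Line searches along each coordinate direction in turn."""
    n = x.size
    for _ in range(n_cycles):
        x_old = x.copy()
        for i in range(n):
            e = np.zeros(n)
            e[i] = 1.0                                     # coordinate direction
            a = minimize_scalar(lambda t: f(x + t * e)).x  # 1-D line search
            x = x + a * e
        if np.linalg.norm(x - x_old) < tol:                # cycle made no progress
            break
    return x
```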
Powell’s Conjugate Directions method
  • Adjusting the search directions improves convergence
  • Idea: replace the first direction with the combined direction of a cycle: the direction set for cycle i + 1 replaces its first member with the sum of the steps taken in cycle i (sketched below)
  • Guaranteed to converge in n cycles for quadratic functions! (theoretically)
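A sketch of the basic cycle, again with SciPy’s minimize_scalar as the line search; the convergence test and direction normalization are implementation choices, not from the slides:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def powell(f, x, n_cycles=50, tol=1e-8):
    """Each cycle: line search along every direction, then replace the
    first direction by the combined (summed) step of the cycle."""
    n = x.size
    directions = list(np.eye(n))             # start from coordinate directions
    for _ in range(n_cycles):
        x_start = x.copy()
        for d in directions:
            a = minimize_scalar(lambda t: f(x + t * d)).x
            x = x + a * d
        d_new = x - x_start                   # combined direction of this cycle
        if np.linalg.norm(d_new) < tol:
            break
        directions = directions[1:] + [d_new / np.linalg.norm(d_new)]
    return x
```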
Nelder and Mead Simplex method
  • Simplex: figure of n + 1 points in Rn
  • Gradually move toward the minimum by reflection: the worst vertex is reflected through the remaining points (figure: simplex with vertex values f = 10, f = 7, f = 5, reflecting the worst vertex f = 10)
  • For better performance: expansion/contraction and other tricks (see the usage example below)
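Rather than re-implementing reflection, expansion, and contraction, the method can be tried directly through SciPy’s built-in Nelder-Mead, here on Rosenbrock’s test function from earlier:

```python
import numpy as np
from scipy.optimize import minimize

def rosen(x):
    """Rosenbrock's banana function; optimum at (1, 1)."""
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

res = minimize(rosen, x0=np.array([-1.2, 1.0]), method="Nelder-Mead")
print(res.x)  # close to (1, 1)
```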
Biologically inspired methods
  • Popular: inspiration for algorithms from biological processes:
    • Genetic algorithms / evolutionary optimization
    • Particle swarms / flocks
    • Ant colony methods
  • Typically make use of population (collection of designs)
  • Computationally intensive
  • Stochastic nature, global optimization properties
Genetic algorithms
  • Based on the evolution theory of Darwin: survival of the fittest
  • Objective = fitness function
  • Designs are encoded in chromosomal strings, ~ genes, e.g. binary strings:
    (figure: the bit string 1 1 0 1 0 0 1 0 1 1 0 0 1 0 1, split into segments encoding x1 and x2)
GA flowchart
  1. Create initial population
  2. Evaluate fitness of all individuals
  3. Test termination criteria: if satisfied, quit
  4. Select individuals for reproduction
  5. Create new population (reproduction, crossover, mutation)
  6. Return to step 2
GA population operators
  • Reproduction:
    • Exact copy/copies of individual
  • Crossover:
    • Randomly exchange genes of different parents
    • Many possibilities: how many genes, parents, children …
  • Mutation:
    • Randomly flip some bits of a gene string
    • Used sparingly, but important to explore new designs (both operators are sketched below)
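A sketch of these operators on binary strings; the crossover point is random and the mutation probability is an assumed value:

```python
import random

rng = random.Random(0)

def reproduce(parent):
    """Reproduction: an exact copy of the individual."""
    return list(parent)

def crossover(p1, p2):
    """Single-point crossover: children exchange tail segments."""
    cut = rng.randrange(1, len(p1))          # random crossover point
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

def mutate(bits, p_flip=0.05):
    """Mutation: flip each bit with a small probability (used sparingly)."""
    return [b ^ 1 if rng.random() < p_flip else b for b in bits]

# Parents as in the crossover example on the next slide
p1 = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 1]
p2 = [0, 1, 1, 0, 1, 0, 0, 1, 0, 1, 1, 0, 0, 0, 1]
child1, child2 = crossover(p1, p2)
```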
Population operators
  • Crossover (single-point; here after the fourth bit):

    Parent 1: 1 1 0 1 | 0 0 1 0 1 1 0 0 1 0 1
    Parent 2: 0 1 1 0 | 1 0 0 1 0 1 1 0 0 0 1
    Child 1:  1 1 0 1 | 1 0 0 1 0 1 1 0 0 0 1
    Child 2:  0 1 1 0 | 0 0 1 0 1 1 0 0 1 0 1

  • Mutation (one randomly chosen bit flipped):

    Before: 1 1 0 1 0 0 1 0 1 1 0 0 1 0 1
    After:  1 1 0 1 0 1 1 0 1 1 0 0 1 0 1
Particle swarms / flocks
  • No genes and reproduction, but a population that travels through the design space
  • Derived from simulations of flocks/schools in nature
  • Individuals tend to follow the individual with the best fitness value, but also determine their own path
  • Some randomness added to give exploration properties(“craziness parameter”)
PSO algorithm
  1. Initialize location and speed of individuals (random)
  2. Evaluate fitness
  3. Update best scores: individual (y) and overall (Y)
  4. Update velocity and position: v ← v + c1 r1 (y − x) + c2 r2 (Y − x), then x ← x + v, where r1 and r2 are random numbers between 0 and 1, and the coefficients c1, c2 control “individual behavior” vs “social behavior”

(a sketch follows below)
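A minimal sketch of these steps. The inertia weight w (added for stability) and the symbols c1, c2 are conventional PSO constants assumed here, not taken from the slides; the “craziness” term is omitted:

```python
import numpy as np

def pso(f, lower, upper, n_particles=30, n_iter=200,
        w=0.7, c1=1.5, c2=1.5, rng=np.random.default_rng(0)):
    """Sketch of the PSO loop; w (inertia), c1 ("individual") and
    c2 ("social") are assumed tuning constants."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    x = rng.uniform(lower, upper, size=(n_particles, lower.size))  # positions
    v = rng.uniform(-1.0, 1.0, size=x.shape)                       # speeds
    fx = np.apply_along_axis(f, 1, x)
    y, fy = x.copy(), fx.copy()          # individual best designs and scores
    Y = y[np.argmin(fy)].copy()          # overall best design
    for _ in range(n_iter):
        r1, r2 = rng.uniform(size=(2, n_particles, 1))  # random in [0, 1]
        v = w * v + c1 * r1 * (y - x) + c2 * r2 * (Y - x)
        x = x + v
        fx = np.apply_along_axis(f, 1, x)
        better = fx < fy
        y[better], fy[better] = x[better], fx[better]
        Y = y[np.argmin(fy)].copy()
    return Y, fy.min()
```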
Summary 0th order methods
  • Nelder-Mead beats Powell in most cases
  • Robust: most can deal with discontinuity etc.
  • Less attractive for many design variables (>10)
  • Stochastic techniques:
    • Computationally expensive, but
    • Global optimization properties
    • Versatile
  • Population-based algorithms benefit from parallel computing
Unconstrained optimization algorithms
  • Single-variable methods
  • Multiple variable methods
    • 0th order
    • 1st order
    • 2nd order
Steepest descent method
  • Move in the direction of largest decrease in f: by the Taylor approximation f(x + d) ≈ f(x) + ∇fᵀd, the best direction is d = −∇f
  • Example: (figure: contour plot over (x1, x2) showing steepest descent steps −∇f through points with f = 7.2, f = 1.9, f = 0.044)
  • With a fixed step size, divergence occurs! Remedy: line search (sketched below)
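A sketch of steepest descent with a line search for the step size; the bounded search interval is an arbitrary choice:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def steepest_descent(f, grad, x, n_iter=1000, tol=1e-8):
    """Steepest descent with a line search for the step size."""
    for _ in range(n_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g                                  # direction of largest decrease
        a = minimize_scalar(lambda t: f(x + t * d),
                            bounds=(0.0, 1e3), method="bounded").x
        x = x + a * d
    return x

# Example on Rosenbrock's function (gradient written out by hand)
rosen = lambda x: 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2
rosen_grad = lambda x: np.array([
    -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
    200.0 * (x[1] - x[0]**2)])
# Slowly approaches (1, 1): note the zig-zag behavior discussed next
print(steepest_descent(rosen, rosen_grad, np.array([-1.2, 1.0])))
```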

Steepest descent convergence
  • Zig-zag convergence behavior: with exact line searches, each search direction is orthogonal to the previous one, producing a zig-zag path on elongated contours
Effect of scaling
  • Scaling the variables helps a lot! (figure: elongated contours in the original coordinates (x1, x2) become nearly circular in the scaled coordinates (y1, y2))
  • Ideal scaling is hard to determine (requires Hessian information)

Fletcher-Reeves conjugate gradient method
  • Based on building a set of conjugate directions, combined with line searches
  • Conjugate directions: directions satisfying dᵢᵀA dⱼ = 0 for i ≠ j, with A the Hessian
  • Conjugate directions: guaranteed convergence in N steps for quadratic problems (recall Powell: N cycles of N line searches)
Fletcher-Reeves conjugate gradient method (2)
  • Set of N conjugate directions for a quadratic function f(x) = ½xᵀAx − bᵀx + c: dᵢᵀA dⱼ = 0 for i ≠ j (special case: orthogonal directions, the eigenvectors of A)
  • Property: searching along conjugate directions yields the optimum of a quadratic function in N steps (or less); optimality: ∇f = Ax − b = 0
Conjugate directions (2)
  • Optimization by line searches along conjugate directions will converge in N steps (or less); a derivation sketch for the quadratic case follows below
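A sketch of the standard argument behind this N-step property, assuming f(x) = ½xᵀAx − bᵀx with A symmetric positive definite (this derivation is reconstructed, not taken verbatim from the slides):

```latex
% Quadratic case: f(x) = \tfrac{1}{2} x^T A x - b^T x, optimum A x^* = b.
% Expand the offset to the optimum in the conjugate basis d_1, ..., d_N:
%   x^* - x_1 = \sum_{j=1}^{N} \alpha_j d_j .
% Multiply by d_i^T A; conjugacy (d_i^T A d_j = 0, i \neq j) isolates \alpha_i:
\alpha_i \;=\; \frac{d_i^T A (x^* - x_1)}{d_i^T A d_i}
         \;=\; \frac{d_i^T A (x^* - x_i)}{d_i^T A d_i}
         \;=\; -\,\frac{d_i^T \nabla f(x_i)}{d_i^T A d_i}
% (x_i - x_1 is a combination of d_1, ..., d_{i-1}, all conjugate to d_i,
%  and \nabla f(x_i) = A x_i - b = A (x_i - x^*)).
% This \alpha_i is exactly the step an exact line search along d_i takes,
% so after N line searches the optimum is reached: x_{N+1} = x^*.
```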
But … how to obtain conjugate directions?
  • How to generate conjugate directions with only gradient information?
  • Start with the steepest descent direction: d1 = −∇f1
  • A line search along d1 stops where the new gradient is orthogonal to the search direction: d1ᵀ∇f2 = 0 (figure: contours f = c and f = c + 1, with d1 and −∇f2)
Conjugate directions (3)
  • Condition for a conjugate direction: d2ᵀA d1 = 0
  • But, in general, A is unknown! Remedy: build the new direction from the gradients computed in the line searches, d2 = −∇f2 + β d1, and choose β such that the conjugacy condition holds
Why that last step?
  • By Fletcher-Reeves: starting from the Polak-Ribière version, β = ∇f2ᵀ(∇f2 − ∇f1) / (∇f1ᵀ∇f1)
  • Now use the orthogonality of successive gradients under exact line searches on a quadratic function: ∇f2ᵀ∇f1 = 0
  • But because of this, the numerator reduces to ∇f2ᵀ∇f2, which gives the Fletcher-Reeves form
Three CG variants
  • For general non-quadratic problems, three variants exist that are equivalent in the quadratic case:
    • Hestenes-Stiefel: β = ∇f2ᵀ(∇f2 − ∇f1) / (d1ᵀ(∇f2 − ∇f1))
    • Polak-Ribière: β = ∇f2ᵀ(∇f2 − ∇f1) / (∇f1ᵀ∇f1) (generally best in most cases)
    • Fletcher-Reeves: β = ∇f2ᵀ∇f2 / (∇f1ᵀ∇f1)
CG practical
  1. Start with an arbitrary x1
  2. Set the first search direction: d1 = −∇f1
  3. Line search to find the next point: x2 = x1 + α d1
  4. Next search direction: d2 = −∇f2 + β d1
  5. Repeat step 3
  6. Restart every (n + 1) steps, using step 2

(a sketch implementation follows below)
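A sketch implementation of these steps (Fletcher-Reeves β, SciPy line search, restart every n + 1 steps); the iteration limits and tolerance are arbitrary:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def fletcher_reeves(f, grad, x, n_restarts=20, tol=1e-8):
    """CG with Fletcher-Reeves beta and restarts every n + 1 steps."""
    n = x.size
    for _ in range(n_restarts):
        g = grad(x)
        d = -g                                   # step 2: steepest descent
        for _ in range(n + 1):
            if np.linalg.norm(g) < tol:
                return x
            a = minimize_scalar(lambda t: f(x + t * d)).x  # step 3: line search
            x = x + a * d
            g_new = grad(x)
            beta = (g_new @ g_new) / (g @ g)     # Fletcher-Reeves beta
            d = -g_new + beta * d                # step 4: next direction
            g = g_new
    return x
```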
CG properties
  • Theoretically converges in N steps or less for quadratic functions
  • In practice, convergence is slower (> N steps) because of:
    • Non-quadratic functions
    • Finite line search accuracy
    • Round-off errors
  • After N steps / bad convergence: restart procedure (reset to the steepest descent direction, etc.)
Application to mechanics (FE)
  • Structural mechanics: the potential energy f(u) = ½uᵀKu − uᵀp is a quadratic function!
  • Equilibrium: Ku = p, i.e. the minimum of f
  • CG: the line search along dₖ has the closed form αₖ = −dₖᵀ∇f(uₖ) / (dₖᵀK dₖ), and K is needed only in matrix-vector products
  • Simple operations on the element level. Attractive for large N! (see the sketch below)
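A sketch of CG specialized to Ku = p; the matrix enters only through the product K·v, which in FE can be accumulated element by element. The function name K_mul is a placeholder for that operation:

```python
import numpy as np

def cg_solve(K_mul, p, tol=1e-10, max_iter=1000):
    """Conjugate gradients for K u = p, using only products K_mul(v)."""
    u = np.zeros_like(p, dtype=float)
    r = p - K_mul(u)               # residual; r = -gradient of 1/2 u'Ku - u'p
    d = r.copy()
    rr = r @ r
    for _ in range(max_iter):
        Kd = K_mul(d)
        alpha = rr / (d @ Kd)      # closed-form line search for the quadratic f
        u = u + alpha * d
        r = r - alpha * Kd
        rr_new = r @ r
        if np.sqrt(rr_new) < tol:
            break
        d = r + (rr_new / rr) * d  # next conjugate direction
        rr = rr_new
    return u

# Example: small SPD system as a stand-in for an assembled stiffness matrix
K = np.array([[4.0, 1.0], [1.0, 3.0]])
p = np.array([1.0, 2.0])
print(cg_solve(lambda v: K @ v, p))  # matches np.linalg.solve(K, p)
```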