# Engineering Optimization





## Presentation Transcript
Concepts and Applications: Engineering Optimization
• Fred van Keulen
• Matthijs Langelaar
• CLA H21.1
• A.vanKeulen@tudelft.nl

In practice: additional “tricks” needed to deal with:

• Multimodality
• Strong fluctuations
• Round-off errors
• Divergence
Summary: single-variable methods
• 0th order: bracketing + dichotomous sectioning, Fibonacci sectioning, golden ratio sectioning
• 1st order: cubic interpolation, bisection method, secant method
• 2nd order: Newton method
• And many, many more!
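As an illustration of one 0th-order entry in this list, a minimal golden ratio sectioning sketch in Python (the interval, tolerance, and test function are illustrative assumptions, not from the slides):

```python
import math

def golden_section(f, a, b, tol=1e-8):
    """Minimize a unimodal f on [a, b] by golden ratio sectioning."""
    invphi = (math.sqrt(5) - 1) / 2          # 1/phi ~ 0.618
    x1 = b - invphi * (b - a)
    x2 = a + invphi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:                          # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - invphi * (b - a)
            f1 = f(x1)
        else:                                # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + invphi * (b - a)
            f2 = f(x2)
    return (a + b) / 2

print(golden_section(lambda x: (x - 2)**2, 0.0, 5.0))  # ~2.0
```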

Unconstrained optimization algorithms
• Single-variable methods
• Multiple-variable methods:
• 0th order → direct search methods
• 1st order → descent methods
• 2nd order → descent methods

Test functions
• Comparison of performance of algorithms:
• Mathematical convergence proofs
• Performance on benchmark problems (test functions)
• Examples of test functions:
• Rosenbrock’s function (“banana function”): $f(x_1, x_2) = 100\,(x_2 - x_1^2)^2 + (1 - x_1)^2$, optimum at $(1, 1)$

Test functions (2)
• A function with many local optima, optimum at $(0, 0)$ [contour plot lost in transcription]
• A further test function, optimum at $(1, 3)$ [contour plot lost in transcription]
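A quick Python check of the banana function and its optimum (the starting point $(-1.2, 1)$ is a conventional choice, not from the slide):

```python
def rosenbrock(x1, x2):
    """Rosenbrock's "banana" function; global minimum f = 0 at (1, 1)."""
    return 100.0 * (x2 - x1**2) ** 2 + (1.0 - x1) ** 2

print(rosenbrock(1.0, 1.0))   # 0.0  (the optimum)
print(rosenbrock(-1.2, 1.0))  # 24.2 (a common starting point)
```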

Random methods
• Random jumping method (random search):
• Generate random points, keep the best
• Random walk method:
• Generate random unit direction vectors
• Walk to new point if better
• Decrease stepsize after N steps
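A minimal random walk sketch in Python (the step-halving schedule, trial counts, and use of the `rosenbrock` helper defined above are illustrative assumptions):

```python
import numpy as np

def random_walk(f, x0, step=1.0, n_per_level=50, n_levels=20, seed=0):
    """Random walk: try random unit directions, move only on improvement,
    shrink the step size after every N trials."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, float)
    fx = f(x)
    for _ in range(n_levels):
        for _ in range(n_per_level):
            d = rng.standard_normal(x.size)
            d /= np.linalg.norm(d)        # random unit direction
            y = x + step * d
            fy = f(y)
            if fy < fx:                   # walk to new point if better
                x, fx = y, fy
        step *= 0.5                       # decrease stepsize after N steps
    return x, fx

print(random_walk(lambda x: rosenbrock(x[0], x[1]), [-1.2, 1.0]))
```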
Simulated annealing (Metropolis algorithm)
• Random method inspired by a natural process: annealing
• Heating of metal/glass to relieve stresses
• Controlled cooling to a state of stable equilibrium with minimal internal stresses
• Probability of an internal energy change $\Delta E$ at temperature $T$ (Boltzmann’s probability distribution function): $P(\Delta E) = e^{-\Delta E / (kT)}$
• Note: some chance of an energy increase exists!
• S.A. is based on this probability concept

Simulated annealing algorithm
1. Set a starting “temperature” $T$, pick a starting design $x$, and obtain $f(x)$
2. Randomly generate a new design $y$ close to $x$
3. Obtain $f(y)$. Accept the new design if better. If worse, generate a random number $r \in (0, 1)$ and accept the new design when $r < e^{-(f(y) - f(x))/T}$
4. Stop if the design has not changed in several steps. Otherwise, update the temperature and return to step 2.
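A minimal Python sketch of this loop for a one-variable design (the neighborhood move, cooling factor, and stopping rule are illustrative assumptions):

```python
import math, random

def simulated_annealing(f, x0, T=1.0, cooling=0.9, n_inner=100, T_min=1e-6):
    """Metropolis-style simulated annealing for a 1-D design variable."""
    x, fx = x0, f(x0)
    while T > T_min:
        for _ in range(n_inner):
            y = x + random.uniform(-1.0, 1.0)   # new design close to x
            fy = f(y)
            # accept if better, or with Boltzmann probability if worse
            if fy < fx or random.random() < math.exp(-(fy - fx) / T):
                x, fx = y, fy
        T *= cooling                            # update temperature
    return x, fx

print(simulated_annealing(lambda x: x**4 - 3 * x**2 + x, x0=2.0))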

Simulated annealing (3)
• As the temperature reduces, the probability of accepting a bad step reduces as well: for a bad step the exponent $-(f(y) - f(x))/T$ is negative, and it becomes increasingly negative as $T$ reduces
• Accepting bad steps (energy increase) is likely in the initial phase, but less likely at the end
• Temperature zero: basic random jumping method
• Variants: several steps before the test, cooling schemes, …
Random methods properties
• Very robust: also work for discontinuous / non-differentiable functions
• Can find the global minimum
• Last resort: when all else fails
• S.A. known to perform well on several hard problems (traveling salesman)
• Quite inefficient, but can be used in an initial stage to determine a promising starting point
• Drawback: results are not repeatable
Cyclic coordinate search
• Search alternately in each coordinate direction
• Perform a single-variable optimization along each direction (line search), as in the sketch below
• Directions fixed: can lead to slow convergence
• [Figure: steps in cycle $i$ and directions for cycle $i+1$]
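A minimal cyclic coordinate search sketch (SciPy's scalar minimizer stands in for each line search; the convergence test and cycle limit are illustrative choices):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def cyclic_coordinate_search(f, x0, n_cycles=100, tol=1e-8):
    """Minimize f by repeated line searches along the coordinate axes."""
    x = np.asarray(x0, float)
    n = x.size
    for _ in range(n_cycles):
        x_old = x.copy()
        for i in range(n):
            e = np.zeros(n)
            e[i] = 1.0                               # i-th coordinate direction
            res = minimize_scalar(lambda a: f(x + a * e))  # 1-D line search
            x = x + res.x * e
        if np.linalg.norm(x - x_old) < tol:          # cycle made no progress
            break
    return x

print(cyclic_coordinate_search(lambda x: rosenbrock(x[0], x[1]), [-1.2, 1.0]))
```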

Powell’s Conjugate Directions method
• Adjusting the search directions improves convergence
• Idea: replace the first direction with the combined direction of a full cycle (the net step from the start to the end of the cycle)
• Guaranteed to converge in $n$ cycles for quadratic functions! (theoretically)

Simplex method (Nelder-Mead)
• Simplex: figure of $n + 1$ points in $\mathbb{R}^n$
• Gradually move toward the minimum by reflection of the worst point
• [Figure: triangle simplex with vertex values $f = 10$, $f = 7$, $f = 5$; the worst vertex is reflected]
• For better performance: expansion/contraction and other tricks
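Both Powell and Nelder-Mead ship with SciPy; a short comparison sketch on Rosenbrock's function (the starting point is an arbitrary choice):

```python
from scipy.optimize import minimize, rosen

# Compare the two classic 0th-order methods on Rosenbrock's function:
for method in ("Powell", "Nelder-Mead"):
    res = minimize(rosen, x0=[-1.2, 1.0], method=method)
    print(method, res.x, "in", res.nfev, "function evaluations")
```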
Biologically inspired methods
• Popular: algorithms inspired by biological processes:
• Genetic algorithms / evolutionary optimization
• Particle swarms / flocks
• Ant colony methods
• Typically make use of a population (a collection of designs)
• Computationally intensive
• Stochastic nature, global optimization properties

Genetic algorithms
• Based on the evolution theory of Darwin: survival of the fittest
• Objective = fitness function
• Designs are encoded in chromosomal strings, ~ genes: e.g. binary strings:

  1 1 0 1 0 0 1 0 1 1 0 0 1 0 1  (one segment of bits encodes $x_1$, the remainder $x_2$)

GA flowchart
1. Create initial population
2. Evaluate fitness of all individuals
3. Test termination criteria (quit when satisfied)
4. Select individuals for reproduction
5. Create new population (reproduction, crossover, mutation)
6. Return to step 2

GA population operators
• Reproduction:
• Exact copy/copies of individual
• Crossover:
• Randomly exchange genes of different parents
• Many possibilities: how many genes, parents, children …
• Mutation:
• Randomly flip some bits of a gene string
• Used sparingly, but important to explore new designs

• Mutation example (one randomly chosen bit is flipped):

  1 1 0 1 0 0 1 0 1 1 0 0 1 0 1  →  1 1 0 1 0 1 1 0 1 1 0 0 1 0 1

Population operators
• Crossover example (single crossover point after the fourth bit):

  Parent 1: 1 1 0 1 | 0 0 1 0 1 1 0 0 1 0 1
  Parent 2: 0 1 1 0 | 1 0 0 1 0 1 1 0 0 0 1
  Child 1:  0 1 1 0 | 0 0 1 0 1 1 0 0 1 0 1
  Child 2:  1 1 0 1 | 1 0 0 1 0 1 1 0 0 0 1
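A minimal sketch of these two operators on bit strings (selection, fitness evaluation, and decoding are omitted; the mutation rate is an illustrative assumption):

```python
import random

def crossover(p1, p2):
    """Single-point crossover: exchange the tails of two parent strings."""
    k = random.randrange(1, len(p1))          # crossover point
    return p2[:k] + p1[k:], p1[:k] + p2[k:]

def mutate(bits, rate=0.05):
    """Flip each bit with a small probability (used sparingly)."""
    return [b ^ 1 if random.random() < rate else b for b in bits]

p1 = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 1]
p2 = [0, 1, 1, 0, 1, 0, 0, 1, 0, 1, 1, 0, 0, 0, 1]
c1, c2 = crossover(p1, p2)
print(mutate(c1), mutate(c2))
```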

Particle swarms / flocks
• No genes and reproduction, but a population that travels through the design space
• Derived from simulations of flocks/schools in nature
• Individuals tend to follow the individual with the best fitness value, but also determine their own path
• Some randomness added to give exploration properties (“craziness parameter”)

PSO algorithm
1. Initialize location and speed of individuals (random)
2. Evaluate fitness
3. Update best scores: individual ($\mathbf{y}$) and overall ($\mathbf{Y}$)
4. Update velocity and position:
   $\mathbf{v} \leftarrow \mathbf{v} + c_1 r_1 (\mathbf{y} - \mathbf{x}) + c_2 r_2 (\mathbf{Y} - \mathbf{x})$, then $\mathbf{x} \leftarrow \mathbf{x} + \mathbf{v}$,
   with $r_1, r_2$ random numbers between 0 and 1, and $c_1, c_2$ controlling “individual behavior” vs. “social behavior”
5. Repeat from step 2
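A minimal sketch of this loop in plain NumPy (swarm size, iteration count, and $c_1 = c_2 = 2$ are common but illustrative choices; no inertia/damping tricks):

```python
import numpy as np

def pso(f, bounds, n_particles=30, n_iter=200, c1=2.0, c2=2.0, seed=0):
    """Basic particle swarm optimization on a box-bounded domain."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    x = rng.uniform(lo, hi, (n_particles, lo.size))   # positions
    v = np.zeros_like(x)                              # velocities
    y = x.copy()                                      # individual bests
    fy = np.apply_along_axis(f, 1, x)
    Y = y[fy.argmin()].copy()                         # overall best
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = v + c1 * r1 * (y - x) + c2 * r2 * (Y - x)  # velocity update
        x = x + v                                      # position update
        fx = np.apply_along_axis(f, 1, x)
        better = fx < fy                               # improved personal bests
        y[better], fy[better] = x[better], fx[better]
        Y = y[fy.argmin()].copy()
    return Y, fy.min()

print(pso(lambda z: rosenbrock(z[0], z[1]), ([-2, -2], [2, 2])))
```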
Summary 0th order methods
• Nelder-Mead beats Powell in most cases
• Robust: most can deal with discontinuity etc.
• Less attractive for many design variables (>10)
• Stochastic techniques:
• Computationally expensive, but
• Global optimization properties
• Versatile
• Population-based algorithms benefit from parallel computing
Unconstrained optimization algorithms
• Single-variable methods
• Multiple-variable methods:
• 0th order
• 1st order
• 2nd order

Steepest descent method
• Move in the direction of the largest decrease in $f$:
• Taylor: $f(\mathbf{x} + \mathbf{d}) \approx f(\mathbf{x}) + \nabla f^T \mathbf{d}$
• Best direction: $\mathbf{d} = -\nabla f$
• Example: [Figure: contour plot in $(x_1, x_2)$ with contours $f = 0.044$, $f = 1.9$, $f = 7.2$ and successive steps along $-\nabla f$]
• With a fixed step size, divergence occurs! Remedy: line search
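A minimal steepest descent sketch with a line search along $-\nabla f$ (SciPy's scalar minimizer stands in for the line search; the tolerance is illustrative):

```python
import numpy as np
from scipy.optimize import minimize_scalar, rosen, rosen_der

def steepest_descent(f, grad, x0, n_iter=1000, tol=1e-8):
    """Steepest descent with a line search along -grad f at every iterate."""
    x = np.asarray(x0, float)
    for _ in range(n_iter):
        d = -grad(x)                      # steepest-descent direction
        if np.linalg.norm(d) < tol:       # gradient ~ 0: stationary point
            break
        alpha = minimize_scalar(lambda a: f(x + a * d)).x
        x = x + alpha * d
    return x

print(steepest_descent(rosen, rosen_der, [-1.2, 1.0]))  # slow zig-zag convergence
```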

Steepest descent convergence
• Zig-zag convergence behavior: [Figure: zig-zag path toward the optimum in $(x_1, x_2)$]

Effect of scaling
• Scaling variables helps a lot! [Figure: in scaled variables $(y_1, y_2)$ the contours become near-circular and convergence is fast]
• Ideal scaling is hard to determine (requires Hessian information)

Conjugate directions
• Based on building a set of conjugate directions, combined with line searches
• Conjugate directions (definition), for a quadratic function $f = \tfrac{1}{2}\mathbf{x}^T \mathbf{A} \mathbf{x} + \mathbf{b}^T \mathbf{x} + c$: a set of $N$ directions with $\mathbf{d}_i^T \mathbf{A}\, \mathbf{d}_j = 0$ for $i \neq j$
• (Special case: orthogonal directions, the eigenvectors of $\mathbf{A}$)
• Property: searching along conjugate directions yields the optimum of a quadratic function in $N$ steps (or less); optimality: $\nabla f = \mathbf{A}\mathbf{x}^* + \mathbf{b} = \mathbf{0}$
• Guaranteed convergence in $N$ steps for quadratic problems (recall Powell: $N$ cycles of $N$ line searches)

Conjugate directions (2)
• Optimization by line searches along conjugate directions will converge in $N$ steps (or less)
• Line search along $\mathbf{d}_1$ ends where the directional derivative vanishes: $\nabla f_2^T \mathbf{d}_1 = 0$
• [Figure: contours $f = c$ and $f = c + 1$; at the line-search minimum along $\mathbf{d}_1$, the new direction $-\nabla f_2$ is orthogonal to $\mathbf{d}_1$]

But … how to obtain conjugate directions?
• How to generate conjugate directions with only gradient information?
• Take the new direction as a combination of the steepest-descent direction and the previous direction: $\mathbf{d}_2 = -\nabla f_2 + \beta \mathbf{d}_1$
• Condition for a conjugate direction: $\mathbf{d}_2^T \mathbf{A}\, \mathbf{d}_1 = 0$
• But, in general, $\mathbf{A}$ is unknown! Remedy: use the line-search step, because for a quadratic function $\nabla f_2 - \nabla f_1 = \mathbf{A}(\mathbf{x}_2 - \mathbf{x}_1) = \alpha_1 \mathbf{A}\, \mathbf{d}_1$
• Now use $\nabla f_2^T \mathbf{d}_1 = 0$ (exact line search), which lets $\beta$ be expressed in gradients only: the conjugate gradient method

Three CG variants
• For general non-quadratic problems, three variants exist that are equivalent in the quadratic case:
• Hestenes-Stiefel: $\beta_k = \dfrac{\nabla f_{k+1}^T (\nabla f_{k+1} - \nabla f_k)}{\mathbf{d}_k^T (\nabla f_{k+1} - \nabla f_k)}$
• Polak-Ribière: $\beta_k = \dfrac{\nabla f_{k+1}^T (\nabla f_{k+1} - \nabla f_k)}{\nabla f_k^T \nabla f_k}$ (generally best in most cases)
• Fletcher-Reeves: $\beta_k = \dfrac{\nabla f_{k+1}^T \nabla f_{k+1}}{\nabla f_k^T \nabla f_k}$, obtained from the Polak-Ribière version because $\nabla f_{k+1}^T \nabla f_k = 0$ for exact line searches on quadratic functions

CG practical
1. Set the first search direction: $\mathbf{d}_0 = -\nabla f(\mathbf{x}_0)$
2. Line search to find the next point: $\mathbf{x}_{k+1} = \mathbf{x}_k + \alpha_k \mathbf{d}_k$
3. Set the next search direction: $\mathbf{d}_{k+1} = -\nabla f(\mathbf{x}_{k+1}) + \beta_k \mathbf{d}_k$
4. Repeat from step 2
5. Restart every $(n+1)$ steps, resetting to the steepest-descent direction of step 1
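A minimal nonlinear CG sketch following these steps, using the Polak-Ribière $\beta$ and a restart every $n+1$ steps (SciPy's scalar minimizer stands in for the line search; tolerances are illustrative):

```python
import numpy as np
from scipy.optimize import minimize_scalar, rosen, rosen_der

def conjugate_gradient(f, grad, x0, n_iter=1000, tol=1e-8):
    """Nonlinear CG (Polak-Ribiere) with a restart every n+1 steps."""
    x = np.asarray(x0, float)
    g = grad(x)
    d = -g                                   # step 1: steepest-descent start
    for k in range(n_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = minimize_scalar(lambda a: f(x + a * d)).x  # step 2: line search
        x = x + alpha * d
        g_new = grad(x)
        if (k + 1) % (x.size + 1) == 0:      # restart every n+1 steps
            beta = 0.0
        else:                                # Polak-Ribiere beta
            beta = g_new @ (g_new - g) / (g @ g)
        d = -g_new + beta * d                # step 3: next direction
        g = g_new
    return x

print(conjugate_gradient(rosen, rosen_der, [-1.2, 1.0]))
```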

CG properties
• Theoretically converges in $N$ steps or less for quadratic functions
• In practice:
• Finite line search accuracy
• Round-off errors
→ slower convergence, $> N$ steps
• After $N$ steps / bad convergence: restart procedure

Application to mechanics (FE)
• Equilibrium: $\mathbf{K}\mathbf{u} = \mathbf{f}$, the minimum of the potential energy $\Pi(\mathbf{u}) = \tfrac{1}{2}\mathbf{u}^T \mathbf{K}\mathbf{u} - \mathbf{u}^T \mathbf{f}$
• CG: $\mathbf{d}_{k+1} = -\nabla \Pi(\mathbf{u}_{k+1}) + \beta_k \mathbf{d}_k$, with $-\nabla \Pi = \mathbf{f} - \mathbf{K}\mathbf{u}$ (the residual)
• Line search: for this quadratic function the optimal step is explicit, $\alpha_k = \dfrac{(\mathbf{f} - \mathbf{K}\mathbf{u}_k)^T \mathbf{d}_k}{\mathbf{d}_k^T \mathbf{K}\, \mathbf{d}_k}$
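For this quadratic case, CG reduces to the classic linear solver implemented by `scipy.sparse.linalg.cg`; a toy sketch on an assumed random symmetric positive definite "stiffness" matrix (the matrix and load vector are made up for illustration):

```python
import numpy as np
from scipy.sparse.linalg import cg

rng = np.random.default_rng(0)
B = rng.standard_normal((50, 50))
K = B @ B.T + 50 * np.eye(50)       # symmetric positive definite "stiffness"
f = rng.standard_normal(50)

u, info = cg(K, f)                  # CG solve of K u = f
print(info, np.linalg.norm(K @ u - f))   # info == 0 -> converged
```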