
Random numbers and optimization techniques



Presentation Transcript


  1. Second lecture: Random numbers and optimization techniques. Jorge Andre Swieca School, Campos do Jordão, January 2003

  2. References • The Nature of Mathematical Modeling, N. Gershenfeld, Cambridge, 1999; • Numerical Recipes in C, Second Edition, W. H. Press et al., Cambridge, 1992; • Statistical Data Analysis, G. Cowan, Oxford, 1998; • Computational Physics, Dean Karlen (online), 1998.

  3. Random. "O acaso não existe" ("Chance does not exist"). Car sticker.

  4. Random numbers. Important task: generate random variables from known probability distributions. Random numbers are produced by the computer in a strictly deterministic way: pseudorandom (→ Monte Carlo method). Classic random number generators are the linear congruential generators, x_{n+1} = (a·x_n + c) mod m. Ex.: c = 3, a = 5, m = 16 gives, starting from seed 0: 0, 3, 2, 13, 4, 7, 6, 1, 8, 11, 10, 5, 12, 15, 14, 9, 0, 3, …
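
A minimal C sketch of this toy generator (the seed and the printing loop are illustrative):

    #include <stdio.h>

    /* Linear congruential generator from the slide:
       x_{n+1} = (a*x_n + c) mod m, with a = 5, c = 3, m = 16. */
    static unsigned int lcg_state = 0;   /* seed */

    unsigned int lcg_next(void)
    {
        lcg_state = (5u * lcg_state + 3u) % 16u;
        return lcg_state;
    }

    int main(void)
    {
        for (int i = 0; i < 17; i++)     /* one full period plus one step */
            printf("%u ", lcg_next());
        printf("\n");                    /* 3 2 13 4 7 6 1 8 11 10 5 12 15 14 9 0 3 */
        return 0;
    }

With m = 16 the sequence must repeat after at most 16 numbers, which is why practical generators use a modulus near the machine word size.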

  5. Generators. Arguments from number theory give good values for a, c and m: • the longest possible period; • individual elements within a period should follow each other "randomly". Ex. 1: RANDU, a = 65539, m = 2^31, c = 0.

  6.–8. Generators (figure-only slides)

  9. Generators. "Random numbers fall mainly in the planes", G. Marsaglia, Proc. Natl. Acad. Sci. 61, 25 (1968). What is seen in RANDU is present in any multiplicative congruential generator. On 32-bit machines, the maximum number of hyperplanes on which d-tuples of successive deviates can fall is: d=3: 2953; d=4: 566; d=6: 120; d=10: 41. RANDU achieves far fewer than the maximum!

  10. Generators. Ex. 2: the minimal standard generator (Num. Rec. ran0), a = 7^5 = 16807, m = 2^31 − 1. RAN1 and RAN2, given in the first edition, are "at best mediocre".
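
A sketch of the minimal standard recurrence in C, using Schrage's factorization to compute (a·seed) mod m without overflowing 32-bit integer arithmetic; Numerical Recipes' ran0 is built around the same trick (this version omits ran0's protective XOR masking):

    #define IA 16807        /* a = 7^5 */
    #define IM 2147483647   /* m = 2^31 - 1 */
    #define IQ 127773       /* q = m / a */
    #define IR 2836         /* r = m % a */

    static long seed = 1;   /* any value in 1..m-1; never 0 */

    double minstd(void)     /* returns a uniform deviate in (0,1) */
    {
        long k = seed / IQ;
        seed = IA * (seed - k * IQ) - IR * k;   /* (IA*seed) % IM by Schrage */
        if (seed < 0) seed += IM;
        return (double)seed / IM;
    }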

  11. Generators. Ex. 3: RANLUX, in cernlib (mathlib V115, F. James); M. Lüscher, "A portable high-quality random number generator for lattice field theory simulations", Comput. Phys. Commun. 79 (1994) 100; period ≥ 10^165. Functional definition: an algorithm that generates uniform numbers is acceptable if it is not rejected by a set of tests. There is a lot of literature about testing random generators; see the Diehard set of programs: http://stat.fsu.edu/~geo/diehard.html • Recommendations: • do some simple tests; • check the results with another generator.

  12. Inverse transform. Take x, a uniform deviate in [0,1) with constant density p(x). To generate y according to a density g(y), set x = G(y), where G(y) is the cumulative distribution of g, and solve for y, analytically or numerically. [figure: uniform deviate x mapped through G(y) to the output deviate y]

  13. Inverse transform. Ex. 4, discrete case: f(x=0)=0.3, f(x=1)=0.2, f(x=2)=0.5. Draw u uniform in [0,1): 0.0 ≤ u < 0.3 → x=0; 0.3 ≤ u < 0.5 → x=1; 0.5 ≤ u < 1.0 → x=2. Frequencies observed in 2000 deviates: x=0: 0.2880; x=1: 0.2075; x=2: 0.5045. [figure: staircase cumulative distribution F(x) with steps at x = 0, 1, 2]
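
In C, the discrete inverse transform of Ex. 4 reduces to comparing the uniform deviate against the cumulative sums 0.3 and 0.5; uniform() is an assumed routine returning a deviate in [0,1), for instance the minstd() sketch above:

    extern double uniform(void);   /* uniform deviate in [0,1) */

    int discrete_deviate(void)
    {
        double u = uniform();
        if (u < 0.3) return 0;     /* f(0) = 0.3 */
        if (u < 0.5) return 1;     /* f(1) = 0.2 */
        return 2;                  /* f(2) = 0.5 */
    }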

  14. Inverse transform. Exponential: models the amount of time until a specific event occurs, or the time between independent events (e.g. the time until a part fails). With density f(t) = (1/τ) exp(−t/τ), the cumulative distribution F(t) = 1 − exp(−t/τ) inverts to t = −τ ln(1 − u) for a uniform deviate u.
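
A one-line C sketch: since 1 − u is uniform whenever u is, t = −τ ln u has the same distribution as −τ ln(1 − u):

    #include <math.h>

    extern double uniform(void);        /* uniform deviate in (0,1) */

    double exponential_deviate(double tau)
    {
        return -tau * log(uniform());   /* inverse of F(t) = 1 - exp(-t/tau) */
    }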

  15. Acceptance-rejection method. Generate x uniformly in [xmin, xmax] and y uniformly in [0, ymax], where ymax bounds f(x) from above; if (x,y) lies under the curve, accept the point, else discard. [figure: f(x) inside the box [xmin, xmax] × [0, ymax]]
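
A generic C sketch of the box method; the caller must supply ymax ≥ f(x) over the whole interval:

    extern double uniform(void);   /* uniform deviate in [0,1) */

    double accept_reject(double (*f)(double),
                         double xmin, double xmax, double ymax)
    {
        for (;;) {
            double x = xmin + (xmax - xmin) * uniform();
            double y = ymax * uniform();
            if (y < f(x))          /* (x,y) under the curve: accept */
                return x;
        }                          /* else discard and try again */
    }

The efficiency is the ratio of the area under f to the area of the box, so a tight bounding box wastes fewer deviates.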

  16. Acceptance-rejection method (figure-only slide)

  17. Importance sampling. Choose a function g(x) ≥ f(x) whose random numbers are easy to generate: • generate random x according to g(x); • generate u uniformly between 0 and g(x); • if u < f(x), accept x, if not, reject.
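
A C sketch of this envelope rejection; sample_g() is an assumed routine (e.g. an inverse transform) returning x distributed according to g, and g(x) ≥ f(x) must hold everywhere:

    extern double uniform(void);    /* uniform deviate in [0,1) */
    extern double sample_g(void);   /* x distributed according to g(x) */

    double importance_deviate(double (*f)(double), double (*g)(double))
    {
        for (;;) {
            double x = sample_g();
            double u = g(x) * uniform();   /* uniform between 0 and g(x) */
            if (u < f(x)) return x;        /* accept with prob. f(x)/g(x) */
        }
    }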

  18. Optimization (minimization). Choose the best parameters: an iterative search starting from an initial (guess) value. Many strategies for minimization: local search, global search, with or without derivative evaluation, etc. Objective: find an acceptable solution; it is possible that many solutions are actually good, and other sources of uncertainty may be larger than the differences among the solutions.

  19. Downhill simplex method • Nelder–Mead; • the most functionality for the least amount of code; • simplex: a triangle in 2D, a tetrahedron in 3D, etc.; • start at a random location and evaluate the function at the D+1 vertices; • iterative procedure to improve the vertex with the highest value of the function at each step: reflect, reflect and grow, reflect and shrink, shrink, shrink towards the minimum; • stop when there is no more improvement. Num. Rec.: amoeba.c. A minimal sketch follows.
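
A minimal 2-D sketch of the idea (a simplified variant, not NR's amoeba.c): order the vertices, then reflect the worst one through the centroid of the others, growing or shrinking as the trial values dictate. The test function, starting simplex, and fixed iteration count are illustrative assumptions.

    #include <stdio.h>
    #include <string.h>

    typedef double (*func2)(const double p[2]);

    static double rosen(const double p[2])   /* illustrative test function */
    {
        double a = 1 - p[0], b = p[1] - p[0] * p[0];
        return a * a + 100 * b * b;
    }

    static void nelder_mead(func2 f, double p[3][2], int iters)
    {
        for (int it = 0; it < iters; it++) {
            /* order the vertices: p[0] best, p[2] worst */
            for (int i = 0; i < 3; i++)
                for (int j = i + 1; j < 3; j++)
                    if (f(p[j]) < f(p[i])) {
                        double t[2];
                        memcpy(t, p[i], sizeof t);
                        memcpy(p[i], p[j], sizeof t);
                        memcpy(p[j], t, sizeof t);
                    }
            double c[2] = { (p[0][0] + p[1][0]) / 2,   /* centroid of best two */
                            (p[0][1] + p[1][1]) / 2 };
            double r[2] = { 2 * c[0] - p[2][0],        /* reflect worst vertex */
                            2 * c[1] - p[2][1] };
            if (f(r) < f(p[0])) {                      /* big win: reflect and grow */
                double e[2] = { 3 * c[0] - 2 * p[2][0],
                                3 * c[1] - 2 * p[2][1] };
                memcpy(p[2], f(e) < f(r) ? e : r, sizeof r);
            } else if (f(r) < f(p[1])) {               /* modest win: keep reflection */
                memcpy(p[2], r, sizeof r);
            } else {                                   /* no win: shrink worst toward c */
                double s[2] = { (c[0] + p[2][0]) / 2,
                                (c[1] + p[2][1]) / 2 };
                if (f(s) < f(p[2]))
                    memcpy(p[2], s, sizeof s);
                else                                   /* shrink all toward the best */
                    for (int i = 1; i < 3; i++) {
                        p[i][0] = (p[i][0] + p[0][0]) / 2;
                        p[i][1] = (p[i][1] + p[0][1]) / 2;
                    }
            }
        }
    }

    int main(void)
    {
        double p[3][2] = { {0, 0}, {1.2, 0}, {0, 0.8} };  /* initial simplex */
        nelder_mead(rosen, p, 300);
        printf("minimum near (%g, %g)\n", p[0][0], p[0][1]);
        return 0;
    }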

  20. Downhill simplex method (figure-only slide)

  21. Powell's method • a simplex in one dimension is a pair of points; • finding the minimum of a function along a given direction is a line minimization (see the sketch below); • a series of line minimizations → multi-dimensional search; • Powell's method: update the directions searched to find a set of directions that don't interfere with each other; • starting from P_0, D line minimizations along the current directions lead to P_D; • the net displacement P_D − P_0 is a good direction to keep for future minimizations if advantageous; • it is added to the set of directions used for minimization (replacing the one most similar to it); • if the gradient of the function is available → conjugate gradient algorithm.
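
The building block is the 1-D line minimization; a golden-section search is one simple derivative-free way to do it (a sketch, assuming f is unimodal on [a, b]; Numerical Recipes itself uses the more refined Brent method):

    /* Golden-section line minimization: shrink the bracket [a,b],
       keeping the golden ratio between the two interior test points. */
    double golden_min(double (*f)(double), double a, double b, double tol)
    {
        const double g = 0.6180339887498949;   /* (sqrt(5) - 1) / 2 */
        double x1 = b - g * (b - a);
        double x2 = a + g * (b - a);
        while (b - a > tol) {
            if (f(x1) < f(x2)) {               /* minimum lies in [a, x2] */
                b = x2; x2 = x1; x1 = b - g * (b - a);
            } else {                           /* minimum lies in [x1, b] */
                a = x1; x1 = x2; x2 = a + g * (b - a);
            }
        }
        return 0.5 * (a + b);
    }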

  22. Powell's method (figure-only slide)

  23. Simulated annealing • growth of a crystal: a difficult optimization problem; • a liquid frozen instantaneously: atoms trapped in the configuration they had in the liquid state (energy barrier from the glassy to the crystalline state); • if the liquid is slowly cooled: atoms can explore many local arrangements; • thermodynamics: the relative probability of the system being in a state with energy E is exp(−E/kT); • at T = 0 the system sits in the lowest state, at T > 0 there is some chance to be in other states; • the slower the cooling rate, the more likely the system is to be trapped in the lowest energy configuration, i.e. to find the global energy minimum.

  24. Simulated annealing. Metropolis (1953): a way to update a simulation; Kirkpatrick (1980s): the same idea applied to other hard problems. Energy → cost function (simulated annealing); at high temperature almost any move is accepted, and as T → 0 the lowest minima found are kept. Implementation issues: • a new state is randomly selected and E_new evaluated; • if E_new ≤ E, accept the state; • if E_new > E, accept the state with probability exp(−(E_new − E)/T); • selection of trial moves: downhill simplex or conjugate gradient steps can be used, but the Boltzmann factor allows mistakes; • cooling schedule: too fast freezes the system in a bad solution vs. too slow wastes computer time. A toy sketch follows.
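
A toy C sketch of the whole loop for a 1-D cost function; the cost function, step size, and cooling constants are illustrative assumptions:

    #include <stdio.h>
    #include <stdlib.h>
    #include <math.h>

    /* cost function with many local minima (illustrative assumption) */
    double cost(double x)
    {
        return x * x + 10 * sin(3 * x);
    }

    int main(void)
    {
        double x = 5.0, E = cost(x);
        double T = 10.0;                     /* initial temperature */
        srand(12345);
        while (T > 1e-4) {
            /* trial move: random step in [-1, 1] */
            double xn = x + 2.0 * rand() / RAND_MAX - 1.0;
            double En = cost(xn);
            /* Metropolis: always accept downhill, sometimes uphill */
            if (En <= E || exp(-(En - E) / T) > (double)rand() / RAND_MAX) {
                x = xn;
                E = En;
            }
            T *= 0.999;                      /* slow geometric cooling */
        }
        printf("found x = %g, E(x) = %g\n", x, E);
        return 0;
    }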

  25. Simulated annealing (figure-only slide)

  26. Genetic algorithms • evolution: a very hard optimization problem; • explore many options in parallel rather than concentrating on trying many changes around a single design; • simulated annealing repeatedly updates one set of search parameters, while a G.A. keeps an ensemble of sets of parameters; • G.A.: the state is given by a population, each member a complete set of parameters for the function being searched; • the population is updated in generations.

  27. Genetic algorithms. Update criteria (a toy sketch follows): • Fitness: evaluation of the function for each member of the population; • Reproduction: a new population (of fixed size) is selected based on fitness; low-fitness parameter sets may disappear; • Crossover: members of the ensemble can share parameters; • Mutation: changes in the parameters, random or taking advantage of what is already known to generate good moves.
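
A toy C sketch with one real-valued parameter per member: fitness evaluation, reproduction by tournament selection, crossover by blending, and random mutation. All constants and the fitness function are illustrative assumptions:

    #include <stdio.h>
    #include <stdlib.h>
    #include <math.h>

    #define POP  50
    #define GENS 200

    /* fitness to maximize: negative of a bumpy cost (assumption) */
    double fitness(double x) { return -(x * x + 10 * sin(3 * x)); }

    double urand(void) { return (double)rand() / RAND_MAX; }

    int main(void)
    {
        double pop[POP], next[POP];
        srand(1);
        for (int i = 0; i < POP; i++)
            pop[i] = 20.0 * urand() - 10.0;   /* random initial population */
        for (int g = 0; g < GENS; g++) {
            for (int i = 0; i < POP; i++) {
                /* reproduction: each tournament keeps the fitter member */
                int a = rand() % POP, b = rand() % POP;
                double p1 = fitness(pop[a]) > fitness(pop[b]) ? pop[a] : pop[b];
                a = rand() % POP; b = rand() % POP;
                double p2 = fitness(pop[a]) > fitness(pop[b]) ? pop[a] : pop[b];
                /* crossover: blend the parents' parameter */
                double child = 0.5 * (p1 + p2);
                /* mutation: occasional random kick keeps diversity */
                if (urand() < 0.1) child += urand() - 0.5;
                next[i] = child;
            }
            for (int i = 0; i < POP; i++) pop[i] = next[i];
        }
        double best = pop[0];
        for (int i = 1; i < POP; i++)
            if (fitness(pop[i]) > fitness(best)) best = pop[i];
        printf("best x = %g\n", best);
        return 0;
    }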

  28. Genetic algorithms (figure-only slide)
