RMI Workshop - Genetic Algorithms

Genetic Algorithms and Related Optimization Techniques: Introduction and Applications

Kelly D. Crawford - ARCO / Crawford Software, Inc.

Other Optimization Colleagues:
Donald J. MacAllister - ARCO
Michael D. McCormack
Richard F. Stoisits
Optimization Associates, Inc.
What every “intro to GAs” talk begins with:
- Survival of the fittest
What I am not going to talk about:
- Survival of the fittest
- Exception: nomenclature/jargon
It’s not about biology - it’s about search!
Given a potential solution vector to some problem: x
Any set of constraints on x: Ax ≤ b
And a means to assess the relative worth of that solution: f(x)
(which may be continuous or discrete)
Optimization describes the application of a set of proven techniques
that can find the optimal or near optimal solution to the problem.
Examples of optimization techniques:
Genetic algorithms, genetic programming, simulated annealing,
evolutionary programming, evolution strategies, classifier systems,
linear programming, nonlinear programming, integer programming,
pareto methods, discrete hill climbers, gradient techniques,
random search, brute force (exhaustive search), backtracking,
branch and bound, greedy techniques, etc...
Gas lift optimization (Ashtart):
x: Amount of gas injected into each well
Ax ≤ b: Max total gas available, max water produced
f(x): Total oil produced
Technique: Learning bit climber
Free Surface Multiple Suppression:
x: Inverse source wavelet
Ax ≤ b: Min/max wavelet amplitudes
f(x): Total seismic energy after wavelet is applied
Technique: Genetic algorithm and learning bit climber
continuous: Gradient search, linear programming
discrete: Integer programming, gradient estimators
Ok for search spaces with a single peak/trough
Random search, brute force (exhaustive search)
Ok for small search spaces
Hard problems (large search spaces, multiple peaks/troughs)
need both convergent and divergent behaviors
Genetic algorithms, simulated annealing, learning hill climbers, etc.
These techniques can exploit the peaks/troughs, as well as
intelligently explore the search space.
Need a balanced combination of both
convergent and divergent behaviors
to find solutions in
complicated search spaces.
Genetic Algorithms - Representing a Solution
Genetic Algorithms - Crossover and Mutation
x = 1001010011010……0000101001111010011110……10011100101100
f(x) ==> 19020.234789
Genetic Algorithms - Evaluating a Solution’s Fitness
So just how good are you, kid…?
Total Daily Oil Production for the Field
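The mapping the slides show, from a bit string x to a single scalar f(x), can be sketched as follows. `decode` and the toy fitness here are illustrative stand-ins: the real f(x) in this application would call a field model that reports total daily oil production.

```python
# Sketch of a bit-string chromosome and a scalar fitness function.
# The decoding scheme (10-bit fields mapped to [0, 1]) and the toy
# fitness are illustrative assumptions, not the actual field model.

def decode(chromosome, n_bits=10):
    """Split a bit string into n_bits-wide fields, each mapped to [0, 1]."""
    fields = [chromosome[i:i + n_bits] for i in range(0, len(chromosome), n_bits)]
    return [int(f, 2) / (2 ** n_bits - 1) for f in fields]

def fitness(chromosome):
    """Toy stand-in for f(x): just the sum of the decoded values."""
    return sum(decode(chromosome))

x = "1001010011" "0000101001" "1110100111"
print(decode(x))   # three values in [0, 1]
print(fitness(x))  # the single scalar the GA ranks solutions by
```

The only contract the GA cares about is the last line: any representation works as long as evaluation returns one scalar.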
Genetic Algorithms - The Process
Crossover and Mutation
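The two operators named above can be sketched on bit-string chromosomes. One-point crossover and per-bit flip mutation are the standard textbook forms; the function names and the 0.01 mutation rate are illustrative choices, not values from the slides.

```python
import random

# Minimal sketch of one-point crossover and bit-flip mutation
# on chromosomes represented as lists of 0/1.

def crossover(parent_a, parent_b):
    """One-point crossover: swap tails at a random cut point."""
    point = random.randrange(1, len(parent_a))
    return (parent_a[:point] + parent_b[point:],
            parent_b[:point] + parent_a[point:])

def mutate(chrom, p_flip=0.01):
    """Flip each bit independently with probability p_flip."""
    return [bit ^ 1 if random.random() < p_flip else bit for bit in chrom]

a = [1, 1, 1, 1, 1, 1, 1, 1]
b = [0, 0, 0, 0, 0, 0, 0, 0]
child1, child2 = crossover(a, b)
print(child1, child2)                 # each child mixes both parents
print(mutate(child1, p_flip=0.1))
```

Crossover exploits existing good material (convergent), while mutation injects new material (divergent), matching the balance described earlier.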
When you need…
...some way to represent potential solutions to a problem
(representation: bit string, list of integers or floats,
permutation, combinations, etc).
...some way to evaluate a potential solution resulting in a
scalar. This will be used by the GA to rank the worth of
a solution. This fitness (or evaluation) function needs to
be very efficient, as it may need to be called thousands -
even millions - of times.
But you do not need...
...the final solution to be optimal.
...speed (this varies)
Do not use a GA when…
...you absolutely must have the optimal solution
to a problem.
...an analytical or empirical method already exists
and works adequately (typically means the problem
is unimodal, having only a single “peak”).
...evaluating a potential solution to your problem
takes a long time to compute.
...there are so few potential solutions that you can
easily check all of them to find the optimum
(small search spaces).
What appears as reality, but isn’t!
After Multiple Removal
Each producer may get fluids from multiple patterns.
Each injector may put fluids into multiple patterns.
This is a diagram of a single pattern showing 16 allocation factors.
The entire field has between 3000 to 7000 allocation factors,
represented using 10 bits each.
…combined into one chromosome
Actual Chromosome Before Normalization
Translated Chromosome After Normalization
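The slides show the chromosome before and after normalization but not the rule itself. A plausible sketch, under the assumption (not stated on the slide) that a well's raw decoded allocation factors are rescaled to sum to 1, since its fluids must be fully allocated across the patterns it participates in; `normalize` and the sample values are illustrative.

```python
# Hedged sketch of a normalization step: rescale one well's raw
# allocation factors so they sum to 1. The rescaling rule is an
# assumption for illustration, not taken from the slides.

def normalize(factors):
    """Rescale a list of raw allocation factors to sum to 1."""
    total = sum(factors)
    if total == 0:
        return [1.0 / len(factors)] * len(factors)  # degenerate case: spread evenly
    return [f / total for f in factors]

raw = [0.01, 0.23, 0.82, 0.53]   # raw values decoded from the chromosome
print(normalize(raw))            # rescaled list summing to 1.0
```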
A potential solution to this problem consists of a list
containing both allocation factors and pressures, each
of which are floating point values
Any single allocation factor or pressure, x, has a range
of [0..1]. Assuming we need a resolution of ~ 0.01, we
can represent each x using 10 bits.
0.01 0.23 0.82 0.53 ...
0011011010 1010011011 1001101010 1010011010 ...
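The 10-bit mapping above can be sketched as a round-trip: 10 bits give 1024 levels over [0, 1], a step of about 0.001, comfortably finer than the ~0.01 resolution required. The helper names `encode`/`decode` are illustrative.

```python
# Sketch of the 10-bit encoding of a value x in [0, 1].

N_BITS = 10
LEVELS = 2 ** N_BITS - 1  # 1023 steps across [0, 1]

def encode(x):
    """Map x in [0, 1] to a 10-character bit string."""
    return format(round(x * LEVELS), f"0{N_BITS}b")

def decode(bits):
    """Map a 10-character bit string back to [0, 1]."""
    return int(bits, 2) / LEVELS

print(encode(0.23))                     # a 10-bit string
print(round(decode(encode(0.23)), 2))   # -> 0.23 after the round trip
```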
but we can estimate the gradient by sampling neighboring points:
compare g(a-) vs. g(a), and g(a) vs. g(a+)
Essentially a hill climber, but there is no analytical
information about what direction is “up” (i.e., no gradient,
or derivative). Instead, you sample neighboring points.
Bit Climber Algorithm:
Randomly generate a string of bits, X
Loop (until stopping criteria satisfied)
Randomly select a bit position, j, in X, and “flip” it
(i.e., if X(j) == 1, set to 0, and vice versa)
Evaluate the new f(X)
If fitness is worse, “unflip” X(j) (put it back like it was)
The bit climber does not attempt to avoid large changes to the chromosome (a single bit flip can result in a large overall change).
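The pseudocode above translates almost line for line into a runnable sketch; the bit-count fitness is a toy stand-in for whatever scalar evaluation the problem provides.

```python
import random

# Runnable sketch of the bit climber pseudocode, using a toy fitness
# (count of 1 bits). Any function returning a scalar would work here.

def bit_climb(fitness, n_bits=32, max_iters=2000, rng=random):
    x = [rng.randrange(2) for _ in range(n_bits)]  # randomly generate X
    best = fitness(x)
    for _ in range(max_iters):                     # stopping criterion: iteration cap
        j = rng.randrange(n_bits)                  # randomly select a bit position j
        x[j] ^= 1                                  # "flip" it
        f = fitness(x)
        if f < best:                               # fitness worse? "unflip" X(j)
            x[j] ^= 1
        else:
            best = f
    return x, best

ones = lambda bits: sum(bits)        # toy fitness: maximize the number of 1s
solution, score = bit_climb(ones)
print(score)                         # approaches 32 given enough iterations
```

Note how a single flip of a high-order bit in a decoded numeric field would change the value drastically, which is exactly the "large changes" issue noted above.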
10010101 10010101 01010010
A simple heuristic: Assign high probabilities to the
low order bits, low probabilities to the high order bits.
10010101 10010101 01010010
Another simple heuristic: Multiply a bit’s flipping
probability by .25 (give or take) when we flip it.
This decreases the likelihood of ever flipping it again.
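Both heuristics can be sketched as bookkeeping over a per-bit flip-probability table. The specific numbers (the 0.05 base and the linear ramp toward the low-order end) are illustrative assumptions; only the two rules themselves, higher probabilities for low-order bits and a 0.25 multiplier after each flip, come from the slides.

```python
import random

# Sketch of the two heuristics as per-bit flip probabilities for a
# learning bit climber. Base rate and ramp shape are illustrative.

N_BITS = 8

# Heuristic 1: low-order (rightmost) bits get higher flip probabilities.
p_flip = [0.05 * (i + 1) / N_BITS for i in range(N_BITS)]

def pick_bit(rng=random):
    """Choose a bit position with probability proportional to p_flip."""
    return rng.choices(range(N_BITS), weights=p_flip, k=1)[0]

def flip_bit(x, j):
    """Flip bit j, then damp its future flip probability (heuristic 2)."""
    x[j] ^= 1
    p_flip[j] *= 0.25  # much less likely to be flipped again

x = [1, 0, 0, 1, 0, 1, 0, 1]
j = pick_bit()
before = p_flip[j]
flip_bit(x, j)
print(j, before, p_flip[j])  # the chosen bit's probability drops to a quarter
```

The effect is that the climber favors small numeric adjustments and gradually "locks in" bits it has already tried, trading exploration for convergence over time.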