Using Pattern Recognition Techniques to Derive a Formal Analysis of Why Heuristic Functions Work
B. John Oommen, a joint work with Luis G. Rueda
School of Computer Science, Carleton University

Optimization Problems
Any arbitrary optimization problem: instances are drawn from a finite set, X.
A heuristic algorithm is an algorithm that attempts to find an instance x in X that maximizes the objective function.
It iteratively invokes a heuristic function.
The heuristic function estimates (or measures) the cost of the solution.
The heuristic itself is a method that performs one or more changes to the current instance.
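The loop described above can be sketched as follows; `objective` and `neighbor` are hypothetical stand-ins for the heuristic function (cost estimate) and the heuristic (the change applied to the current instance) of a concrete problem.

```python
import random

def heuristic_search(initial, objective, neighbor, iterations=1000):
    """Generic heuristic algorithm: repeatedly perturb the current
    instance and keep the change when the estimated objective improves."""
    current = initial
    best_value = objective(current)      # heuristic function: estimates the cost
    for _ in range(iterations):
        candidate = neighbor(current)    # heuristic: change the current instance
        value = objective(candidate)
        if value > best_value:           # maximization, as in the slides
            current, best_value = candidate, value
    return current, best_value

# Toy usage: maximize f(x) = -(x - 3)^2 over the integers by +/-1 moves.
random.seed(0)
x, fx = heuristic_search(
    initial=0,
    objective=lambda x: -(x - 3) ** 2,
    neighbor=lambda x: x + random.choice([-1, 1]),
)
```

The search converges to x = 3, the maximizer of the toy objective.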
Consider a heuristic algorithm that invokes either of two heuristic functions, H1 and H2.
If one of them estimates the cost more accurately, does it imply that it leads the algorithm to better solutions?
Pattern Recognition Modeling
Two heuristic functions: H1 and H2.
The cost value of the solution each one chooses is modeled by two independent random variables, X1 and X2.
Their distribution is doubly exponential: the error function is doubly exponential.
This distribution is typical in reliability analysis and failure models.
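The slides do not reproduce the density itself; a common doubly exponential (Laplace) parameterization, consistent with the reliability-analysis remark, is:

```latex
f_{X_i}(x) \;=\; \frac{\lambda_i}{2}\, e^{-\lambda_i \lvert x - \mu_i \rvert},
\qquad i = 1, 2,
```

where λ_i is the rate and μ_i the location of the i-th heuristic function's cost estimate.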
How reliable is a solution when only an estimate is known?
Let c denote the mean cost of the optimal solution; then:
Main Result (Exponential)
For a particular x, the probability that x leads to a wrong decision by H1, when x < c, is given by:
The total probability that H1 makes the wrong decision, over all values of x, is:
Similarly, the probability that H2 makes the wrong decision over all values of x is:
Solving the integrals and comparing p1 with p2, we have:
which, using ln x ≤ x − 1, implies that p1 ≤ p2. QED.
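As an illustrative Monte-Carlo check of this result (not the paper's derivation), model each heuristic function as the true cost plus doubly exponential noise, and call a decision "wrong" when the function ranks a worse candidate above a better one. The rates λ1 > λ2 and the candidate costs below are arbitrary choices.

```python
import random

def laplace(rate, rng):
    """Doubly exponential (Laplace) noise, built as the
    difference of two exponential variates."""
    return rng.expovariate(rate) - rng.expovariate(rate)

def wrong_decision_rate(rate, cost_good, cost_bad, trials, rng):
    """Fraction of trials in which the noisy estimate of the worse
    candidate looks better (smaller) than that of the better one."""
    wrong = 0
    for _ in range(trials):
        est_good = cost_good + laplace(rate, rng)
        est_bad = cost_bad + laplace(rate, rng)
        if est_bad < est_good:   # worse candidate ranked first: wrong decision
            wrong += 1
    return wrong / trials

rng = random.Random(42)
# H1 is the more accurate estimator (larger rate => less noise).
p1 = wrong_decision_rate(rate=2.0, cost_good=0.0, cost_bad=1.0,
                         trials=50_000, rng=rng)
p2 = wrong_decision_rate(rate=0.5, cost_good=0.0, cost_bad=1.0,
                         trials=50_000, rng=rng)
# Empirically p1 < p2, matching the analytical result above.
```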
F(a1,k) can also be written in terms of a1 and k as:
Taking partial derivatives and solving:
R-ACM / Eq-depth, T-ACM / Eq-width, T-ACM / Eq-depth: G >>> 0, or p1 <<< p2.
R-ACM / T-ACM, Eq-width / Eq-depth: G ≈ 0, or p1 ≈ p2.
Minimum at a1 = 0 and 0 ≤ k ≤ 1.
Graphical Analysis (Histograms)
No closed-form integration is possible for the normal pdf.
Shown numerically that p1 ≤ p2.
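A minimal sketch of this kind of numerical step, assuming the standard normal pdf: the probability mass over an interval is obtained by quadrature, cross-checked here against the closed-form CDF available through the error function.

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Standard normal density (for mu=0, sigma=1)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def normal_cdf(x, mu=0.0, sigma=1.0):
    """Normal CDF via erf; a closed form exists for the CDF itself,
    but not for the integrals arising in the p1/p2 comparison."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def integrate(f, a, b, n=10_000):
    """Plain trapezoidal rule over [a, b]."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

mass = integrate(normal_pdf, 0.0, 1.0)     # numeric: P(0 <= X <= 1)
exact = normal_cdf(1.0) - normal_cdf(0.0)  # ~0.3413
```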
λ is estimated as a function of N, the number of samples.
λ Estimation for Histograms
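The estimator itself is not shown above; a standard maximum-likelihood estimate of the doubly exponential rate from N samples (which may differ from the histogram-specific estimator the slides use) is λ̂ = N / Σ|x_i − μ̂|, with μ̂ the sample median:

```python
import random
import statistics

def estimate_lambda(samples):
    """MLE of the doubly exponential (Laplace) rate:
    lambda_hat = N / sum(|x_i - median|)."""
    mu_hat = statistics.median(samples)
    total_dev = sum(abs(x - mu_hat) for x in samples)
    return len(samples) / total_dev

# Toy check against synthetic doubly exponential data with rate 2.
rng = random.Random(1)
data = [rng.expovariate(2.0) - rng.expovariate(2.0) for _ in range(20_000)]
lam = estimate_lambda(data)   # close to the true rate 2.0
```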
Similarities of R-ACM and d-Exp
Simulations performed in query optimization:
R-ACM vs. traditional histograms, using 11 bins and 50 values.
Number of times R-ACM yields the better QEP.
Number of times Eq-width yields the better QEP.
Number of times Eq-depth yields the better QEP.