
Using Pattern Recognition Techniques to Derive a Formal Analysis of Why Heuristic Functions Work


Presentation Transcript


  1. Using Pattern Recognition Techniques to Derive a Formal Analysis of Why Heuristic Functions Work • B. John Oommen, a joint work with Luis G. Rueda • School of Computer Science, Carleton University

  2. Optimization Problems • Any arbitrary optimization problem has: • Instances, drawn from a finite set, X, • An objective function, • Some feasibility functions. • The aim: • Find a (hopefully unique) instance of X • which leads to a maximum (or minimum) of the objective function, • subject to the feasibility constraints.
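
To make the definition concrete, here is a minimal Python sketch (not from the slides; `optimize` and the toy instance set are illustrative) of the abstract setup: a finite instance set X, an objective function, and feasibility constraints, solved by exhaustive search.

```python
def optimize(instances, objective, feasible):
    """Return the feasible instance that maximizes the objective."""
    candidates = [x for x in instances if feasible(x)]
    return max(candidates, key=objective)

# Toy usage: maximize x * (10 - x) over X = {0, ..., 10},
# with the feasibility constraint that x must be even.
best = optimize(range(11), lambda x: x * (10 - x), lambda x: x % 2 == 0)
print(best)  # -> 4 (x = 4 and x = 6 both score 24; max keeps the first)
```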

  3. An Example • The Traveling Salesman Problem (TSP) • Consider the cities numbered from 1 to n, • The salesman starts from city 1, • visits every city once, and • returns to city 1. • An instance of X is a permutation of cities: • For example, 1 4 3 2 5, if five cities are considered. • The objective function: • The sum of the inter-city distances: • 1 → 4, 4 → 3, 3 → 2, 2 → 5, 5 → 1
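
A hedged Python sketch of this objective function; the distance matrix `d` is made up for illustration — only the permutation (1, 4, 3, 2, 5) and the leg structure 1→4, 4→3, 3→2, 2→5, 5→1 come from the slide.

```python
# Hypothetical 5-city symmetric distance matrix (illustrative numbers only);
# d[i][j] is the distance between cities i+1 and j+1.
d = [
    [0,  2, 9, 10, 7],
    [2,  0, 6,  4, 3],
    [9,  6, 0,  8, 5],
    [10, 4, 8,  0, 6],
    [7,  3, 5,  6, 0],
]

def tour_cost(perm, d):
    """Sum of the inter-city distances along the closed tour, e.g.
    1->4, 4->3, 3->2, 2->5, 5->1 for perm = (1, 4, 3, 2, 5)."""
    legs = zip(perm, perm[1:] + perm[:1])  # consecutive pairs, wrapping to city 1
    return sum(d[a - 1][b - 1] for a, b in legs)

print(tour_cost((1, 4, 3, 2, 5), d))  # cost of the slide's example instance
```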

  4. Heuristic Functions • A Heuristic algorithm is an algorithm which attempts to find an instance of X that maximizes the objective function. • It iteratively invokes a Heuristic function. • The heuristic function estimates (or measures) the cost of the current solution. • The heuristic itself is a method that performs one or more changes to the current instance.
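
The slide's loop might look like the following Python sketch (an assumed skeleton, not the authors' algorithm): the heuristic here is a random two-city swap, and `heuristic_fn` stands for any callable that estimates the cost of an instance.

```python
import random

def heuristic_search(start, heuristic_fn, steps=1000, seed=0):
    """Iteratively apply one change (swap two cities) and keep it whenever
    the heuristic function estimates an improvement (lower cost)."""
    rng = random.Random(seed)
    current = list(start)
    for _ in range(steps):
        i, j = rng.sample(range(len(current)), 2)
        candidate = current[:]
        candidate[i], candidate[j] = candidate[j], candidate[i]
        if heuristic_fn(candidate) < heuristic_fn(current):
            current = candidate
    return current

# Usage with the TSP objective above (treating the true cost as the estimate):
# best = heuristic_search((1, 4, 3, 2, 5), lambda p: tour_cost(p, d))
```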

  5. An Open Problem • Consider a Heuristic algorithm that invokes either of two Heuristic functions, H1 and H2, • used in estimating the solution to an optimization problem. • If the estimation accuracy of H1 > the estimation accuracy of H2, • does it imply that H1 has a higher probability of leading to the optimal solution (e.g., the optimal query evaluation plan, QEP)?

  6. Pattern Recognition Modeling • Two heuristic functions: H1 and H2. • The probability of choosing a cost value of a solution is modeled by two independent random variables, X1 and X2. • Their distribution is doubly exponential: f(x) = (λi/2)·e^(−λi·|x − c|), where λi > 0 and c is the mean cost of the solution being estimated.
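
A small Python sketch of this model (the λ values are illustrative): NumPy parameterizes the doubly exponential (Laplace) distribution by its scale b = 1/λ, so a larger λ means a more accurate estimator, since Var[Xi] = 2/λi².

```python
import numpy as np

def d_exp_pdf(x, lam, c=0.0):
    """Doubly exponential density f(x) = (lam / 2) * exp(-lam * |x - c|)."""
    return 0.5 * lam * np.exp(-lam * np.abs(x - c))

rng = np.random.default_rng(0)
lam1, lam2 = 2.0, 1.0                        # illustrative accuracies for H1, H2
x1 = rng.laplace(loc=0.0, scale=1.0 / lam1, size=100_000)
x2 = rng.laplace(loc=0.0, scale=1.0 / lam2, size=100_000)
print(x1.var(), 2 / lam1**2)                 # empirical vs. theoretical variance
print(x2.var(), 2 / lam2**2)
```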

  7. Pattern Recognition Modeling • Our model: the error function is doubly exponential, • which is typical in reliability analysis and failure models: how reliable is a solution when only an estimate is known? • Assumptions: • The mean cost of the optimal solution is μ; shift the origin by μ so that E[X] = 0. • Variances: Var[X1] < Var[X2], • i.e., the estimate X1 is better than the estimate X2.

  8. Main Result (Exponential) • H1 and H2: two heuristic functions. • X1 and X2: two r.v.'s for the optimal solution's cost as obtained by H1 and H2. • X1′ and X2′: two other r.v.'s for the sub-optimal solution. • Let p1 and p2 be the probabilities that H1 and H2, respectively, make the wrong decision. • It is shown that if λ1 ≥ λ2 (i.e., H1 is the more accurate estimator), then p1 ≤ p2 (a Monte Carlo sketch follows).
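
A Monte Carlo illustration of the theorem (an assumption-laden sketch, not the paper's proof): the origin is shifted so the optimal cost is 0 and the suboptimal cost is c > 0, and "wrong decision" is taken to mean that a heuristic's estimate of the suboptimal solution looks better — here, lower — than its estimate of the optimal one.

```python
import numpy as np

rng = np.random.default_rng(1)
n, c = 1_000_000, 1.0                        # sample count, suboptimal mean cost
lam1, lam2 = 2.0, 1.0                        # lam1 > lam2: H1 is more accurate

def wrong_decision_rate(lam):
    x_opt = rng.laplace(0.0, 1.0 / lam, n)   # estimates of the optimal cost
    x_sub = rng.laplace(c,   1.0 / lam, n)   # estimates of the suboptimal cost
    return np.mean(x_sub < x_opt)            # suboptimal mistakenly looks better

p1, p2 = wrong_decision_rate(lam1), wrong_decision_rate(lam2)
print(p1, p2, p1 <= p2)                      # expect p1 < p2, i.e. True
```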

  9. Proof (Graphical Sketch) • For a particular x, the probability that x leads to the wrong decision by H1 is given by: • [Figure: the four densities X1(subopt), X1(opt), X2(opt), X2(subopt)]

  10. Proof (Cont'd) • or, if x < c: • [Figure: the four densities X1(subopt), X1(opt), X2(opt), X2(subopt)]

  11. Proof (Cont'd) • The total probability that H1 makes the wrong decision, over all values of x, is obtained by integrating the per-x probability against the density of X1: p1 = ∫ P[wrong decision | x]·fX1(x) dx. • Similarly, the probability that H2 makes the wrong decision over all values of x is: p2 = ∫ P[wrong decision | x]·fX2(x) dx.

  12. Proof (Cont'd) • Solving the integrals and comparing p1 with p2, we obtain an expression which, using ln x ≤ x − 1, implies that p1 ≤ p2. QED • Here a1 = λ1·c and a2 = λ2·c. • Also, a2 is substituted by k·a1 (so that k = λ2/λ1).

  13. Second Theorem • F(a1, k) can also be written in terms of a1 and k as a function G(a1, k), which captures the difference p2 − p1. • Suppose that a1 ≥ 0 and 0 ≤ k ≤ 1; • then G(a1, k) ≥ 0, and • there are two solutions for G(a1, k) = 0. • Proof: taking partial derivatives and solving.

  14. Graphical Analysis (Histograms) • R-ACM vs. Eq-width, R-ACM vs. Eq-depth, T-ACM vs. Eq-width, and T-ACM vs. Eq-depth: G ≫ 0, i.e., p1 ≪ p2. • R-ACM vs. T-ACM, and Eq-width vs. Eq-depth: G ≈ 0, i.e., p1 ≈ p2. • The minimum is at a1 = 0, for 0 ≤ k ≤ 1.

  15. Analysis: Normal Distributions • No closed-form integration is possible for the normal pdf. • It has been shown numerically that p1 ≤ p2 (a Monte Carlo sketch follows).
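
The same Monte Carlo sketch as before, switched to normally distributed estimates (again an illustration under the same assumed wrong-decision event, not the authors' numerical procedure); σ1 < σ2 makes H1 the more accurate heuristic function.

```python
import numpy as np

rng = np.random.default_rng(2)
n, c = 1_000_000, 1.0
sigma1, sigma2 = 0.5, 1.0                    # H1's estimates are less spread out

def wrong_decision_rate_normal(sigma):
    x_opt = rng.normal(0.0, sigma, n)        # estimates of the optimal cost
    x_sub = rng.normal(c,   sigma, n)        # estimates of the suboptimal cost
    return np.mean(x_sub < x_opt)

print(wrong_decision_rate_normal(sigma1),    # ~0.08
      wrong_decision_rate_normal(sigma2))    # ~0.24, so p1 < p2 here as well
```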

  16. Plot of the Function G

  17. λ Estimation for Histograms • λ is estimated as λ̂ = N / Σi |xi| (the maximum-likelihood estimate), where N is the # of samples.
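
A sketch of this estimator in Python, assuming the slide's formula is the standard maximum-likelihood estimate for the doubly exponential scale with zero mean (the reconstruction above): λ̂ = N / Σ|xi|.

```python
import numpy as np

rng = np.random.default_rng(3)
true_lam = 2.0
samples = rng.laplace(loc=0.0, scale=1.0 / true_lam, size=10_000)

# MLE for the zero-mean doubly exponential: lambda_hat = N / sum(|x_i|)
lam_hat = len(samples) / np.sum(np.abs(samples))
print(lam_hat)  # close to the true value 2.0
```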

  18. Similarities of R-ACM and d-Exp • [Figure: the λ estimated for the R-ACM vs. the true doubly exponential distribution]

  19. Simulation Details • Simulations performed in Query Optimization: • 4 independent runs per simulation, • 100 random databases per run → 400 per simulation, • 6 relations, • 6 attributes per relation, • 100 tuples per relation. • Four independent runs on 100 databases: R-ACM vs. traditional histograms, using 11 bins and 50 values.

  20. Empirical Results • [Chart: the number of times R-ACM, Eq-width, and Eq-depth each yield the better QEP]

  21. Conclusions • Applied PR techniques to solve the problem of relating heuristic function accuracy to solution optimality. • Used a reasonable model of accuracy (the doubly exponential distribution). • Showed analytically how the higher accuracy of a heuristic function leads to superior solutions. • Showed the results numerically for normal distributions. • Showed that R-ACM yields better QEPs more often than Equi-width and Equi-depth. • Empirical results on randomly generated databases also show the superiority of R-ACM. • Graphically demonstrated the validity of our model.
