
Experimental Study of Algorithms: Understanding Computational Complexity and Runtime Distributions

This masterclass delves into the scientific use of experimentation in studying algorithms, specifically focusing on understanding computational complexity beyond worst-case scenarios and analyzing runtime distributions of complete search methods. Topics covered include benchmarks, phase transition phenomena, heavy-tailed phenomena, tractable sub-structure, backdoors, and performance evaluation of solvers on real-world problems.



Presentation Transcript


  1. Master Class on Experimental Study of Algorithms: Scientific Use of Experimentation. Carla P. Gomes, Cornell University. CPAIOR, Bologna, Italy, 2010

  2. Big Picture of Topics Covered in this Talk
  Part I
  • Understanding computational complexity beyond worst-case complexity
  • Benchmarks: the role of random distributions (Random SAT)
  • Typical-case analysis vs. worst-case complexity analysis: phase transition phenomena
  Part II
  • Understanding runtime distributions of complete search methods
  • Heavy and fat-tailed phenomena in combinatorial search, and restart strategies
  • Understanding tractable sub-structure: backdoors and tractable sub-structure
  • Formal models of heavy tails and backdoors
  • Performance of current state-of-the-art solvers on real-world structured problems, exploiting backdoors

  3. II - Understanding runtime distributions of complete search methods

  4. Outline • Complete randomized backtrack search methods • Runtime distributions of complete randomized backtrack search methods

  5. Complete Randomized Backtrack search methods

  6. Exact / Complete Backtrack Methods. Main underlying (search) mechanisms in Mathematical Programming (MP), Constraint Programming (CP), and Satisfiability → Backtrack Search; Branch & Bound; Branch & Cut; Branch & Price; Davis-Putnam-Logemann-Loveland procedure (DPLL); …

  7. [Figure: branch-and-bound search tree for the knapsack problem below, branching on x1, x2, x3, …; internal nodes carry LP relaxation bounds (44) and leaves carry values such as 42 and 43.]
  maximize 16x1 + 22x2 + 12x3 + 8x4 + 11x5 + 19x6
  subject to 5x1 + 7x2 + 4x3 + 3x4 + 4x5 + 6x6 ≤ 14
  xj binary for j = 1 to 6
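As a quick check on the numbers in the tree above, a brute-force enumeration of all 2^6 assignments (a minimal sketch of mine, not part of the original deck) recovers the best integer solution:

```python
from itertools import product

# Knapsack instance from the slide.
values = [16, 22, 12, 8, 11, 19]
weights = [5, 7, 4, 3, 4, 6]
capacity = 14

best_value, best_x = -1, None
for x in product([0, 1], repeat=6):                  # all 2^6 = 64 assignments
    if sum(w * xi for w, xi in zip(weights, x)) <= capacity:
        value = sum(v * xi for v, xi in zip(values, x))
        if value > best_value:
            best_value, best_x = value, x

print(best_value, best_x)   # 43 with x = (1, 0, 0, 1, 0, 1), i.e. items 1, 4, 6
```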

  8. Backtrack Search - Satisfiability. (a OR NOT b OR NOT c) AND (b OR NOT c) AND (a OR c). State-of-the-art complete solvers are based on backtrack search procedures (typically with unit propagation, learning, randomization, restarts);
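To make "backtrack search with unit propagation" concrete, here is a minimal DPLL-style sketch in Python (my own simplified illustration, not the slides' code; learning, randomization, and restarts are omitted):

```python
def unit_propagate(clauses, assignment):
    """Repeatedly assign literals forced by unit clauses."""
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            if any(assignment.get(abs(l)) == (l > 0) for l in clause):
                continue                      # clause already satisfied
            unassigned = [l for l in clause if abs(l) not in assignment]
            if not unassigned:
                return False                  # conflict: clause falsified
            if len(unassigned) == 1:
                lit = unassigned[0]
                assignment[abs(lit)] = (lit > 0)
                changed = True
    return True

def dpll(clauses, assignment=None):
    assignment = dict(assignment or {})
    if not unit_propagate(clauses, assignment):
        return None
    free = {abs(l) for c in clauses for l in c} - set(assignment)
    if not free:
        return assignment                     # all variables assigned: SAT
    var = min(free)                           # naive branching heuristic
    for value in (True, False):
        result = dpll(clauses, {**assignment, var: value})
        if result is not None:
            return result
    return None                               # both branches failed

# The slide's formula, with a=1, b=2, c=3:
print(dpll([[1, -2, -3], [2, -3], [1, 3]]))
```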

  9. Randomization in Complete Backtrack Search Methods

  10. Motivation: Randomization in Local Search. The use of randomization has been very successful in the area of local search and metaheuristics: simulated annealing, genetic algorithms, Tabu Search, GSAT, Walksat and variants. Limitation: the inherent incomplete nature of local search methods, which cannot prove optimality or inconsistency.

  11. Randomized Backtrack Search. What if we introduce an element of randomness into a complete backtrack search method? Goal: explore the addition of a stochastic element into a systematic search procedure without losing completeness.

  12. Randomized Backtrack Search. Several ways of introducing randomness into a backtrack search method:
  • simple way → randomly breaking ties in variable and/or value selection;
  • general framework → imposing a probability distribution on variable/value selection or other search parameters.
  Compare with standard lexicographic tie-breaking. Note: with simple book-keeping we can maintain the completeness of the backtrack search method;

  13. Notes on Randomizing Backtrack Search. Lots of opportunities to introduce randomization, basically at different decision points of backtrack search:
  • Variable/value selection
  • Look-ahead / look-back procedures, e.g.: when and how to perform domain reduction/propagation; what cuts to add
  • Target backtrack points
  • Restarts
  Not necessarily tie-breaking only: more generally, we can define a probability distribution over the set of possible choices at a given decision point (a sketch follows below).
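A minimal sketch of that last point (function and variable names are mine): rather than deterministic tie-breaking, draw the branching variable from a probability distribution over the candidates, biased by any heuristic score:

```python
import math
import random

def choose_branch_variable(candidates, score, rng, temperature=1.0):
    """Draw the next branching variable from a probability distribution
    over the candidates, biased toward higher heuristic scores.
    As temperature -> 0 this approaches best-first selection with
    random tie-breaking; large temperatures approach uniform choice."""
    scores = [score(v) for v in candidates]
    top = max(scores)
    weights = [math.exp((s - top) / temperature) for s in scores]
    return rng.choices(candidates, weights=weights, k=1)[0]

rng = random.Random(42)        # fixed seed, so this "random" run is replayable
variables = ["x1", "x2", "x3", "x4"]
degree = {"x1": 3, "x2": 3, "x3": 2, "x4": 1}        # toy heuristic scores
print(choose_branch_variable(variables, degree.get, rng))
```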

  14. Can we replay a "randomized" run? Yes, since we use pseudo-random numbers: if we save the "seed", we can then repeat the run with the same seed. "Deterministic randomization" (Wolfram 2002): the behavior of some very complex deterministic systems is so unpredictable that it actually appears to be random (e.g., adding nogoods or cutting constraints between restarts, as used in the satisfiability community).
  Notes on Randomizing Backtrack Search (cont.): What if we cannot randomize the code? Randomize the input: randomly rename the variables (Motwani and Raghavan 95). Walsh (99) applied this technique to study the runtime distributions of graph-coloring using a deterministic algorithm based on DSATUR implemented by Trick.
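Both tricks are easy to illustrate (a hedged sketch; any deterministic solver could be run on the renamed instance):

```python
import random

def replayable_run(seed=12345):
    """Two streams seeded identically draw identical 'random' choices,
    so a randomized run can be replayed exactly from its saved seed."""
    rng1, rng2 = random.Random(seed), random.Random(seed)
    assert [rng1.random() for _ in range(5)] == [rng2.random() for _ in range(5)]

def randomize_input(clauses, num_vars, rng):
    """Randomize the input instead of the solver: randomly rename the
    variables of a CNF formula, so that one run of a deterministic
    solver on the renamed instance acts like one randomized run."""
    perm = list(range(1, num_vars + 1))
    rng.shuffle(perm)                      # perm[i-1] is the new name of var i
    rename = lambda lit: perm[abs(lit) - 1] * (1 if lit > 0 else -1)
    return [[rename(l) for l in clause] for clause in clauses]

replayable_run()
print(randomize_input([[1, -2, -3], [2, -3], [1, 3]], 3, random.Random(7)))
```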

  15. Runtime Distributions of Complete Randomized Backtrack search methods

  16. Backtrack Search: Two Different Executions. (a OR NOT b OR NOT c) AND (b OR NOT c) AND (a OR c)

  17. Size of Search Trees in Backtrack Search. The size of the search tree varies dramatically, depending on the order in which we pick the variables to branch on → it is important to choose good heuristics for variable/value selection;

  18. Runtime Distributions of Complete Randomized Backtrack Search Methods. When solving instances of a combinatorial problem, such as the Satisfiability problem or an integer program, with a complete randomized search method such as backtrack search or branch and bound, the runtime on a single individual instance (i.e., over several runs of the same complete randomized procedure on the same instance) exhibits very high variance.
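The measurement setup this implies is simple (a sketch; `run_solver` is a stand-in for any randomized complete solver reporting, say, the number of backtracks of one run):

```python
def runtime_distribution(run_solver, instance, n_runs=1000, cutoff=2000):
    """Rerun the same randomized solver on ONE instance with fresh seeds
    and record each run's cost; runs hitting the cutoff are censored."""
    return [run_solver(instance, seed=i, cutoff=cutoff) for i in range(n_runs)]

def survival(runtimes):
    """Empirical survival function P(X > x) at each observed runtime,
    the quantity plotted on the log-log plots later in the talk."""
    n = len(runtimes)
    return [(x, sum(1 for r in runtimes if r > x) / n)
            for x in sorted(set(runtimes))]
```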

  19. Randomized Backtrack Search on a Latin Square (Order 4). [Figure: runtimes of repeated runs, e.g. 11 and 30; runs marked (*) found no solution before reaching the cutoff of 2000.]

  20. Erratic Behavior of the Sample Mean. [Figure: sample mean of the runtime as a function of the number of runs (axis marks at 500 and 2000 runs); the sample mean fluctuates erratically, spiking to 3500, while the median is 1!]

  21. Heavy-Tailed Distributions: … infinite variance … infinite mean. Introduced by Pareto in the 1920s as a "probabilistic curiosity." Mandelbrot established the use of heavy-tailed distributions to model real-world fractal phenomena. Examples: stock market, earthquakes, weather, ...

  22. The Pervasiveness of Heavy-Tailed Phenomena in Economics, Science, Engineering, and Computation. Annual meeting (2005). [Slide images: the 2004 tsunami; the Northeast blackout of August 2003, with more than 50 million people affected; financial markets with huge crashes (… there are a few billionaires); backtrack search.]

  23. [Figure: tail decay of a standard distribution (finite mean & variance): exponential decay vs. power-law decay.]

  24. Decay of Heavy-Tailed Distributions.
  Standard: exponential decay, e.g. Normal: $\Pr[X > x] \approx C e^{-x^2/2}$ for some constant $C$.
  Heavy-tailed: power-law decay, e.g. Pareto-Lévy: $\Pr[X > x] = C x^{-\alpha}$, with $x > 0$ and $0 < \alpha < 2$.
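A small numeric experiment makes the contrast vivid (my own illustration; Pareto samples are drawn by inverse-transform sampling):

```python
import random

def pareto_sample(rng, alpha=1.0):
    """Inverse-transform sampling: if U ~ Uniform(0,1], then
    U**(-1/alpha) is Pareto-distributed with index alpha."""
    return (1.0 - rng.random()) ** (-1.0 / alpha)

rng = random.Random(0)
n = 100_000
normal = [rng.gauss(0.0, 1.0) for _ in range(n)]
pareto = [pareto_sample(rng, alpha=1.0) for _ in range(n)]

for x in (2, 5, 10, 100):
    p_norm = sum(1 for v in normal if v > x) / n
    p_par = sum(1 for v in pareto if v > x) / n
    print(f"P(X > {x:>3}): normal ~ {p_norm:.5f}, pareto ~ {p_par:.5f}")
# The normal tail is already negligible at x = 5; the Pareto tail
# decays only polynomially (about 1/x for alpha = 1).
```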

  25. Normal, Cauchy, and Levy. [Figure: the three densities; the Normal shows exponential tail decay, while the Cauchy and Levy show power-law decay.]

  26. Tail Probabilities (Standard Normal, Cauchy, Levy)

  27. Fat-Tailed Distributions. Kurtosis = fourth central moment over the square of the second central moment (i.e., the variance): $\text{kurtosis} = \mu_4 / \mu_2^2$. Normal distribution → kurtosis is 3. Fat-tailed distribution → kurtosis > 3 (e.g., exponential, lognormal).
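Measuring this on samples takes a few lines (a sketch; for a Normal sample the estimate should come out near 3, for an exponential sample near 9):

```python
import random

def kurtosis(xs):
    """Sample kurtosis: fourth central moment / variance**2."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m4 / m2 ** 2

rng = random.Random(1)
n = 200_000
print(kurtosis([rng.gauss(0, 1) for _ in range(n)]))       # ~ 3 (normal)
print(kurtosis([rng.expovariate(1.0) for _ in range(n)]))  # ~ 9 (fat-tailed)
```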

  28. Fat and Heavy-Tailed Distributions. Exponential decay for standard distributions, e.g. Normal, Lognormal, Exponential: for the Normal, $\Pr[X > x] \approx C e^{-x^2/2}$. Heavy-tailed: power-law decay, e.g. Pareto-Lévy: $\Pr[X > x] = C x^{-\alpha}$, $x > 0$, $0 < \alpha < 2$.

  29. How to Visually Check for Heavy-Tailed Behavior: the log-log plot of the tail of the distribution exhibits approximately linear behavior.

  30. How to Check for "Heavy Tails"? The log-log plot of the tail of the distribution should be approximately linear, and its slope gives the value of the index $\alpha$: $\alpha \le 1$ implies infinite mean and infinite variance; $1 < \alpha < 2$ implies finite mean but infinite variance.
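In code, the visual check reduces to fitting a line to the empirical survival function on a log-log scale (an illustrative sketch; a careful study would use a tail-index estimator such as Hill's rather than least squares):

```python
import math
import random

def tail_slope(xs, tail_fraction=0.05):
    """Least-squares slope of log P(X > x) vs. log x over the upper tail.
    For a heavy tail P(X > x) ~ C x**(-alpha) the slope is about -alpha."""
    xs = sorted(xs)
    n = len(xs)
    start = int(n * (1 - tail_fraction))
    pts = [(math.log(xs[i]), math.log((n - i) / n)) for i in range(start, n - 1)]
    mx = sum(p[0] for p in pts) / len(pts)
    my = sum(p[1] for p in pts) / len(pts)
    num = sum((px - mx) * (py - my) for px, py in pts)
    den = sum((px - mx) ** 2 for px, py in pts)
    return num / den

rng = random.Random(3)
pareto = [(1.0 - rng.random()) ** -1.0 for _ in range(100_000)]  # alpha = 1
print(tail_slope(pareto))   # roughly -1, i.e. alpha ~ 1
```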

  31. [Figure: densities f(x) of Lognormal(1,1) and Pareto(1) plotted against x; the Pareto(1) has infinite mean and infinite variance.]

  32. Survival Function: Pareto and Lognormal

  33. Example of a Heavy-Tailed Model. Random walk: start at position 0; toss a fair coin: with each head take a step up (+1), with each tail take a step down (-1). X = number of steps the random walk takes to return to position 0.
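The model is easy to simulate (a short sketch; a cap keeps individual runs finite, mirroring a search cutoff):

```python
import random

def return_time(rng, cap=10_000_000):
    """Tosses until a +/-1 random walk first returns to 0; the true
    distribution satisfies P(X > x) ~ c * x**-0.5, so X has no mean."""
    position, steps = 0, 0
    while True:
        position += 1 if rng.random() < 0.5 else -1
        steps += 1
        if position == 0 or steps >= cap:
            return steps

rng = random.Random(0)
times = sorted(return_time(rng) for _ in range(10_000))
print("median:", times[len(times) // 2])      # 2: half the walks return at once
print("99th percentile:", times[int(0.99 * len(times))])   # vastly larger
```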

  34. [Figure: the record of 10,000 tosses of an ideal coin (Feller), showing zero crossings separated by long periods without any zero crossing.]

  35. Heavy-Tails vs. Non-Heavy-Tails. [Figure: log-log plot of the unsolved fraction 1-F(x) against X, the number of steps the walk takes to return to zero. The random walk has median 2 (50% of runs), yet about 0.1% of runs take more than 200,000 steps; the Normal(2,1) and Normal(2,1000000) curves drop off sharply by comparison.]

  36. Heavy-Tailed Behavior in the Quasigroup Completion Problem Domain. [Figure: log-log plot of the unsolved fraction 1-F(x) against the number of backtracks; straight-line tails ranging from 18% unsolved down to 0.002% unsolved => infinite mean.]

  37. To Be or Not To Be Heavy-Tailed Gomes, Fernandez, Selman, Bessiere – CP 04

  38. [Repeat of the plot from slide 36: heavy-tailed behavior in the Quasigroup Completion Problem domain.]

  39. Research Questions (concrete CSP models, complete randomized backtrack search):
  • Can we provide a characterization of heavy-tailed behavior: when does it occur, and when does it not?
  • Can we identify different tail regimes across different constrainedness regions?
  • Can we get further insight into the tail regime by analyzing the concrete search trees produced by the backtrack search method?

  40. Scope of Study • Random Binary CSP Models • Encodings of CSP Models • Randomized Backtrack Search Algorithms • Search Trees • Statistical Tail Regimes Across Constrainedness Regions • Empirical Results • Theoretical Model

  41. Binary Constraint Networks. A finite binary constraint network P = (X, D, C):
  • a set of n variables X = {x1, x2, …, xn};
  • for each variable, a set of finite domains D = {D(x1), D(x2), …, D(xn)};
  • a set C of binary constraints between pairs of variables; a constraint Cij on the ordered set of variables (xi, xj) is a subset of the Cartesian product D(xi) x D(xj) that specifies the allowed combinations of values for the variables xi and xj.
  A solution to the constraint network is an instantiation of the variables such that all constraints are satisfied.

  42. Random Binary CSP Models.
  Model B <N, D, c, t>: N = number of variables; D = size of the domains; c = number of constrained pairs of variables, with c = p1 N(N-1)/2 where p1 is the proportion of binary constraints included in the network; t = tightness of constraints, with t = p2 D^2 where p2 is the proportion of forbidden tuples.
  Model E <N, D, p>: N = number of variables; D = size of the domains; p = proportion of forbidden pairs (out of D^2 N(N-1)/2).
  In this study N ranges from 15 to 50. (Gent et al. 1996; Achlioptas et al. 2000; Xu and Li 2000)
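A Model B generator, under my reading of the parameters above (a sketch; published experiments add further checks, e.g. for flawed variables):

```python
import itertools
import random

def model_b(n, d, c, t, rng):
    """Random binary CSP, Model B <N, D, c, t>: choose exactly c distinct
    pairs of variables to constrain, and for each constrained pair forbid
    exactly t of the d*d value tuples.
    Returns {(i, j): set of forbidden (vi, vj) tuples}."""
    pairs = list(itertools.combinations(range(n), 2))
    constrained = rng.sample(pairs, c)
    tuples = list(itertools.product(range(d), repeat=2))
    return {pair: set(rng.sample(tuples, t)) for pair in constrained}

rng = random.Random(11)
csp = model_b(n=15, d=5, c=30, t=10, rng=rng)   # toy sizes
print(len(csp), "constraints; example:", next(iter(csp.items())))
```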

  43. Typical Case Analysis: Beyond NP-Completeness. Phase transition phenomenon: discriminating "easy" vs. "hard" instances. [Figure: % of solvable instances and mean computational cost plotted against constrainedness; the cost peaks at the phase transition. (Hogg et al. 96)]

  44. Encodings • Direct CSP Binary Encoding • Satisfiability Encoding (direct encoding)

  45. Backtrack Search Algorithms.
  Look-ahead performed:
  • no look-ahead (simple backtracking, BT);
  • removal of values directly inconsistent with the last instantiation performed (forward checking, FC);
  • arc consistency and propagation (maintaining arc consistency, MAC).
  Different heuristics for variable selection (the next variable to instantiate):
  • random (random);
  • variables pre-ordered by decreasing degree in the constraint graph (deg);
  • smallest domain first, ties broken by decreasing degree (dom+deg).
  Different heuristics for value selection:
  • random;
  • lexicographic.
  For the SAT encodings we used the simplified Davis-Putnam-Logemann-Loveland procedure, with static and random variable/value selection. (A forward-checking sketch follows below.)
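For concreteness, a forward-checking variant with random value ordering might look as follows over the Model B representation sketched earlier (illustrative code of mine, not the instrumented solvers used in the study):

```python
import random

def forward_check(csp, domains, assignment, var, val):
    """Remove from future domains the values directly inconsistent with
    var=val; return pruned domains, or None on a domain wipe-out."""
    new = {v: set(d) for v, d in domains.items()}
    new[var] = {val}
    for (i, j), forbidden in csp.items():
        if i == var and j not in assignment:
            new[j] = {b for b in new[j] if (val, b) not in forbidden}
            if not new[j]:
                return None
        elif j == var and i not in assignment:
            new[i] = {a for a in new[i] if (a, val) not in forbidden}
            if not new[i]:
                return None
    return new

def fc_search(csp, domains, assignment, rng):
    """Forward-checking backtrack search with randomized value ordering."""
    if len(assignment) == len(domains):
        return dict(assignment)
    future = [v for v in domains if v not in assignment]
    var = min(future, key=lambda v: len(domains[v]))   # smallest domain first
    values = list(domains[var])
    rng.shuffle(values)                                # random value order
    for val in values:
        pruned = forward_check(csp, domains, assignment, var, val)
        if pruned is not None:
            assignment[var] = val
            result = fc_search(csp, pruned, assignment, rng)
            if result is not None:
                return result
            del assignment[var]
    return None                                        # backtrack

# Usage with the Model B instance `csp` generated in the earlier sketch:
domains = {v: set(range(5)) for v in range(15)}
print(fc_search(csp, domains, {}, random.Random(5)))
```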

  46. Inconsistent Subtrees

  47. Distributions. • Runtime distributions of the backtrack search algorithms; • distributions of the depth of the inconsistent subtrees found during the search. All runs were performed without censorship.

  48. Main Results. (1) Runtime distributions; (2) inconsistent sub-tree depth distributions: dramatically different statistical regimes across the constrainedness regions of the CSP models.

  49. Runtime distributions

  50. Distribution of Depth of Inconsistent Subtrees
