
The Smoothed Analysis of Algorithms

Daniel A. Spielman

MIT

With Shang-Hua Teng (Boston University), John Dunagan (Microsoft Research), and Arvind Sankar (Goldman Sachs)

Outline

Why?

What?

The Simplex Method

Gaussian Elimination

Other Problems

Conclusion


Problem:

Heuristics that work in practice, with no sound theoretical explanation:

Exponential worst-case complexity, but works in practice.

Polynomial worst-case complexity, but much faster in practice.

Heuristic speeds up code, with poor results in the worst case.


Attempted resolution:

Average-case analysis

Measure expected performance on random inputs.


Random is not typical.


Critique of Average-case Analysis

Random objects have very special properties with exponentially high probability.

Actual inputs might not look random.


Smoothed Analysis: a hybrid of worst and average case

worst case: the maximum running time over all inputs

average case: the expected running time on a random input


Smoothed Analysis: a hybrid of worst and average case

The input is perturbed by a Gaussian of standard deviation σ.

smoothed complexity: the maximum, over inputs, of the expected running time on the Gaussian-perturbed input. It sits between the worst case and the average case.
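In symbols, a sketch of the standard definitions (T(x) is the running time on input x, g a standard Gaussian vector; normalization details, such as scaling the perturbation by the norm of x, are omitted):

\[
\begin{aligned}
\text{worst case:} \quad & \max_{x}\; T(x) \\
\text{average case:} \quad & \mathbb{E}_{g}\, T(g) \\
\text{smoothed complexity:} \quad & \max_{x}\; \mathbb{E}_{g}\, T(x + \sigma g), \qquad g \sim \mathcal{N}(0, I).
\end{aligned}
\]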


Smoothed Complexity

Interpolates between worst and average case

Considers neighborhood of every input

If low, all bad inputs are unstable

Complexity Landscape

(Plot: run time over the input space, marking the worst case and the average case.)

Smoothed Complexity Landscape (convolved with Gaussian)

(Plot: run time over the input space after convolving with a Gaussian: the smoothed complexity.)

Classical Example: Simplex Method for Linear Programming

max cᵀx   s.t.   Ax ≤ b

Worst-Case: exponential

Average-Case: polynomial

Widely used in practice


Smoothed Analysis of Simplex Method

max cᵀx   s.t.   Ax ≤ b

becomes, after perturbation,

max cᵀx   s.t.   (A + G)x ≤ b,   where G is Gaussian of standard deviation σ

Theorem: For all A, b, c, the simplex method takes expected time polynomial in m, d, and 1/σ.
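A minimal MATLAB sketch of this perturbation model (illustrative only: it assumes the Optimization Toolbox's linprog, which need not run the shadow-vertex simplex, and the sizes m, d and the value of sigma are made up):

m = 50; d = 10; sigma = 1e-3;
A = randn(m, d); b = ones(m, 1); c = randn(d, 1);
G = sigma * randn(m, d);                     % Gaussian perturbation of std dev sigma
lb = -10*ones(d, 1); ub = 10*ones(d, 1);     % box bounds just to keep the LP bounded
x = linprog(-c, A + G, b, [], [], lb, ub);   % maximize c'*x subject to (A+G)*x <= b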


Analysis of Simplex Method

Using Shadow-Vertex Pivot Rule
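The point of the shadow-vertex rule, in one hedged line (the notation here is mine, not from the slides): every vertex the algorithm visits projects to a vertex of the two-dimensional shadow of the feasible region, so

\[
\#\text{pivot steps} \;\le\; \#\text{vertices of the projection of } \{x : Ax \le b\} \text{ onto } \mathrm{span}(t, c),
\]

where t is the starting objective direction; bounding the expected number of shadow vertices therefore bounds the expected number of pivots.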


Polar Form of Linear Programming

max λ   s.t.   λc ∈ ConvexHull(a1, a2, ..., am)

Count pairs in different facets

Pr[ different facets ] < c/N

So, expect c facets.
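A sketch of the expectation calculation behind this bound (the notation is mine; q_1, ..., q_N are N evenly spaced objective directions in the shadow plane): the number of facets the shadow path crosses is at most the number of adjacent pairs whose maximizers lie on different facets, so

\[
\mathbb{E}[\#\text{facets}] \;\le\; \sum_{i=1}^{N} \Pr[\, q_i \text{ and } q_{i+1} \text{ maximized on different facets} \,] \;<\; N \cdot \frac{c}{N} \;=\; c .
\]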


(Figure: angle and distance to a facet.)

Smoothed Analysis of Simplex Method (recap)

Theorem: For all A, b, c, the simplex method takes expected time polynomial in m, d, and 1/σ.

Interior Point Methods for Linear Programming

Number of iterations, by analysis method:

Observation (in practice)

Worst-case, upper bound

Worst-case, lower bound

Average-case

Smoothed, upper bound   [Dunagan-S-Teng], [S-Teng]

Conjecture


Gaussian Elimination for Ax = b

>> A = randn(2)
A =
   -0.4326    0.1253
   -1.6656    0.2877
>> b = randn(2,1)
b =
   -1.1465
    1.1909
>> x = A \ b
x =
   -5.6821
  -28.7583
>> norm(A*x - b)
ans =
  8.0059e-016


Gaussian Elimination for Ax = b

>> A = 2*eye(70) - tril(ones(70));
>> A(:,70) = 1;
>> b = randn(70,1);
>> x = A \ b;
>> norm(A*x - b)
ans =
  3.5340e+004

Failed!

Perturb A:

>> Ap = A + randn(70) / 10^9;
>> x = Ap \ b;
>> norm(Ap*x - b)
ans =
  5.8950e-015
>> norm(A*x - b)
ans =
  3.6802e-008

Solved original too!
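Why the solution of the perturbed system also fits the original one (a back-of-the-envelope bound; the slide gives no formula, so the notation is mine):

\[
\|Ax - b\| \;\le\; \|A_p x - b\| + \|A - A_p\|\,\|x\|,
\]

and here \(\|A - A_p\|\) is about \(10^{-9}\) times the norm of a 70×70 Gaussian matrix, so a residual near \(10^{-8}\) for the original system is what one should expect.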



Gaussian Elimination with Partial Pivoting

A fast heuristic for maintaining precision by trying to keep entries small:

swap rows not only when the pivot is zero, but at every step, to bring the entry of largest magnitude into the pivot position.
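To connect this to the failed 70×70 example above, one can look at the growth factor, the ratio between the largest entry produced during elimination and the largest entry of A (a quick MATLAB sketch; the variable names are mine):

A = 2*eye(70) - tril(ones(70));
A(:,70) = 1;
[L, U, P] = lu(A);                          % LU with partial pivoting: P*A = L*U
growth = max(abs(U(:))) / max(abs(A(:)));   % largest entry created vs. largest entry of A
% For this matrix the growth factor is roughly 2^69, which is why A \ b
% lost all precision above even though backslash uses partial pivoting.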


Gaussian Elimination with Partial Pivoting

“Gaussian elimination with partial pivoting is utterly stable in practice. In fifty years of computing, no matrix problems that excite an explosive instability are known to have arisen under natural circumstances …

Matrices with large growth factors are vanishingly rare in applications.”

Nick Trefethen

Theorem:

[Sankar-S-Teng]

slide40

Mesh Generation

Parallel complexity of Ruppert’s Delaunay refinement is O((log n/s)²)

Spielman-Teng-Üngör

slide41

Other Smoothed Analyses

Perceptron [Blum-Dunagan]

Quicksort [Banderier-Beier-Mehlhorn]

Parallel connectivity in digraphs [Frieze-Flaxman]

Complex Gaussian Elimination [Yeung]

Smoothed analysis of κ(A) [Wschebor]

On smoothed analysis in dense graphs and formulas [Krivelevich-Sudakov-Tetali]

Smoothed Number of Extreme Points under Uniform Noise [Damerow-Sohler]

Typical Properties of Winners and Losers in Discrete Optimization [Beier-Vöcking]

Multi-Level Feedback scheduling [Becchetti-Leonardi-Marchetti-Schäfer-Vredeveld]

Smoothed motion complexity [Damerow, Meyer auf der Heide, Räcke, Scheideler, Sohler]


Future Smoothed Analyses

Multilevel graph partitioning: smoothed analysis of Chaco and Metis

Differential Evolution and other optimization heuristics

Computing Nash Equilibria


Future Smoothed Analyses

Perturb less!

Preserve zeros

Preserve magnitudes of numbers

Property-preserving perturbations

More discrete smoothed analyses

New algorithms

For more, see the Smoothed Analysis Homepage
