
### The Smoothed Analysis of Algorithms

Daniel A. Spielman

MIT

With Shang-Hua Teng (Boston University), John Dunagan (Microsoft Research), and Arvind Sankar (Goldman Sachs)

Heuristics that work in practice, with no sound theoretical explanation:

- Exponential worst-case complexity, but works in practice
- Polynomial worst-case complexity, but much faster in practice
- A heuristic that speeds up code in practice, with poor results in the worst case

The usual attempted explanation: the worst case is not typical.

Average-case Analysis

Random objects have very special properties with exponentially high probability. But actual inputs might not look random.

Smoothed Analysis: a hybrid of worst and average case. Define the smoothed complexity of an algorithm with running time T as

max_x E_g [ T(x + g) ],   where g is Gaussian of standard deviation σ

That is: the worst case, over inputs x, of the average running time over a Gaussian perturbation of that input.

- Interpolates between worst case (σ → 0) and average case (σ large)
- Considers the neighborhood of every input
- If the smoothed complexity is low, then all bad inputs are unstable: a slight perturbation escapes them
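The definition can be explored numerically. Below is a small sketch (not from the talk; plain Python, hypothetical example): deterministic quicksort with a first-element pivot is quadratic on sorted input, but a Gaussian perturbation of that input drives the expected cost toward the random-input behavior.

```python
import random

def quicksort_comparisons(values):
    """Comparisons used by quicksort with a first-element pivot (iterative)."""
    a = list(values)
    count = 0
    stack = [(0, len(a) - 1)]
    while stack:
        lo, hi = stack.pop()
        if lo >= hi:
            continue
        pivot, i = a[lo], lo + 1
        for j in range(lo + 1, hi + 1):  # Lomuto-style partition
            count += 1
            if a[j] < pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[lo], a[i - 1] = a[i - 1], a[lo]
        stack.append((lo, i - 2))
        stack.append((i, hi))
    return count

random.seed(0)
n, sigma, trials = 200, 50.0, 20
worst = list(range(n))                     # sorted input: worst case for this pivot rule
worst_cost = quicksort_comparisons(worst)  # n(n-1)/2 comparisons

# Smoothed cost at this input: average cost over Gaussian perturbations of it
smoothed_cost = sum(
    quicksort_comparisons([v + random.gauss(0, sigma) for v in worst])
    for _ in range(trials)
) / trials
```

Larger σ moves the estimate toward the average case; σ → 0 recovers the worst case.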

Classical Example: Simplex Method for Linear Programming

max cᵀx  s.t.  Ax ≤ b

- Worst-case: exponential
- Average-case: polynomial
- Widely used in practice

Smoothed Analysis of the Simplex Method

max cᵀx  s.t.  (A + G)x ≤ b,  where G is a Gaussian random matrix of standard deviation σ

Theorem [Spielman-Teng]: For all A, b, c, the simplex method with the shadow-vertex pivot rule takes expected time polynomial in the dimensions of A and 1/σ.

Interior Point Methods for Linear Programming

Number-of-iterations bounds, by analysis method:

- Worst-case, upper bound
- Worst-case, lower bound
- Average-case
- Smoothed, upper bound [Dunagan-S-Teng], [S-Teng]
- Conjecture

Gaussian Elimination for Ax = b

```matlab
>> A = randn(2)
A =
   -0.4326    0.1253
   -1.6656    0.2877
>> b = randn(2,1)
b =
   -1.1465
    1.1909
>> x = A \ b
x =
   -5.6821
  -28.7583
>> norm(A*x - b)
ans =
   8.0059e-016
```

Gaussian Elimination for Ax = b

```matlab
>> A = 2*eye(70) - tril(ones(70));
>> A(:,70) = 1;
>> b = randn(70,1);
>> x = A \ b;
>> norm(A*x - b)
ans =
   3.5340e+004
```

Failed!

Perturb A:

```matlab
>> Ap = A + randn(70) / 10^9;
>> x = Ap \ b;
>> norm(Ap*x - b)
ans =
   5.8950e-015
>> norm(A*x - b)
ans =
   3.6802e-008
```

Solved the original system too!
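The experiment can be reproduced outside MATLAB. Below is a minimal plain-Python sketch; it assumes that MATLAB's `\` here amounts to Gaussian elimination with partial pivoting (the LU-based path for square systems). The exact residuals will differ from the transcript, but both the failure and the rescue-by-perturbation reproduce.

```python
import random

def solve(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # augmented matrix [A | b]
    for k in range(n - 1):
        # partial pivoting: swap up the largest-magnitude entry in column k
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            m = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= m * M[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):  # back substitution
        s = M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))
        x[i] = s / M[i][i]
    return x

def residual(A, x, b):
    """Infinity norm of A*x - b (the transcript uses the 2-norm; same scale)."""
    return max(abs(sum(r[j] * x[j] for j in range(len(x))) - bi)
               for r, bi in zip(A, b))

random.seed(1)
n = 70
# The matrix from the transcript: 2*eye(70) - tril(ones(70)), last column all ones
A = [[(2.0 if i == j else 0.0) - (1.0 if j <= i else 0.0) for j in range(n)]
     for i in range(n)]
for i in range(n):
    A[i][n - 1] = 1.0
b = [random.gauss(0, 1) for _ in range(n)]

x = solve(A, b)
res_bad = residual(A, x, b)          # huge: entries grow like 2^69 during elimination

Ap = [[A[i][j] + random.gauss(0, 1) / 10**9 for j in range(n)] for i in range(n)]
xp = solve(Ap, b)
res_perturbed = residual(Ap, xp, b)  # tiny: the perturbed matrix eliminates stably
res_original = residual(A, xp, b)    # small too: xp nearly solves the original system
```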


Gaussian Elimination with Partial Pivoting

A fast heuristic for maintaining precision by keeping entries small: pivot not only on zeros, but to bring the entry of largest magnitude in the column to the top.
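To make the danger concrete: the growth factor of an elimination is the largest intermediate entry divided by the largest entry of the input matrix, and the matrix from the demo above drives it to 2^(n-1) even with partial pivoting. A small plain-Python sketch (not from the talk), on a reduced instance:

```python
def gepp_growth(A):
    """Run Gaussian elimination with partial pivoting; return the growth factor
    (largest entry seen at any stage, divided by the largest entry of A)."""
    n = len(A)
    U = [row[:] for row in A]
    max_a = max(abs(x) for row in A for x in row)
    peak = max_a
    for k in range(n - 1):
        # partial pivoting: bring the largest-magnitude entry of column k to row k
        p = max(range(k, n), key=lambda i: abs(U[i][k]))
        U[k], U[p] = U[p], U[k]
        for i in range(k + 1, n):
            m = U[i][k] / U[k][k]
            for j in range(k, n):
                U[i][j] -= m * U[k][j]
        peak = max(peak, max(abs(x) for row in U for x in row))
    return peak / max_a

n = 20  # a small instance of the 70x70 matrix from the demo
A = [[(2.0 if i == j else 0.0) - (1.0 if j <= i else 0.0) for j in range(n)]
     for i in range(n)]
for i in range(n):
    A[i][n - 1] = 1.0

growth = gepp_growth(A)  # the last column doubles at every step: growth is 2^(n-1)
```

Pivoting never fires here: no entry below the diagonal ever exceeds the pivot, so the doubling pattern in the last column survives. This is exactly the rare structure the quote below refers to.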

Gaussian Elimination with Partial Pivoting

“Gaussian elimination with partial pivoting is utterly stable in practice. In fifty years of computing, no matrix problems that excite an explosive instability are know to have arisen under natural circumstances …

Matrices with large growth factors are vanishingly rare in applications.”

Nick Trefethen


Theorem [Sankar-S-Teng]: a smoothed bound on the growth factor of Gaussian elimination.

Parallel complexity of Ruppert's Delaunay refinement is O((log n/s)^2) [Spielman-Teng-Üngör]

Further smoothed analyses:

- Perceptron [Blum-Dunagan]
- Quicksort [Banderier-Beier-Mehlhorn]
- Parallel connectivity in digraphs [Frieze-Flaxman]
- Complex Gaussian Elimination [Yeung]
- Smoothed analysis of K(A) [Wschebor]
- On smoothed analysis in dense graphs and formulas [Krivelevich-Sudakov-Tetali]
- Smoothed Number of Extreme Points under Uniform Noise [Damerow-Sohler]
- Typical Properties of Winners and Losers in Discrete Optimization [Beier-Vöcking]
- Multi-level feedback scheduling [Becchetti-Leonardi-Marchetti-Schäfer-Vredeveld]
- Smoothed motion complexity [Damerow, Meyer auf der Heide, Räcke, Scheideler, Sohler]
- Multilevel graph partitioning: smoothed analysis of Chaco and Metis
- Differential evolution and other optimization heuristics
- Computing Nash equilibria

Future directions:

- Perturb less! Preserve zeros; preserve magnitudes of numbers; property-preserving perturbations
- More discrete smoothed analyses
- New algorithms

For more, see the Smoothed Analysis Homepage
