Graph sparsification by effective resistances

Graph Sparsification by Effective Resistances

Daniel Spielman

Nikhil Srivastava

Yale


Sparsification

Approximate any graph G by a sparse graph H.

  • Nontrivial statement about G

  • H is faster to compute with than G

[Figure: a dense graph G and its sparse approximation H]


Cut Sparsifiers [Benczur-Karger’96]

H approximates G if, for every cut S ⊆ V,

(1−ε) · w_G(∂S) ≤ w_H(∂S) ≤ (1+ε) · w_G(∂S)

i.e., the sum of weights of edges leaving S is preserved up to 1±ε.

Can find H with O(n log n / ε²) edges in nearly linear time.


The Laplacian (quick review)

Quadratic form: xᵀ L_G x = Σ_{(u,v)∈E} c_uv (x_u − x_v)²

Positive semidefinite: xᵀ L_G x ≥ 0 for all x.

Ker(L_G) = span(𝟙) if G is connected.
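These facts are easy to sanity-check numerically. A minimal numpy sketch (the triangle graph and the test vector are made up for illustration):

```python
import numpy as np

# Triangle on vertices {0, 1, 2} with unit edge weights.
edges = [(0, 1), (1, 2), (0, 2)]
n = 3
L = np.zeros((n, n))
for u, v in edges:
    L[u, u] += 1; L[v, v] += 1   # degrees on the diagonal
    L[u, v] -= 1; L[v, u] -= 1   # -weight off the diagonal

x = np.array([1.0, 2.0, 4.0])
# Quadratic form equals the sum of c_uv * (x_u - x_v)^2 over edges.
quad = sum((x[u] - x[v]) ** 2 for u, v in edges)
assert np.isclose(x @ L @ x, quad)

# Positive semidefinite, and the all-ones vector spans the kernel
# for a connected graph.
eigs = np.linalg.eigvalsh(L)
assert eigs.min() > -1e-9
assert np.allclose(L @ np.ones(n), 0)
```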


Cuts and the Quadratic Form

For the characteristic vector x_S (x_S(v) = 1 if v ∈ S, else 0),

x_Sᵀ L_G x_S = sum of weights of edges leaving S.

So BK says: (1−ε) · x_Sᵀ L_G x_S ≤ x_Sᵀ L_H x_S ≤ (1+ε) · x_Sᵀ L_G x_S for all x_S ∈ {0,1}^V.

A Stronger Notion

Require the same inequality not just for characteristic vectors, but for all x ∈ ℝⁿ:

(1−ε) · xᵀ L_G x ≤ xᵀ L_H x ≤ (1+ε) · xᵀ L_G x.



1. All Eigenvalues Are Preserved

By Courant-Fischer, λ_i(L_H) ∈ [(1−ε) λ_i(L_G), (1+ε) λ_i(L_G)] for every i,

so G and H have similar eigenvalues.

For spectral purposes, G and H are equivalent.

cf. matrix sparsifiers [AM01, FKV04, AHK05]


2. Linear System Solvers

Conjugate Gradient solves Ax = b in roughly √κ(A) iterations (ignoring log factors), each costing one multiplication by A.


2. Preconditioning

Find an easy B that approximates A. Solve the preconditioned system B⁻¹Ax = B⁻¹b instead.

Time ≈ (number of iterations) × (time to solve in B + time to mult. by A).

Use B = L_H, for a sparsifier H of G?

Spielman-Teng [STOC ’04]: yes — nearly linear time.




Example: Sparsify the Complete Graph by a Ramanujan Expander

G is complete on n vertices; every nonzero eigenvalue of L_G equals n.

H is a d-regular Ramanujan graph, with each edge given weight n/d.

Then every nonzero eigenvalue of L_H lies within a 1 ± O(1/√d) factor of n, so H is a good sparsifier for G.


Example: Dumbbell

[Figure: two copies of Kₙ joined by a single edge of weight 1. Each Kₙ is sparsified by a d-regular Ramanujan graph with edge weights n/d.]


Example: Dumbbell

[Figure: the dumbbell decomposed into G1, G2, G3 — the left Kₙ, the weight-1 cut edge, and the right Kₙ — sparsified piecewise by F1, F2, F3.]


Example: Dumbbell. The sparsifier must include the cut edge e.

For x constant on each clique, only e contributes to xᵀ L_G x; if e is omitted from H, then xᵀ L_H x = 0 and H cannot approximate G.




Main Theorem

Every G = (V, E, c) contains H = (V, F, d) with O(n log n / ε²) edges such that, for all x ∈ ℝⁿ:

(1−ε) · xᵀ L_G x ≤ xᵀ L_H x ≤ (1+ε) · xᵀ L_G x.

Can find H in nearly linear time by random sampling.

Improves [BK’96] (cuts → all quadratic forms).

Improves the O(n logᶜ n) sparsifiers of [ST’04].



How?

Electrical Flows.


Effective Resistance

Identify each edge of G with a unit resistor. R_eff(e) is the effective resistance between the endpoints of e.

[Figure: triangle on vertices u, v, a; each edge is a unit resistor.]


In the triangle, the path u–a–v has resistance 2; in parallel with the unit resistor on edge (u, v), the effective resistance from u to v is (1 · 2)/(1 + 2) = 2/3.


[Figure: inject +1 unit of current at u and extract it at v. The resulting potentials are φ_u = 1/3, φ_a = 0, φ_v = −1/3.]


Effective Resistance

R_eff(e) = potential difference between the endpoints of e when one unit of current flows from one endpoint to the other.


Effective Resistance

The commute time of a random walk between u and v is 2m · R_eff(u, v) [Chandra et al. STOC ’89].
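These values can be checked numerically via the identity R_eff(u, v) = (χ_u − χ_v)ᵀ L⁺ (χ_u − χ_v), using a dense pseudoinverse (a brute-force sketch on the triangle example; not the fast method of the talk):

```python
import numpy as np

# Triangle on {u, v, a} = {0, 1, 2} with unit resistors.
edges = [(0, 1), (0, 2), (1, 2)]
n = 3
L = np.zeros((n, n))
for i, j in edges:
    L[i, i] += 1; L[j, j] += 1
    L[i, j] -= 1; L[j, i] -= 1

Lpinv = np.linalg.pinv(L)

def r_eff(u, v):
    # Effective resistance as a quadratic form in the pseudoinverse.
    b = np.zeros(n); b[u] = 1.0; b[v] = -1.0
    return b @ Lpinv @ b

# Direct edge (resistance 1) in parallel with the 2-edge path
# (resistance 2) gives 2/3.
assert np.isclose(r_eff(0, 1), 2/3)
```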


The Algorithm

Sample edges of G with probability p_e proportional to c_e · R_eff(e).

If e is chosen, include it in H with weight c_e / p_e.

Take q = O(n log n / ε²) samples with replacement.

Divide all weights by q.
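The sampling step can be sketched as follows. This is a brute-force illustration: it computes exact effective resistances from a dense pseudoinverse (the talk's fast version approximates them instead), and the `sparsify` helper and its demo graph are made up for this sketch:

```python
import numpy as np

def sparsify(edges, weights, n, q, rng=np.random.default_rng(0)):
    """Sample q edges with p_e ∝ c_e * R_eff(e); return a weight map for H."""
    L = np.zeros((n, n))
    for (u, v), c in zip(edges, weights):
        L[u, u] += c; L[v, v] += c
        L[u, v] -= c; L[v, u] -= c
    Lp = np.linalg.pinv(L)

    # p_e ∝ c_e * R_eff(e); these products sum to n - 1 before normalizing.
    p = np.array([c * (Lp[u, u] + Lp[v, v] - 2 * Lp[u, v])
                  for (u, v), c in zip(edges, weights)])
    p /= p.sum()

    H = {}
    for idx in rng.choice(len(edges), size=q, p=p):
        # Each sample contributes c_e / (q * p_e) to the edge's weight.
        e = edges[idx]
        H[e] = H.get(e, 0.0) + weights[idx] / (q * p[idx])
    return H

# Demo on K4 with unit weights: every R_eff is 1/2, so sampling is uniform.
K4 = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
H = sparsify(K4, [1.0] * 6, n=4, q=50)
```

Dividing by q makes the sampled Laplacian an unbiased estimator of L_G: each sample contributes c_e/(q·p_e) in expectation-weighted proportion.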



An Algebraic Expression for R_eff

Orient G arbitrarily.

Signed incidence matrix B ∈ ℝ^{m×n}: B(e, v) = +1 if v is the head of e, −1 if v is the tail, 0 otherwise.

Write the Laplacian as L = BᵀB.


An Algebraic Expression for R_eff

With b_e the row of B for edge e (b_e = χ_u − χ_v), the potentials for a unit u–v current are φ = L⁺ b_e, so

R_eff(e) = b_eᵀ L⁺ b_e.




An Algebraic Expression for R_eff

Then R_eff(e) = (B L⁺ Bᵀ)(e, e).

Reduce the theorem to a statement about Π = B L⁺ Bᵀ.


Goal

Want: (1−ε) · xᵀ L_G x ≤ xᵀ L_H x ≤ (1+ε) · xᵀ L_G x for all x.



Reduction to Π

Lemma. If ǁΠSΠ − Πǁ₂ ≤ ε (S the diagonal matrix of sampled weights), then the goal holds.


New Goal

Show that, with the sampled weights S, ǁΠSΠ − Πǁ₂ ≤ ε.



The Algorithm

Sample columns of Π with probability p_e proportional to ǁΠ(·, e)ǁ².

If chosen, include the column with weight 1/p_e.

Take q = O(n log n / ε²) samples with replacement.

Divide all weights by q.

cf. low-rank approximation [FKV04, RV07]



A Concentration Result

[RV07]-style concentration: with q = O(n log n / ε²) samples, E ǁΠSΠ − Πǁ₂ ≤ ε/2 (S the diagonal matrix of sampled weights). So with prob. ½:

ǁΠSΠ − Πǁ₂ ≤ ε.





Nearly Linear Time

So we care about distances between the columns of BL⁻¹ (R_eff(u, v) is the squared distance between columns u and v).

Johnson-Lindenstrauss! Take a random Q ∈ ℝ^{O(log n) × m} and set Z = QBL⁻¹.




Nearly Linear Time

Find the rows of Z ∈ ℝ^{O(log n) × n} by solving: Z = QBL⁻¹ ⟺ ZL = QB ⟺ z_i L = (QB)_i.

Solve these O(log n) linear systems in L using the Spielman-Teng ’04 solver,

which uses combinatorial O(n logᶜ n) sparsifiers.

Can show approximate R_eff suffice.
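The JL step can be sketched as follows. This is a brute-force illustration: a dense pseudoinverse stands in for the Spielman-Teng solver, the Gaussian projection and the K5 example are made up, and only the O(log n / ε²) projection dimension reflects the actual scheme:

```python
import numpy as np

rng = np.random.default_rng(1)

# Incidence matrix of K5 (unit weights), oriented arbitrarily.
n = 5
edges = [(u, v) for u in range(n) for v in range(u + 1, n)]
m = len(edges)
B = np.zeros((m, n))
for e, (u, v) in enumerate(edges):
    B[e, u], B[e, v] = 1.0, -1.0

L = B.T @ B
Lp = np.linalg.pinv(L)           # stand-in for the fast Laplacian solver

k = 200                          # O(log n / eps^2) rows; large for a tight check
Q = rng.normal(size=(k, m)) / np.sqrt(k)
Z = Q @ B @ Lp                   # k x n sketch of the columns of B L^+

def approx_reff(u, v):
    # Squared distance between sketched columns approximates R_eff(u, v).
    return float(np.sum((Z[:, u] - Z[:, v]) ** 2))

# In K_n every effective resistance is 2/n = 0.4 here.
assert abs(approx_reff(0, 1) - 0.4) < 0.2
```

Only the small sketch Z needs to be stored; each approximate resistance is then an O(log n)-dimensional distance computation.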


Main Conjecture

Sparsifiers with O(n) edges.


Example: Another Edge to Include

[Figure: two k-by-k complete bipartite graphs connected by additional edges; the labels 0, 1, m−1, m mark edges that any sparsifier must include.]


The Projection Matrix

Lemma. Π = B L⁺ Bᵀ satisfies:

  • Π is a projection matrix

  • im(Π) = im(B)

  • Tr(Π) = n − 1

  • Π(e, e) = ǁΠ(·, e)ǁ²
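These properties are easy to confirm numerically (a small numpy sketch; the triangle example graph is made up for illustration):

```python
import numpy as np

# Triangle graph, arbitrarily oriented, unit weights.
n = 3
edges = [(0, 1), (1, 2), (0, 2)]
B = np.zeros((len(edges), n))
for e, (u, v) in enumerate(edges):
    B[e, u], B[e, v] = 1.0, -1.0

L = B.T @ B
P = B @ np.linalg.pinv(L) @ B.T   # Pi = B L^+ B^T

assert np.allclose(P @ P, P)                      # projection: Pi^2 = Pi
assert np.isclose(np.trace(P), n - 1)             # Tr(Pi) = n - 1
for e in range(len(edges)):                       # Pi(e,e) = ||Pi(:,e)||^2
    assert np.isclose(P[e, e], np.sum(P[:, e] ** 2))
```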






Last Steps

We also have p_e ∝ R_eff(e) = Π(e, e) and Π(e, e) = ǁΠ(·, e)ǁ², since Π is a projection.



Reduction to Π

Goal: ∀x, (1−ε) · xᵀ L_G x ≤ xᵀ L_H x ≤ (1+ε) · xᵀ L_G x.

Write L_G = BᵀB and L_H = BᵀSB, with S the diagonal matrix of sampled weights.

Then the goal becomes: sup over y ∈ im(B) of |yᵀ(S − I)y| / ǁyǁ² ≤ ε, i.e. ǁΠ(S − I)Πǁ₂ ≤ ε.







Reduction to Π

Lemma.

Proof. Π is the projection onto im(B).

