
# Graph Sparsification by Effective Resistances - PowerPoint PPT Presentation

Graph Sparsification by Effective Resistances. Daniel Spielman, Nikhil Srivastava (Yale). Sparsification: approximate any graph G by a sparse graph H; this is a nontrivial statement about G, and H is faster to compute with than G. Cut sparsifiers [Benczur-Karger '96]: H approximates G if every cut of G is preserved.



## PowerPoint Slideshow about 'graph sparsification by effective resistances' - libitha


Presentation Transcript

### Graph Sparsification by Effective Resistances

Daniel Spielman

Nikhil Srivastava

Yale

### Sparsification

Approximate any graph G by a sparse graph H.

• Nontrivial statement about G

• H is faster to compute with than G

[Figure: a dense graph G and its sparse approximation H]

### Cut Sparsifiers [Benczur-Karger '96]

H approximates G if, for every cut S ⊆ V, the sum of the weights of the edges leaving S is preserved up to a factor of 1 ± ε:

(1 − ε) · cut_G(S) ≤ cut_H(S) ≤ (1 + ε) · cut_G(S)

Can find H with O(n log n / ε²) edges in nearly linear time.

### The Laplacian L_G

x^T L_G x = Σ_{(u,v) ∈ E} w_uv (x_u − x_v)²

• Positive semidefinite

• Ker(L_G) = span(1) if G is connected

For the characteristic vector x_S of a set S (x_S(v) = 1 if v ∈ S, else 0), x_S^T L_G x_S is the total weight of the edges leaving S. So BK says:

(1 − ε) x_S^T L_G x_S ≤ x_S^T L_H x_S ≤ (1 + ε) x_S^T L_G x_S for every x_S ∈ {0,1}^V.
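The cut identity used above (the Laplacian quadratic form of a 0/1 vector equals the cut weight) can be checked numerically. A minimal numpy sketch; the `laplacian` helper and the example graph are illustrative, not from the slides:

```python
import numpy as np

def laplacian(n, edges):
    """edges: list of (u, v, weight). Returns the n-by-n graph Laplacian."""
    L = np.zeros((n, n))
    for u, v, w in edges:
        L[u, u] += w
        L[v, v] += w
        L[u, v] -= w
        L[v, u] -= w
    return L

edges = [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 1.0), (2, 3, 3.0)]
L = laplacian(4, edges)

S = {0, 1}  # a cut: edges (1,2) and (0,2) cross it, total weight 3
x = np.array([1.0 if i in S else 0.0 for i in range(4)])
cut_weight = sum(w for u, v, w in edges if (u in S) != (v in S))
assert np.isclose(x @ L @ x, cut_weight)
```

The identity is exact, not approximate: x^T L x = Σ w_uv (x_u − x_v)², and (x_u − x_v)² is 1 exactly on the crossing edges.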

### Spectral sparsification

Demand the same inequality for every x ∈ ℝ^V, not just characteristic vectors:

(1 − ε) x^T L_G x ≤ x^T L_H x ≤ (1 + ε) x^T L_G x for all x ∈ ℝ^V.

By Courant-Fischer, G and H then have similar eigenvalues: for spectral purposes, G and H are equivalent.

cf. matrix sparsifiers [AM01,FKV04,AHK05]

### Application: solving linear systems

To solve Ax = b iteratively, find an easy matrix B that approximates A. The running time is roughly

(number of iterations) × (time to solve a system in B + time to multiply by A),

where the multiply-by-A term can be ignored since it is cheap.

Use B = L_H for a sparsifier H of G?

Spielman-Teng [STOC '04]: nearly linear time solvers for Laplacian systems.
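The cost trade-off above can be seen in a toy preconditioned iteration: to solve Ax = b, iterate x ← x + B⁻¹(b − Ax); each step costs one multiply by A plus one solve in B, and if B spectrally approximates A the error contracts by a constant factor per step. A hedged numpy sketch, with a hand-made SPD matrix standing in for a Laplacian and a perturbation of it standing in for a sparsifier (all names and constants are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)        # well-conditioned SPD test matrix
B = A + 0.1 * np.diag(np.diag(A))  # crude spectral approximation of A

b = rng.standard_normal(n)
x = np.zeros(n)
for _ in range(50):
    # one multiply by A, one solve in B per iteration
    x = x + np.linalg.solve(B, b - A @ x)

assert np.allclose(A @ x, b, atol=1e-8)
```

In the intended application the solve in B = L_H is cheap because H is sparse, which is exactly why sparsifiers speed up Laplacian solvers.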

### Example: Ramanujan expanders

G is the complete graph on n vertices. H is a d-regular Ramanujan graph with each edge given weight n/d. The Ramanujan eigenvalue bound then gives

(1 − 2/√d) x^T L_G x ≤ x^T L_H x ≤ (1 + 2/√d) x^T L_G x,

so H is a good sparsifier for G.

[Figure: K_n alongside a d-regular Ramanujan graph with edge weights n/d]

[Figure: a "union" example: copies G1, G2, G3 of K_n joined by single edges of weight 1; each copy is replaced by a d-regular Ramanujan graph F1, F2, F3 with edge weights n/d, and the weight-1 joining edges are kept]

### Example: Dumbbell

Two copies of K_n joined by a single edge e. Only this edge contributes to x^T L_G x when x is the characteristic vector of one of the copies. If e is not included in H, that quadratic form drops to zero and H cannot be a sparsifier: any sampling scheme must keep e.

### Main theorem

Every G = (V, E, c) contains a weighted subgraph H = (V, F, d) with O(n log n / ε²) edges such that, for all x ∈ ℝ^V:

(1 − ε) x^T L_G x ≤ x^T L_H x ≤ (1 + ε) x^T L_G x.

Can find H in nearly linear time by random sampling.

• Improves [BK '96] (cuts → all quadratic forms)

• Improves the O(n log^c n)-edge sparsifiers of [ST '04]

### Electrical Flows

Identify each edge of G with a unit resistor; R_eff(e) is the effective resistance between the endpoints of e.

Example (triangle on u, v, a): the path u-a-v has resistance 1 + 1 = 2 and the direct edge u-v has resistance 1. The two are in parallel, so the resistance from u to v is (1/1 + 1/2)^{-1} = 2/3.

[Figure: the triangle with one unit of current injected at u and extracted at v; the edge u-v carries 2/3 of the current and the path u-a-v carries 1/3]

Equivalently, R_eff(e) is the potential difference between the endpoints of e when one unit of current flows from one endpoint to the other.

[Chandra et al., STOC '89]
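The triangle computation above can be reproduced algebraically, using the identity R_eff(u,v) = b_uv^T L^+ b_uv that the talk develops later (b_uv is the signed indicator of the pair). A minimal numpy sketch:

```python
import numpy as np

# Laplacian of the triangle on vertices ordered u, v, a (unit resistors).
L = np.array([[ 2., -1., -1.],
              [-1.,  2., -1.],
              [-1., -1.,  2.]])
Lpinv = np.linalg.pinv(L)          # pseudoinverse, since L is singular

b_uv = np.array([1., -1., 0.])     # signed indicator of the pair (u, v)
R_eff = b_uv @ Lpinv @ b_uv
assert np.isclose(R_eff, 2/3)      # matches the series/parallel calculation
```

The pseudoinverse is needed because L always has the all-ones vector in its kernel; b_uv is orthogonal to that kernel, so the quadratic form is well defined.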

### The sampling scheme

Sample edges of G with probability p_e proportional to c_e · R_eff(e). If e is chosen, include it in H with weight c_e / p_e. Take q = O(n log n / ε²) samples with replacement, then divide all weights by q.
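A hedged numpy sketch of this scheme, using exact effective resistances computed via a dense pseudoinverse (fine for a toy graph; the talk's point is precisely how to avoid this dense computation). The function name `sample_sparsifier` and the test graph are illustrative, not from the slides:

```python
import numpy as np

def sample_sparsifier(n, edges, q, rng):
    """edges: list of (u, v, weight). Sample q edges with replacement,
    p_e proportional to w_e * R_eff(e); each pick adds w_e/(q*p_e) weight."""
    m = len(edges)
    L = np.zeros((n, n))
    B = np.zeros((m, n))
    for i, (u, v, w) in enumerate(edges):
        L[u, u] += w; L[v, v] += w
        L[u, v] -= w; L[v, u] -= w
        B[i, u], B[i, v] = 1.0, -1.0
    Lp = np.linalg.pinv(L)
    reff = np.array([B[i] @ Lp @ B[i] for i in range(m)])
    w = np.array([e[2] for e in edges])
    p = w * reff / np.sum(w * reff)          # sum(w*reff) = n-1 if connected
    picks = rng.choice(m, size=q, p=p)
    new_w = np.zeros(m)
    for i in picks:
        new_w[i] += w[i] / (q * p[i])
    return [(edges[i][0], edges[i][1], new_w[i]) for i in np.flatnonzero(new_w)]

rng = np.random.default_rng(0)
G = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0), (3, 0, 1.0), (0, 2, 1.0)]
H = sample_sparsifier(4, G, q=200, rng=rng)
```

By construction the expected Laplacian of H equals the Laplacian of G; the concentration argument later in the talk shows the quadratic forms actually stay within 1 ± ε.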

### The incidence matrix

Orient G arbitrarily. The signed incidence matrix B ∈ ℝ^{m×n} has one row per edge: B(e, v) = +1 if v is the head of e, −1 if v is the tail, and 0 otherwise. Write the Laplacian as L = B^T B (weighted: L = B^T C B with C = diag(c_e)).
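The identity L = B^T B (unweighted case) is easy to verify numerically; the orientation is arbitrary because each row enters only through its outer product. A short numpy check on an illustrative 4-vertex graph:

```python
import numpy as np

edges = [(0, 1), (1, 2), (2, 0), (2, 3)]   # an arbitrary orientation
n = 4
B = np.zeros((len(edges), n))
for i, (u, v) in enumerate(edges):
    B[i, u], B[i, v] = 1.0, -1.0           # +1 at head, -1 at tail

L = B.T @ B
degrees = np.array([2, 2, 3, 1])
assert np.allclose(np.diag(L), degrees)    # diagonal = vertex degrees
assert np.allclose(L.sum(axis=1), 0)       # rows sum to zero (kernel = span(1))
```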

### An algebraic expression for R_eff

R_eff(e) = b_e^T L^+ b_e, where b_e is the row of B for edge e and L^+ is the pseudoinverse of L.

Reduce the theorem to a statement about the projection Π = B L^+ B^T. Writing the sampled Laplacian as L_H = B^T S B, where S is the diagonal matrix of sampled, rescaled edge weights, we want

(1 − ε) x^T L x ≤ x^T B^T S B x ≤ (1 + ε) x^T L x for all x.

Lemma. This holds whenever ‖Π S Π − Π‖₂ ≤ ε.

In matrix language, the sampling becomes: sample columns of Π with probability p_e proportional to Π(e, e); if chosen, include the column with weight 1/p_e; take q = O(n log n / ε²) samples with replacement and divide all weights by q.

cf. low-rank approximation [FKV04, RV07]

So with probability at least ½: ‖Π S Π − Π‖₂ ≤ ε, and the theorem follows.

To run the sampling we need the probabilities p_e ∝ c_e · R_eff(e). How can all the effective resistances be computed quickly?

Since R_eff(e) = ‖B L^+ b_e‖², we care about distances between columns of B L^+ (written BL^{-1} on the slides).

Johnson-Lindenstrauss: take a random Q ∈ ℝ^{O(log n) × m} and set Z = Q B L^+. Find the O(log n) rows of Z ∈ ℝ^{O(log n) × n} by solving Z L = Q B, i.e. z_i L = (QB)_i for each row i.

Solve these O(log n) linear systems in L using the Spielman-Teng '04 solver, which itself uses combinatorial O(n log^c n) sparsifiers. One can show that approximate R_eff values suffice.
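A hedged numpy sketch of this procedure: project the columns of B L^+ with a random sign matrix Q and read off all effective resistances as squared column distances of Z = Q B L^+. A dense pseudoinverse stands in here for the Spielman-Teng solver, which would instead solve z_i L = (QB)_i for each of the O(log n) rows; the graph and the row count k are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0), (0, 3)]
m = len(edges)
B = np.zeros((m, n))
for i, (u, v) in enumerate(edges):
    B[i, u], B[i, v] = 1.0, -1.0
L = B.T @ B
Lp = np.linalg.pinv(L)

k = 200                                        # O(log n / eps^2); generous here
Q = rng.choice([-1.0, 1.0], size=(k, m)) / np.sqrt(k)
Z = Q @ B @ Lp                                 # stand-in for solving Z L = Q B

exact = [B[i] @ Lp @ B[i] for i in range(m)]   # R_eff(e) = b_e^T L^+ b_e
approx = [np.sum((Z[:, u] - Z[:, v]) ** 2) for (u, v) in edges]
```

With k = O(log n / ε²) rows, Johnson-Lindenstrauss guarantees every pairwise distance, and hence every R_eff(e), is preserved up to 1 ± ε with high probability.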

### Sparsifiers with O(n) edges

[Figure: an example graph built from two k-by-k complete bipartite blocks, with edge weights including 1 and m − 1]

Lemma. Π = B L^+ B^T satisfies:

• Π is a projection matrix (Π² = Π)

• im(Π) = im(B)

• Tr(Π) = n − 1

• Π(e, e) = ‖Π(e, ·)‖²

The last point holds because Π is symmetric and Π² = Π, so ‖Π χ_e‖² = χ_e^T Π² χ_e = Π(e, e). Note also that Π(e, e) = b_e^T L^+ b_e = R_eff(e), so the sampling probabilities are proportional to the diagonal of Π.

Goal: ‖Π S Π − Π‖₂ ≤ ε with probability at least ½.

Write Π S Π as an average of q independent rank-one samples,

Π S Π = (1/q) Σ_{i=1}^q (1/p_{e_i}) Π(·, e_i) Π(·, e_i)^T.

Then E[Π S Π] = Σ_e Π(·, e) Π(·, e)^T = Π² = Π, and a concentration bound for sums of rank-one matrices [RV07] gives ‖Π S Π − Π‖₂ ≤ ε for q = O(n log n / ε²).

Lemma. If ‖Π S Π − Π‖₂ ≤ ε, then (1 − ε) x^T L x ≤ x^T B^T S B x ≤ (1 + ε) x^T L x for all x.

Proof. Π is the projection onto im(B). For y = Bx we have y = Π y, so

x^T B^T S B x = y^T S y = y^T Π S Π y = y^T y + y^T (Π S Π − Π) y ∈ (1 ± ε) y^T y = (1 ± ε) x^T L x. ∎