Spectral Graph Theory

Outline:
Definitions and different spectra
Physical analogy
Description of bisection algorithm
Relationship of spectrum to graph structure
My own recent work on graphical images

I. Definitions and Different Spectra
If the graph is undirected, A is symmetric. This means its eigenvalues are real and its eigenvectors can be chosen real and orthogonal
Let D be the diagonal matrix of vertex degrees, D = diag(d_1, ..., d_n)
Then the Laplacian is L = D - A
Sometimes normalized as D^(-1/2) L D^(-1/2)
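As a quick sanity check, D, L = D - A, and the normalized form can be assembled for a small graph. A NumPy sketch (the path graph here is just an illustrative example, not from the slides):

```python
import numpy as np

# Small undirected graph: a path on 4 vertices, 0-1-2-3
edges = [(0, 1), (1, 2), (2, 3)]
n = 4

A = np.zeros((n, n))                # adjacency matrix
for i, j in edges:
    A[i, j] = A[j, i] = 1           # undirected: A is symmetric

D = np.diag(A.sum(axis=1))          # diagonal degree matrix
L = D - A                           # combinatorial Laplacian

# Normalized Laplacian D^(-1/2) L D^(-1/2)
Dinv_sqrt = np.diag(1.0 / np.sqrt(np.diag(D)))
L_norm = Dinv_sqrt @ L @ Dinv_sqrt
```

Each row of L sums to zero (the all-ones vector is in its null space), and the normalized Laplacian's spectrum lies in [0, 2].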
Adjacency Matrix:
Proportional to the sum: sum over edges (i,j) of x_i * x_j
In matrix notation: x^T A x

Laplacian Matrix:
Proportional to the differences: sum over edges (i,j) of (x_i - x_j)^2
In matrix notation: x^T L x
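These two quadratic forms can be checked numerically: for an undirected graph, x^T A x equals twice the sum of x_i x_j over edges, and x^T L x equals the sum of squared edge differences. A small NumPy check on an arbitrary example graph:

```python
import numpy as np

rng = np.random.default_rng(0)
edges = [(0, 1), (1, 2), (2, 3), (0, 3), (1, 3)]
n = 4
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1
L = np.diag(A.sum(axis=1)) - A

x = rng.standard_normal(n)          # arbitrary test vector

quad_A = x @ A @ x                  # x^T A x
quad_L = x @ L @ x                  # x^T L x
sum_prod = 2 * sum(x[i] * x[j] for i, j in edges)
sum_diff = sum((x[i] - x[j]) ** 2 for i, j in edges)
```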
[Figure: five masses x1-x5 on a vibrating string]
Seeking solutions to the differential equation x''(t) = -M x(t) of the form: x(t) = x0 sin(a t)
where a and x0 are a scalar and vector
Plugging in yields: -a^2 x0 sin(a t) = -M x0 sin(a t)
Canceling sin terms yields: M x0 = a^2 x0
Therefore eigenvalues of M are the squared frequencies a^2
With eigenvectors the mode shapes x0
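For a string of n unit masses with fixed ends and unit springs (an assumption for this sketch), M is the tridiagonal matrix tridiag(-1, 2, -1), whose eigenvalues are known in closed form as 4 sin^2(k pi / (2(n+1))). A NumPy check:

```python
import numpy as np

n = 5  # five masses on the string
# Equation of motion x'' = -M x with M = tridiag(-1, 2, -1)
M = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

vals, vecs = np.linalg.eigh(M)      # ascending eigenvalues, orthonormal modes

# Closed form: eigenvalues 4 sin^2(k*pi / (2(n+1))), k = 1..n
k = np.arange(1, n + 1)
expected = 4 * np.sin(k * np.pi / (2 * (n + 1))) ** 2
```

The eigenvectors are the discrete sine modes, i.e. the harmonics of the string.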
[Figure: first and second harmonics of the string]
Demmel endorses the sign cut; most applied researchers seem to favor the median cut, while mathematicians (and Shi & Malik) favor the ratio cut
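A minimal sketch of spectral bisection with the sign and median cuts (dense NumPy for clarity; the example graph, two triangles joined by a bridge, is my own illustration):

```python
import numpy as np

def spectral_bisection(A, cut="sign"):
    """Bisect a graph using the Fiedler vector of its Laplacian."""
    L = np.diag(A.sum(axis=1)) - A
    vals, vecs = np.linalg.eigh(L)
    fiedler = vecs[:, 1]                      # eigenvector of second-smallest eigenvalue
    if cut == "sign":
        return fiedler >= 0                   # sign cut: split at zero
    return fiedler >= np.median(fiedler)      # median cut: balanced halves

# Two triangles {0,1,2} and {3,4,5} joined by the bridge edge (2,3)
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
n = 6
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1

part = spectral_bisection(A)                  # separates the two triangles
```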
A. The Lanczos method
The Lanczos method takes an n x n sparse, symmetric matrix A and computes a k x k tridiagonal matrix T whose eigenvalues/eigenvectors are good approximations of those of A
Even with k much smaller than n, the approximation is fairly good
Fortunately, the values that converge first are the largest and smallest, including the Fiedler value
Choose an arbitrary starting vector r; set v(0) = 0
b(0) = norm(r)
i = 0
while not converged
    i = i + 1
    v(i) = r / b(i-1)
    r = A*v(i)
    r = r - b(i-1)*v(i-1)
    a(i) = dotproduct(v(i), r)
    r = r - a(i)*v(i)
    b(i) = norm(r)
end
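The pseudocode above translates almost line-for-line into NumPy. This sketch runs a fixed number of steps k instead of a convergence test, and omits the reorthogonalization a production implementation would need:

```python
import numpy as np

def lanczos(A, k, seed=0):
    """Reduce symmetric A to a k x k tridiagonal T whose extreme
    eigenvalues approximate those of A (no reorthogonalization)."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    r = rng.standard_normal(n)          # arbitrary starting vector
    beta = np.linalg.norm(r)            # b(0)
    v_prev = np.zeros(n)                # v(0) = 0
    alphas, betas = [], []
    for _ in range(k):
        v = r / beta                    # v(i) = r / b(i-1)
        r = A @ v - beta * v_prev       # r = A v(i) - b(i-1) v(i-1)
        alpha = v @ r                   # a(i) = v(i) . r
        r = r - alpha * v
        beta = np.linalg.norm(r)        # b(i)
        alphas.append(alpha)
        betas.append(beta)
        v_prev = v
    # T has a(i) on the diagonal, b(1)..b(k-1) on the off-diagonals
    return np.diag(alphas) + np.diag(betas[:-1], 1) + np.diag(betas[:-1], -1)
```

With k = n, T has (in exact arithmetic) the same spectrum as A; with k much smaller, the extreme eigenvalues are the ones that have converged.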
Solution: Multilevel method
Although the approximation is quick and dirty, when you're only concerned with the sign (or the median, or ...), a rough approximation is okay
The number of connected components of G equals the number of eigenvalues λi = 0. In particular, λ2 > 0 iff G is connected
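A quick numerical check on a graph with two components (a triangle plus a disjoint edge, my own small example): the Laplacian spectrum contains exactly two zeros, so λ2 = 0 signals the disconnection:

```python
import numpy as np

# Two components: triangle {0,1,2} and edge {3,4}
edges = [(0, 1), (1, 2), (0, 2), (3, 4)]
n = 5
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1
L = np.diag(A.sum(axis=1)) - A

vals = np.linalg.eigvalsh(L)
n_components = int(np.sum(np.abs(vals) < 1e-9))   # count zero eigenvalues
```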
Eigenvalues of L(G) are nonnegative, in particular: 0 = λ1 ≤ λ2 ≤ ... ≤ λn
Fiedler value = λ2 = "algebraic connectivity"
Let S be a subgraph of G, i.e. with the same nodes and a subset of the edges, so that S is "less connected" than G; then λ2(S) ≤ λ2(G)
The number of spanning trees of a graph G is given by: (1/n) λ2 λ3 ... λn (the Matrix-Tree theorem)
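The Matrix-Tree theorem equivalently says the count is any cofactor of L. A small check on a 4-cycle with one chord (my own example), which has 8 spanning trees:

```python
import numpy as np

# 4-cycle 0-1-2-3-0 plus the chord (0, 2)
edges = [(0, 1), (1, 2), (2, 3), (0, 3), (0, 2)]
n = 4
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1
L = np.diag(A.sum(axis=1)) - A

# Delete row/column 0 and take the determinant (a cofactor of L)
trees = int(round(np.linalg.det(L[1:, 1:])))      # -> 8
```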
For a subset of the vertices S, let: ∂S = the set of edges with exactly one endpoint in S
Define the Cheeger constant as: h(G) = min over S with 0 < |S| ≤ n/2 of |∂S| / |S|
Then the Fiedler value is bounded by: h(G)^2 / (2 d_max) ≤ λ2 ≤ 2 h(G)
A graph is regular with degree r iff the all-ones vector is an eigenvector of A (with eigenvalue r)
Current Work in Image Processing
Solution: Formulate IP on graphs
Advantages:
Space variant vision possible
Processing on fewer components in same domain
Graph algorithms are fast
Goals:
Space variant applications
Choosing nodes based on content
Use graph theory algorithms to novel ends
Logonoid simulations, etc.
Why?
Current methods in ImgGraph:
Get/set - Basic OOP methods
Adjacency - Computes the adjacency matrix
Laplacian - Computes the Laplacian matrix
Impotimg - Imports an image centered at location fovea
Edgegraph - Computes the edge map using the 1st derivative
Makeweights - Computes edge weights
Neighborhood - Computes the neighbor list and distances to each neighbor
Removenode - Removes a list of nodes
Removeisolated - Finds and removes nodes of degree zero
Threshcut - Segmentation by intensity thresholding
Showstruct - Displays graph structure without image data
Showmesh - Displays the graph by interpolating across enclosed polygons
Showgraph - Displays the graph as a traditional stick-and-ball plot, where balls are colored to reflect the RGB values at each node
Findfaces - Generates a list of enclosed polygons that can be fed to patch in the Showmesh call
Given two parameters (A, B), I define the weight between nodes i and j as a function of d, the (Euclidean) distance between the nodes, and c, the RGB difference (L1 norm)
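The slides do not preserve the exact weight formula, so the function below is a hypothetical stand-in, not the author's definition: it simply combines the two terms in a decaying exponential, with (A, B) as the two parameters mentioned above.

```python
import numpy as np

def weight(pi, pj, ci, cj, A=1.0, B=1.0):
    """Hypothetical edge weight from distance d and RGB difference c.
    The (A, B) parameterization mirrors the slide; the exponential
    form itself is an assumption, not the author's exact formula."""
    d = np.linalg.norm(np.asarray(pi, float) - np.asarray(pj, float))  # Euclidean distance
    c = np.abs(np.asarray(ci, float) - np.asarray(cj, float)).sum()    # L1 RGB difference
    return float(np.exp(-(A * d + B * c)))
```

Whatever the exact form, the intent is the same: nearby, similarly colored pixels get large weights, and distant or dissimilar ones get weights near zero.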
Visualization is very important so that the results of IP algorithms may be visually assessed
Data structure supports three visualizations (Showstruct, Showmesh, Showgraph)