Random Walks

Ben Hescott

CS591a1

November 18, 2002

Random Walk Definition
  • Given an undirected, connected graph G(V, E) with |V| = n and |E| = m, a random “step” in G is a move from the current node u to a uniformly random neighbor v. A random walk is a sequence of these random steps starting from some initial node (a minimal simulation follows below).
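The following is a minimal Python sketch of this definition; the example graph, its adjacency-list encoding, and the function name are illustrative choices, not part of the slides.

    import random

    def random_walk(graph, start, steps):
        """Return the sequence of nodes visited by a walk of the given length."""
        path = [start]
        u = start
        for _ in range(steps):
            u = random.choice(graph[u])   # one random step: uniform neighbor
            path.append(u)
        return path

    # A small example graph as an adjacency list (node -> list of neighbors).
    G = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}
    print(random_walk(G, start=0, steps=10))
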
Points to note
  • The process is discrete
  • G is not necessarily planar
  • G is not necessarily complete (“fully connected”)
  • A walk can cross back over itself, revisiting nodes and edges
  • Staying in the same place can be counted as a move
Questions
  • How many steps to get from u to v?
  • How many steps to get back to the initial node?
  • How many steps to visit every node?
  • These questions are easy to answer if we consider a simple example
Regular Graphs
  • The expected number of steps to get from vertex u to vertex v in the complete graph Kn (the simplest regular graph) is n-1.
  • The expected number of steps to get back to the starting point is n for any regular graph, since the expected return time to node i is 2m/d(i) = nd/d = n.
  • The expected number of steps to visit every node of Kn is (n-1)(1 + 1/2 + … + 1/(n-1)) = Θ(n lg n); a Monte Carlo check of the first two values follows below.
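A quick Monte Carlo sanity check of the first two values on Kn; the sampling scheme and trial count are my own choices, not the slides'.

    import random

    def complete_step(u, n):
        # Uniform neighbor of u in the complete graph Kn.
        v = random.randrange(n - 1)
        return v if v < u else v + 1

    def averages(n, trials=5000):
        hit = ret = 0
        for _ in range(trials):
            u, t = 0, 0                     # hitting time from node 0 to node 1
            while u != 1:
                u, t = complete_step(u, n), t + 1
            hit += t
            u, t = complete_step(0, n), 1   # return time to node 0
            while u != 0:
                u, t = complete_step(u, n), t + 1
            ret += t
        return hit / trials, ret / trials

    print(averages(10))   # close to (9, 10), i.e. n-1 and n
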
Triangle Example
  • Consider the probability of being at each particular vertex at each step of the walk.
  • These probabilities can be collected into a vector qt = (Pr[at a], Pr[at b], Pr[at c]); for a walk started at a, q0 = (1, 0, 0), q1 = (0, 1/2, 1/2), q2 = (1/2, 1/4, 1/4), …
Transition Matrix
  • We can use a matrix to represent the transition probabilities: take the adjacency matrix A and the diagonal matrix D with entries 1/d(i), where d(i) is the degree of node i, and define M = DA.
  • For the triangle d(i) = 2 for every node, so
            ( 0    1/2  1/2 )
        M = ( 1/2  0    1/2 )
            ( 1/2  1/2  0   )
  • Note that for the triangle Pr[a to b] = Pr[b to a]; a short numeric check follows below.
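A short numpy sketch of M = DA for the triangle (the use of numpy is my choice; the slides name no library):

    import numpy as np

    A = np.array([[0, 1, 1],
                  [1, 0, 1],
                  [1, 1, 0]], dtype=float)   # adjacency matrix of the triangle
    D = np.diag(1.0 / A.sum(axis=1))         # diagonal entries 1/d(i)
    M = D @ A                                 # transition matrix
    print(M)                                  # rows are probability distributions
    assert np.allclose(M.sum(axis=1), 1.0)
    assert np.allclose(M, M.T)                # Pr[a to b] = Pr[b to a] here
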
Markov Chains - Generalized Random Walks
  • A Markov Chain is a stochastic process defined on a set of states together with a matrix P of transition probabilities, where Pij is the probability of moving from state i to state j.
  • The process is discrete, namely it is in exactly one state at each time step (0, 1, 2, …)
  • The next move does not depend on previous moves; formally, Pr[Xt+1 = j | X0, X1, …, Xt = i] = Pr[Xt+1 = j | Xt = i] = Pij
Markov Chain Definitions
  • Define the vector qt = (qt(1), qt(2), …, qt(n))
  • where the i-th entry is the probability that the chain is in state i at time t
  • Note: qt+1 = qt P, and hence qt = q0 P^t
  • Notice that we can then calculate everything given q0 and P, as in the sketch below.
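A sketch of this calculation for the triangle chain, evolving qt = q0 P^t (numpy again assumed):

    import numpy as np
    from numpy.linalg import matrix_power

    P = np.array([[0, .5, .5],
                  [.5, 0, .5],
                  [.5, .5, 0]])
    q0 = np.array([1.0, 0.0, 0.0])          # start at vertex a
    for t in range(6):
        print(t, q0 @ matrix_power(P, t))   # tends toward (1/3, 1/3, 1/3)
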
More Definitions
  • Consider the question “where am I after t steps?”: define the t-step transition probability Pij(t) = Pr[Xt = j | X0 = i]
  • Question: is this my first time at state j? Define rij(t) = Pr[first visit to j happens at exactly step t | X0 = i]
  • The probability of visiting state j at some time t > 0 when started at state i is fij = Σt>0 rij(t)
  • The expected number of steps to get from state i to j is hij = Σt>0 t·rij(t), given fij = 1; otherwise hij = ∞
Even More Definitions
  • Consider fii, the probability that the chain started at i ever returns to i
  • State i is called transient if fii < 1
  • State i is called persistent if fii = 1
  • If state i is persistent and hii is infinite then i is null-persistent
  • If state i is persistent and hii is not infinite then i is non-null-persistent
  • It turns out that in a finite Markov Chain every state is either transient or non-null-persistent
Almost there
  • A strong component of a directed graph G is a maximal subgraph C of G such that for every pair of nodes i and j in C there is a directed path from i to j and from j to i.
  • A Markov Chain is irreducible if its underlying graph consists of a single strong component.
  • A stationary distribution for a Markov Chain with transition matrix P is a distribution π s.t. πP = π
  • The periodicity of state i is the maximum integer T for which there exist a q0 and an a > 0 s.t. for all t, if qt(i) > 0 then t belongs to the arithmetic progression {a + Tk : k ≥ 0}. A state is periodic if T > 1 and aperiodic otherwise
  • An ergodic Markov Chain is one in which all states are aperiodic and non-null-persistent
Fundamental Theorem of Markov Chains
  • Given any irreducible, finite, aperiodic Markov Chain, all of the following hold
        • The chain is ergodic
        • There is a unique stationary distribution π, with πi > 0 for all i
        • fii = 1 and hii = 1/πi for all i
        • Given N(i,t), the number of times the chain visits state i in t steps, then limt→∞ N(i,t)/t = πi (verified numerically in the sketch below)
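A sketch that finds π numerically as a left eigenvector of P and compares it with the visit frequencies N(i,t)/t from a long simulated run (the chain is the triangle again; numpy assumed):

    import numpy as np

    P = np.array([[0, .5, .5], [.5, 0, .5], [.5, .5, 0]])
    vals, vecs = np.linalg.eig(P.T)              # left eigenvectors of P
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    pi = pi / pi.sum()                            # normalize to a distribution
    print(pi)                                     # (1/3, 1/3, 1/3)

    rng = np.random.default_rng(0)
    t, counts, state = 200_000, np.zeros(3), 0
    for _ in range(t):
        state = rng.choice(3, p=P[state])
        counts[state] += 1
    print(counts / t)                             # N(i,t)/t, close to pi
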
Random Walk is a Markov Chain
  • Consider G, a connected, non-bipartite, undirected graph with |V| = n and |E| = m. There is a corresponding Markov Chain MG
  • Take the states of the chain to be the set of vertices
  • Define the transition matrix by Puv = 1/d(u) if (u,v) ∈ E and Puv = 0 otherwise
Interesting facts
  • MG is irreducible since G is connected and undirected
  • Notice the periodicity is the gcd of the lengths of all closed walks in G
    • The smallest closed walk has length 2: go one step and come back
    • Since G is non-bipartite there is an odd-length cycle
    • The gcd is then 1, so MG is aperiodic
Fundamental Theorem Holds
  • So we have a stationary distribution π with πP = π
  • But what is it? For a random walk πv = d(v)/2m, since (πP)v = Σu:(u,v)∈E (d(u)/2m)·(1/d(u)) = d(v)/2m (checked numerically below)
  • Good news: this gives the expected return time hvv = 1/πv = 2m/d(v)
  • Also, for a regular graph π is the uniform distribution
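A sketch checking πv = d(v)/2m on a small non-regular, non-bipartite graph (a triangle with a pendant vertex; the example graph is mine):

    import numpy as np

    G = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}
    n = len(G)
    m = sum(len(nbrs) for nbrs in G.values()) // 2
    P = np.zeros((n, n))
    for u, nbrs in G.items():
        for v in nbrs:
            P[u, v] = 1.0 / len(nbrs)

    pi = np.array([len(G[v]) / (2 * m) for v in range(n)])
    print(np.allclose(pi @ P, pi))   # True: d(v)/2m is stationary
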
Hitting Time
  • In general, hij, the expected number of steps needed before reaching node j when starting from node i, is the hitting time.
  • The commute time Cij is the expected number of steps to reach node j when starting from i and then return back to i.
  • For an edge (i,j) ∈ E the commute time is bounded by 2m.
  • We can express commute time in terms of hitting times as Cij = hij + hji; hitting times themselves can be computed by solving a linear system, as in the sketch below.
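Hitting times satisfy hjj = 0 and hij = 1 + Σk Pik·hkj for i ≠ j, which is a linear system; a small solver sketch (numpy assumed, function name mine):

    import numpy as np

    def hitting_times(P, j):
        """Return the vector of hitting times h(., j) for transition matrix P."""
        n = len(P)
        idx = [i for i in range(n) if i != j]         # drop the target state
        A = np.eye(n - 1) - P[np.ix_(idx, idx)]
        h = np.linalg.solve(A, np.ones(n - 1))        # solve (I - Q) h = 1
        full = np.zeros(n)
        full[idx] = h
        return full

    P = np.array([[0, .5, .5], [.5, 0, .5], [.5, .5, 0]])
    print(hitting_times(P, 2))   # triangle: h(a,c) = h(b,c) = 2
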
Lollipop
  • The hitting time from i to j is not necessarily the same as the hitting time from j to i. Consider the kite or lollipop graph: a clique on n/2 nodes attached to a path of n/2 nodes.
  • Here, with u in the clique and v at the far end of the path, huv = Θ(n³) while hvu = Θ(n²); the sketch below exhibits the gap numerically.
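A sketch that builds a small lollipop graph and solves for both hitting times (the construction and the size n = 16 are illustrative choices):

    import numpy as np

    def lollipop(n):
        k = n // 2
        adj = {i: set() for i in range(n)}
        for i in range(k):
            for j in range(i + 1, k):
                adj[i].add(j); adj[j].add(i)          # clique on nodes 0..k-1
        for i in range(k - 1, n - 1):
            adj[i].add(i + 1); adj[i + 1].add(i)      # path hanging off node k-1
        P = np.zeros((n, n))
        for u, nbrs in adj.items():
            for v in nbrs:
                P[u, v] = 1 / len(nbrs)
        return P

    def hit(P, i, j):
        n = len(P)
        idx = [x for x in range(n) if x != j]
        h = np.linalg.solve(np.eye(n - 1) - P[np.ix_(idx, idx)], np.ones(n - 1))
        return h[idx.index(i)]

    P = lollipop(16)
    print(hit(P, 0, 15), hit(P, 15, 0))   # clique-to-tail is far larger
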
Cover Time
  • How long until we visit every node?
  • The expected number of steps to reach every node in a graph G starting from node i is called the cover time, Ci(G)
  • Consider the maximum cover time over all starting nodes, C(G) = maxi Ci(G)
  • [Matthews] The cover time of any graph with n nodes is at most (1+1/2+…+1/n) times the maximum hitting time between any two nodes; there is also the general bound C(G) ≤ 2m(n-1)
Coupon Collector
  • You want to collect n different coupons, and every day Stop and Shop sends one coupon chosen at random - how long do you have to wait before you can buy food?
  • The cover time of the complete graph is essentially this process: each step collects a vertex chosen at random, so the cover time is O(n lg n) (simulated below).
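A simulation sketch comparing the empirical cover time of Kn with the coupon-collector estimate (n-1)(1+1/2+…+1/(n-1)); the parameters are my own:

    import random

    def cover_complete(n):
        seen, u, t = {0}, 0, 0
        while len(seen) < n:
            v = random.randrange(n - 1)
            u = v if v < u else v + 1        # uniform step in Kn
            seen.add(u); t += 1
        return t

    n, trials = 50, 2000
    avg = sum(cover_complete(n) for _ in range(trials)) / trials
    est = (n - 1) * sum(1 / k for k in range(1, n))
    print(avg, est)                           # the two agree closely
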
Mixing Rate
  • Going back to probability, we can ask how quickly we converge to the stationary (limiting) distribution. We call this rate the mixing rate of the random walk.
  • We saw that qt = q0 P^t → π
  • How fast does qt approach π?
Mixing with the Eigenvalues
  • How do we calculate the mixing rate? Yes, eigenvalues!
  • Since we are representing the probability transitions as a matrix, why not use spectral techniques?
  • Need graph G to be non-bipartite
  • First we need P to be symmetric, which is not true unless G is regular
More decomposition
  • Need to make P symmetric.
  • Recall that P = DA, where D is the diagonal matrix with entries 1/d(u), where d(u) is the degree of node u
  • Consider N = D^{1/2} A D^{1/2}, i.e. Nuv = Auv/√(d(u)·d(v)); N is symmetric
  • Claim: this lets us work with spectral techniques, since N = D^{-1/2} P D^{1/2} is similar to P and so has the same eigenvalues 1 = λ1 ≥ λ2 ≥ … ≥ λn, and P^t = D^{1/2} N^t D^{-1/2}
  • Now the mixing rate is λ = max(|λ2|, |λn|), the second largest eigenvalue in absolute value: qt approaches π roughly like λ^t (see the sketch below)
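A sketch computing the mixing rate as the second-largest |eigenvalue| of N, on the small pendant-triangle graph used earlier (numpy assumed):

    import numpy as np

    G = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}
    n = len(G)
    A = np.zeros((n, n))
    for u, nbrs in G.items():
        for v in nbrs:
            A[u, v] = 1.0
    d = A.sum(axis=1)
    Dh = np.diag(1.0 / np.sqrt(d))            # D^{1/2} in the slides' notation
    N = Dh @ A @ Dh                            # symmetric, same spectrum as P
    lam = np.sort(np.abs(np.linalg.eigvalsh(N)))[::-1]
    print(lam[0], lam[1])                      # 1.0, then the mixing rate
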
Graph Connectivity
  • Recall the graph connectivity problem: given an undirected graph G(V,E), we want to find out if node s is connected to node t.
  • Can do this in deterministic polynomial time.
  • But what about space? We would like to do this in a small amount of space.
  • Recall the maximum hitting time of a graph is at most n³
  • Try a walk of 2n³ steps: if s and t are connected it reaches t with probability at least 1/2 (Markov's inequality), and it needs only O(lg n) space to count the number of steps. A sketch follows below.
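A sketch of this randomized connectivity test; the answer has one-sided error (a “connected” answer is always right, a “not connected” answer is wrong with probability at most 1/2, so repeat to amplify):

    import random

    def st_connected(graph, s, t):
        n = len(graph)
        u = s
        for _ in range(2 * n ** 3):            # only the step counter needs O(lg n) bits
            if u == t:
                return True
            u = random.choice(graph[u])
        return u == t

    G = {0: [1], 1: [0, 2], 2: [1], 3: [4], 4: [3]}      # two components
    print(st_connected(G, 0, 2), st_connected(G, 0, 3))  # True, False
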
Sampling
  • So what’s the fuss?
  • We can use random walks to sample: we have the powerful notion of the stationary distribution, and on a regular graph it is the uniform distribution, so we can get at random elements.
  • More importantly we can get a random sample from an exponentially large set.
Covering Problems
  • Jerrum, Valiant, and Vazirani - the Babai product estimator, or enumeration via self-reducibility
  • Given a set V and subsets V1, V2, V3, …, Vm
    • For all i, |Vi| is polynomial-time computable
    • Can sample from each Vi uniformly at random
    • For all v in V, can determine efficiently whether v is in Vi
  • Can we get the size of the union of the subsets, or can we enumerate V, in polynomial time?
Permanent
  • Want to count the number of perfect matchings in a bipartite graph.
  • This is the permanent of the graph's 0/1 adjacency (biadjacency) matrix
  • Given an n x n matrix A, the permanent is per(A) = Σσ Πi Ai,σ(i), the sum ranging over all permutations σ of {1, …, n} (a brute-force sketch follows below)
  • Computing the permanent is #P-complete
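A brute-force sketch of the definition (exponential time, fine only for tiny n); for a bipartite graph's 0/1 matrix this counts its perfect matchings:

    from itertools import permutations

    def permanent(A):
        n = len(A)
        total = 0
        for sigma in permutations(range(n)):   # sum over all permutations
            prod = 1
            for i in range(n):
                prod *= A[i][sigma[i]]
            total += prod
        return total

    A = [[1, 1, 0],
         [1, 1, 1],
         [0, 1, 1]]
    print(permanent(A))   # 3 perfect matchings in this bipartite graph
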
Commercial Break
  • #P is the coolest class, defined as a counting class.
  • Instead of asking whether a solution exists, consider the number of solutions to the problem
  • Hard - [Toda] PH is contained in P^#P
  • Want to show hyp-PH is also contained in #P
How to use random walk for permanent approximation
  • Given a graph G with d(u) > n/2, we want to generate a random perfect matching
  • First notice that the input graph is bipartite.
  • Instead consider a graph built from the matchings of G
  • Let each node be a perfect matching in G - the problem is how to connect them
  • We also need to consider near-perfect matchings, matchings with n/2-1 edges
Permanent Cont.
  • Connect two near-perfect matchings with an edge if they have n/2-2 edges in common, and connect each perfect matching to all of the near-perfect matchings contained in it; this creates a graph H.
  • Notice the degree of H is bounded by 3n
  • Walk (randomly) a polynomial number of steps in H - if the final node is a perfect matching, good - otherwise try again.
Volume
  • Want to be able to calculate the volume of a convex body in n dimensions
  • Computing the volume of a convex polytope exactly is #P-Hard
  • No fear, randomization is here
  • Want to fit this problem into the enumeration framework above
Volume Cont.
  • Given C, a convex body in n dimensions, assume that C contains the origin
  • Further assume that C contains the unit ball and is itself contained in a ball of radius r ≤ n^{3/2}.
  • Define Ci as the intersection of C with the ball around the origin of radius 2^{i/n}
Volume Cont.
  • Then we get the telescoping product Vol(C) = Vol(C0) · Πi Vol(Ci+1)/Vol(Ci), where C0 is the unit ball and the last Ci is C itself
  • And we know Vol(C0), the volume of the unit ball
  • Now we only need to be able to get an element of each Ci uniformly at random - use a walk
  • It is difficult to walk on Ci directly, so create a grid and walk on the grid cells that intersect Ci
  • The stationary distribution is not uniform but a distribution with density function proportional to the local conductance
Partial Reference List
  • L. Lovász. Random Walks on Graphs: A Survey. In Combinatorics, Paul Erdős is Eighty, Vol. 2, pp. 1-46, 1993.
  • R. Motwani, P. Raghavan. Randomized Algorithms. Cambridge University Press, 1995.