
6.896: Probability and Computation


Presentation Transcript


  1. 6.896: Probability and Computation, Spring 2011, Lecture 3. Constantinos (Costis) Daskalakis, costis@mit.edu

  2. recap

  3. Markov Chains Def: A Markov Chain on Ω is a stochastic process (X_0, X_1, …, X_t, …) such that Pr[X_{t+1} = y | X_t = x, X_{t−1}, …, X_0] = Pr[X_{t+1} = y | X_t = x] =: P(x, y) (the Markov property). Writing p_x^t for the distribution of X_t conditioned on X_0 = x, then a. p_x^{t+1} = p_x^t P; b. p_x^t = e_x P^t, where e_x is the point distribution at x.
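
To make the evolution p_x^{t+1} = p_x^t P concrete, here is a minimal sketch (not from the lecture; the 3-state matrix is an arbitrary example) that starts the chain at a fixed state and repeatedly multiplies the distribution vector by P:

```python
# Minimal sketch (not from the lecture): evolving the distribution of a
# 3-state Markov chain by repeated right-multiplication with P.
import numpy as np

# Arbitrary example transition matrix: row x holds P(x, .), each row sums to 1.
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])

p = np.array([1.0, 0.0, 0.0])   # p_x^0: chain started at state x = 0
for t in range(100):
    p = p @ P                   # p_x^{t+1} = p_x^t P
print(p)                        # ~ [0.25, 0.5, 0.25]: the vector stops changing
```

That the vector eventually stops changing is exactly the convergence p_x^t → π stated in the fundamental theorem below.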

  4. Graphical Representation Represent a Markov chain by a graph G(P): - nodes are identified with elements of the state space Ω; - there is a directed edge between states x and y if P(x, y) > 0, with edge weight P(x, y) (no edge if P(x, y) = 0); - self-loops are allowed (when P(x, x) > 0). Much of the theory of Markov chains only depends on the topology of G(P), rather than its edge weights, e.g. irreducibility (from every state there is a directed path in G(P) to every other state) and aperiodicity (for every state x, the gcd of the lengths of the directed cycles through x is 1).
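
Since both properties depend only on which entries of P are nonzero, they can be checked from the 0/1 adjacency matrix of G(P). A small sketch (helper names are ours; intended for a dense numpy matrix on a small state space):

```python
# Sketch (our own helper names): irreducibility and aperiodicity depend only
# on the zero pattern of P, i.e. on the edges of G(P).
import numpy as np

def is_irreducible(P):
    """G(P) is strongly connected iff (I + A)^(n-1) has no zero entry,
    where A is the 0/1 adjacency matrix of G(P)."""
    n = P.shape[0]
    A = (P > 0).astype(float)
    M = np.linalg.matrix_power(np.eye(n) + A, n - 1)
    return bool((M > 0).all())

def is_irreducible_and_aperiodic(P):
    """A is primitive (irreducible and aperiodic) iff A^((n-1)^2 + 1) has no
    zero entry (Wielandt's bound)."""
    n = P.shape[0]
    A = (P > 0).astype(float)
    M = np.linalg.matrix_power(A, (n - 1) ** 2 + 1)
    return bool((M > 0).all())
```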

  5. Mixing of Reversible Markov Chains Theorem (Fundamental Theorem of Markov Chains): If a Markov chain P is irreducible and aperiodic, then: a. it has a unique stationary distribution π = π P; b. p_x^t → π as t → ∞, for all x ∈ Ω.
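
For a small chain both parts of the theorem can be illustrated numerically: solve π = π P together with Σ_x π(x) = 1, then compare π with a row of a high power of P. A sketch, reusing the 3-state example matrix from above:

```python
# Sketch: compute pi from pi = pi P and sum(pi) = 1, then compare with a row
# of a high power of P (part b of the theorem).
import numpy as np

def stationary_distribution(P):
    # Stack the equations (P^T - I) pi = 0 and sum(pi) = 1, solve by least squares.
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones((1, n))])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])
print(stationary_distribution(P))            # ~ [0.25, 0.5, 0.25]
print(np.linalg.matrix_power(P, 100)[0])     # row of P^100: every row -> pi
```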

  6. examples

  7. Example 1 Random Walk on an undirected Graph G = (V, E): P(x, y) = 1/deg(x) if {x, y} ∈ E, and P(x, y) = 0 otherwise. P is irreducible iff G is connected; P is aperiodic iff G is non-bipartite; P is reversible with respect to π(x) = deg(x)/(2|E|).
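
A quick sanity check of Example 1 (the graph below is an arbitrary choice, not from the lecture): build P for the random walk on a small undirected graph and verify the detailed-balance condition π(x)P(x, y) = π(y)P(y, x) with π(x) = deg(x)/(2|E|).

```python
# Sketch (arbitrary example graph): random walk on an undirected graph and a
# check of detailed balance with pi(x) = deg(x) / (2|E|).
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (1, 3)]     # hypothetical 4-vertex graph
n = 4
A = np.zeros((n, n))
for x, y in edges:
    A[x, y] = A[y, x] = 1.0

deg = A.sum(axis=1)
P = A / deg[:, None]                 # P(x, y) = 1/deg(x) if {x, y} in E, else 0
pi = deg / (2 * len(edges))          # pi(x) = deg(x) / (2|E|)

F = pi[:, None] * P                  # F[x, y] = pi(x) P(x, y)
print(np.allclose(F, F.T))           # True: pi(x) P(x, y) = pi(y) P(y, x)
```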

  8. Example 2 Card Shuffling Showed convergence to the uniform distribution using the concept of doubly stochastic matrices.
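
As a small illustration of the doubly-stochastic argument (the shuffle chosen here, random transpositions on 3 cards, is our own example): every column of the shuffle's transition matrix sums to 1, so the uniform distribution over orderings is stationary.

```python
# Assumed example (not from the lecture): the random-transposition shuffle on
# 3 cards. Its transition matrix is doubly stochastic, so the uniform
# distribution over all 3! orderings is stationary.
import itertools
import numpy as np

states = list(itertools.permutations(range(3)))       # all 3! = 6 orderings
index = {s: i for i, s in enumerate(states)}

moves = [(i, j) for i in range(3) for j in range(3)]  # two uniform positions (i = j allowed)
P = np.zeros((6, 6))
for s in states:
    for i, j in moves:
        t = list(s)
        t[i], t[j] = t[j], t[i]
        P[index[s], index[tuple(t)]] += 1.0 / len(moves)

print(np.allclose(P.sum(axis=0), 1.0))   # columns also sum to 1: doubly stochastic
u = np.full(6, 1.0 / 6)
print(np.allclose(u @ P, u))             # the uniform distribution is stationary
```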

  9. Example 3 Designing Markov Chains Input: a positive weight function w : Ω → R+. Goal: define a MC (X_t)_t such that p_x^t → π, where π(x) = w(x)/Z and Z := Σ_{x∈Ω} w(x). The Metropolis Approach: - define a graph Γ(Ω, E) on Ω; - define a proposal distribution κ(x, ·) s.t. κ(x, y) = κ(y, x) > 0 for all (x, y) ∈ E, x ≠ y, and κ(x, y) = 0 if (x, y) ∉ E.

  10. Example 3 The Metropolis Approach: - define a graph Γ(Ω, E) on Ω; - define a proposal distribution κ(x, ·) s.t. κ(x, y) = κ(y, x) > 0 for all (x, y) ∈ E, x ≠ y, and κ(x, y) = 0 if (x, y) ∉ E. The Metropolis chain: when the chain is at state x, - pick a neighbor y with probability κ(x, y); - accept the move to y with probability min{1, w(y)/w(x)} (otherwise stay at x).

  11. Example 3 The Metropolis chain: when the process is at state x, - pick a neighbor y with probability κ(x, y); - accept the move to y with probability min{1, w(y)/w(x)} (otherwise stay at x). Transition Matrix: P(x, y) = κ(x, y) · min{1, w(y)/w(x)} for y ≠ x, and P(x, x) = 1 − Σ_{y≠x} P(x, y). Claim: The Metropolis Markov chain is reversible with respect to π(x) = w(x)/Z. Proof: 1-point exercise.
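
A minimal simulation sketch of the Metropolis chain just described (the 4-cycle graph, the weight function w(x) = x + 1, and all names are illustrative assumptions, not from the lecture): the acceptance rule min{1, w(y)/w(x)} makes the empirical frequency of each state approach π(x) = w(x)/Z.

```python
# Minimal simulation sketch of the Metropolis chain (graph, weight function and
# names are illustrative assumptions): Omega = {0,1,2,3} on a 4-cycle, w(x) = x+1.
import random

neighbors = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
w = {x: x + 1.0 for x in neighbors}

def metropolis_step(x):
    # Proposal kappa(x, .): a uniformly random neighbor. All degrees are equal
    # here, so kappa(x, y) = kappa(y, x) as required.
    y = random.choice(neighbors[x])
    # Accept the move with probability min(1, w(y)/w(x)); otherwise stay at x.
    if random.random() < min(1.0, w[y] / w[x]):
        return y
    return x

steps = 200_000
counts = {x: 0 for x in neighbors}
x = 0
for _ in range(steps):
    x = metropolis_step(x)
    counts[x] += 1

Z = sum(w.values())
for s in sorted(neighbors):
    print(s, round(counts[s] / steps, 3), w[s] / Z)   # empirical frequency vs pi(s) = w(s)/Z
```

Picking a uniformly random neighbor is only a valid proposal here because every vertex of the 4-cycle has the same degree; in general κ must be symmetric as stated on the slide.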

  12. back to the fundamental theorem

  13. Mixing of Reversible Markov Chains Theorem (Fundamental Theorem of Markov Chains): If a Markov chain P is irreducible and aperiodic, then: a. it has a unique stationary distribution π = π P; b. p_x^t → π as t → ∞, for all x ∈ Ω. Proof: see notes.
