
11. Markov Chains (MCs) 2


Presentation Transcript


  1. 11. Markov Chains (MCs) 2 Courtesy of J. Bard, L. Page, and J. Heyl

  2. 11.2.1 n-step transition probabilities (review)

  3. Transition prob. matrix • n-step transition prob. from state i to j is p_ij(n) = P[X_{m+n} = j | X_m = i] • n-step transition matrix (for all states) is then P(n) = {p_ij(n)} • For instance, the two-step transition matrix is P(2) = {Σ_k p_ik p_kj} = P·P = P^2
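A quick numerical check of the matrix-power formula (a minimal sketch in NumPy; the matrix P is the silence/speech chain from Ex 11.10 later in this deck):

import numpy as np

# One-step transition matrix (each row is a pmf and sums to 1).
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# n-step transition matrix P(n) = P^n.
P2 = np.linalg.matrix_power(P, 2)
print(P2)  # [[0.83 0.17]
           #  [0.34 0.66]]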

  4. Chapman-Kolmogorov equations • Prob. of going from state i at t=0, passing through state k at t=m, and ending at state j at t=m+n is p_ik(m) p_kj(n); summing over all intermediate states k gives p_ij(m+n) = Σ_k p_ik(m) p_kj(n) • In matrix notation, P(m+n) = P(m)P(n), and hence P(n) = P^n
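The matrix form is easy to verify numerically (sketch, same P as above):

import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

m, n = 3, 5
lhs = np.linalg.matrix_power(P, m + n)                             # P(m+n)
rhs = np.linalg.matrix_power(P, m) @ np.linalg.matrix_power(P, n)  # P(m)P(n)
print(np.allclose(lhs, rhs))  # True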

  5. 11.2.2 state probabilities

  6. State probability (pmf of an RV!) • Let p(n) = {p_j(n)}, for all j ∈ E, be the row vector of state probs. at time n (i.e., the state prob. vector) • Thus, p(n) is given by p_j(n) = Σ_i p_i(n-1) p_ij • From the initial state, p_j(n) = Σ_i p_i(0) p_ij(n) • In matrix notation, p(n) = p(n-1)P = p(0)P^n
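As a sketch, the recursion p(n) = p(n-1)P is just a row vector times a matrix at each step (NumPy again):

import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
p = np.array([1.0, 0.0])  # p(0): start in state 0 for certain

for n in range(1, 3):
    p = p @ P             # p(n) = p(n-1) P
    print(n, p)           # 1 [0.9 0.1]
                          # 2 [0.83 0.17]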

  7. How an MC changes (Ex 11.10, 11.11) • A two-state system: Silence (state 0) and Speech (state 1), with transition matrix
     P = | 0.9  0.1 |
         | 0.2  0.8 |
  • Iterating p(n) = p(0)P^n from the two possible certain starts:
     n     Suppose p(0) = (1, 0)    Suppose p(0) = (0, 1)
     1     (0.9, 0.1)               (0.2, 0.8)
     2     (0.83, 0.17)             (0.34, 0.66)
     4     (0.747, 0.253)           (0.507, 0.493)
     8     (0.686, 0.314)           (0.629, 0.371)
     16    (0.668, 0.332)           (0.665, 0.335)
     32    (0.667, 0.333)           (0.667, 0.333)
     64    (0.667, 0.333)           (0.667, 0.333)
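The table can be reproduced with matrix powers (sketch):

import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

for n in [1, 2, 4, 8, 16, 32, 64]:
    Pn = np.linalg.matrix_power(P, n)
    # (1,0)P^n is the first row of P^n; (0,1)P^n is the second row.
    print(n, Pn[0], Pn[1])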

  8. Independence of initial condition

  9. The lesson to take away • No matter what assumptions you make about the initial probability distribution, • after a large number of steps, • the state probability distribution is approximately (2/3, 1/3). See pp. 666-667.

  10. 11.2.3 steady state probabilities

  11. State probabilities (pmf) converge • As n → ∞, the transition prob. matrix P^n approaches a matrix whose rows are all equal to the same pmf • In matrix notation, lim_{n→∞} P^n = 1π • where 1 is a column vector of all 1's, and π = (π_0, π_1, …) • The convergence of P^n implies the convergence of the state pmf's: p(n) = p(0)P^n → p(0)1π = π, since p(0)1 = 1
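Numerically, the rows of P^n do agree for large n (sketch):

import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

P64 = np.linalg.matrix_power(P, 64)
print(P64)                          # both rows approx (0.667, 0.333)
print(np.allclose(P64[0], P64[1]))  # True: every row converges to the same pmf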

  12. Steady state probability • The system reaches "equilibrium" or "steady state", • i.e., as n → ∞, p_j(n) → π_j and p_i(n-1) → π_i • Taking the limit on both sides of p(n) = p(n-1)P gives, in matrix notation, π = πP • Here π is the stationary state pmf of the Markov chain • To solve this, combine π = πP with the normalization condition Σ_j π_j = 1

  13. Speech activity system • For the two-state chain with
     P = | 0.9  0.1 |
         | 0.2  0.8 |
  solve π = πP • From the steady state probabilities (π_1, π_2) = (π_1, π_2)P:
     π_1 = 0.9π_1 + 0.2π_2
     π_2 = 0.1π_1 + 0.8π_2
     π_1 + π_2 = 1
  which gives π_1 = 2/3 ≈ 0.667 and π_2 = 1/3 ≈ 0.333.
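The same computation in code (a minimal sketch: π = πP alone is a singular linear system, so one equation is swapped for the normalization Σ_j π_j = 1):

import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# pi P = pi  <=>  (P^T - I) pi^T = 0; replace the last equation with sum(pi) = 1.
A = P.T - np.eye(2)
A[-1, :] = 1.0
b = np.array([0.0, 1.0])

pi = np.linalg.solve(A, b)
print(pi)  # approx [0.6667 0.3333], i.e., (2/3, 1/3)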

  14. Question 11-1: Alice, Bob and Carol are playing Frisbee. Alice always throws to Carol. Bob always throws to Alice. Carol throws to Bob 2/3 of the time and to Alice 1/3 of the time. In the long run, what percentage of the time does each player have the Frisbee?
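To check your answer, the stationary-pmf computation from slide 13 carries over directly (sketch; the state ordering Alice, Bob, Carol is an assumed labeling):

import numpy as np

# States: 0 = Alice, 1 = Bob, 2 = Carol; rows encode the throwing rules.
P = np.array([[0.0, 0.0, 1.0],    # Alice always throws to Carol
              [1.0, 0.0, 0.0],    # Bob always throws to Alice
              [1/3, 2/3, 0.0]])   # Carol: Alice w.p. 1/3, Bob w.p. 2/3

A = P.T - np.eye(3)
A[-1, :] = 1.0                    # replace one equation with sum(pi) = 1
pi = np.linalg.solve(A, np.array([0.0, 0.0, 1.0]))
print(pi)                         # long-run fraction of time each player holds it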
