
STAT131 W12L1a Markov Chains



  1. STAT131 W12L1a Markov Chains by Anne Porter alp@uow.edu.au

  2. Lecture Outline
  • Naming conventions
  • Matrices
    • Definition
    • Multiplication
  • Probability
  • Markov Chains
    • Definition
    • Examples

  3. Naming Conventions
  Describe the location of the shaded square in the following diagram (rows m1, m2, m3; columns n1, n2, n3, n4), citing the row then the column.
  Answer: square m1n2, or simply m1n2

  4. Naming Conventions
  How else could we name the location of the square, keeping row and column order but not using m and n in the description?
  Answer: square12, or s12, or c12, or …

  5. Definition 1: Matrix
  • An m x n matrix is a rectangular array of elements, with m rows and n columns, written A = (aij), where aij is the element in row i and column j.

  6. Example: Matrix elements
  • An m x n matrix is a rectangular array of elements, with m rows and n columns.
  • Given a 3 x 2 matrix B (m = 3, n = 2): (a) What is b32? Answer: 1

  7. Example: Matrix elements
  • An m x n matrix is a rectangular array of elements, with m rows and n columns.
  • Given a 1 x 3 matrix B (m = 1, n = 3): (a) What is b13? Answer: 1

  8. Definition 2: Order of a matrix
  • An m x n matrix is said to be of order (or size) m x n.
  Example: Given matrices A and B: (a) What is the size of A? 2x3. (b) What is the size of B? 3x3.

  9. Matrix multiplication
  • Two matrices A and B can be multiplied together only if the number of columns of A is equal to the number of rows of B.
  • If A is m x n and B is n x p, then AB is m x p.
  • An example: A is of order 2x3 (2 rows x 3 columns) and B is of order 3x3 (3 rows x 3 columns). Hence these matrices can be multiplied, and the size of the new matrix C will be 2 x 3.

  10. Definition 3
  • If the (i,j)th elements of A and B are aij and bij respectively, then the (i,j)th element of C = AB is
  cij = ai1b1j + ai2b2j + … + ainbnj (the sum over k of aikbkj).

  11. Evaluating C = A x B, where
  A = [1 2 3]        B = [2 0 1]
      [2 3 1]            [1 1 1]
                         [3 1 2]
  C11 = a11b11 + a12b21 + a13b31 = 1x2 + 2x1 + 3x3 = 13

  12. Evaluating C = A x B
  C11 = a11b11 + a12b21 + a13b31 = 13
  C21 = a21b11 + a22b21 + a23b31 = 2x2 + 3x1 + 1x3 = 10

  13. Evaluating C = A x B
  C11 = 13, C21 = 10
  C12 = a11b12 + a12b22 + a13b32 = 1x0 + 2x1 + 3x1 = 5

  14. Evaluating C = A x B
  C11 = 13, C21 = 10, C12 = 5
  C22 = a21b12 + a22b22 + a23b32 = 2x0 + 3x1 + 1x1 = 4

  15. Evaluating C = A x B
  C11 = 13, C21 = 10, C12 = 5, C22 = 4
  C13 = a11b13 + a12b23 + a13b33 = 1x1 + 2x1 + 3x2 = 9

  16. Evaluating C = A x B
  C11 = 13, C21 = 10, C12 = 5, C22 = 4, C13 = 9
  C23 = a21b13 + a22b23 + a23b33 = 2x1 + 3x1 + 1x2 = 7

  17. Evaluating C = A x B
  Collecting the six entries:
  C = A x B = [13 5 9]
              [10 4 7]
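The worked example above can be checked in a few lines of plain Python; a minimal sketch, with A and B as reconstructed from the entry-by-entry arithmetic on slides 11-16 (the function name is mine):

```python
# Plain-Python matrix product: c[i][j] = sum over k of a[i][k] * b[k][j]
def matmul(a, b):
    assert len(a[0]) == len(b), "columns of A must equal rows of B"
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]

# A and B as reconstructed from the entry-by-entry arithmetic above
A = [[1, 2, 3],
     [2, 3, 1]]            # order 2x3
B = [[2, 0, 1],
     [1, 1, 1],
     [3, 1, 2]]            # order 3x3

print(matmul(A, B))        # [[13, 5, 9], [10, 4, 7]]
```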

  18. Multiply
  • A is of size 2x2 and B is of size 2x2. These can be multiplied; the new matrix has size 2x2.

  19. Multiply
  • A is of size 1x2 and B is of size 3x2. These cannot be multiplied: the number of columns of A is not equal to the number of rows of B.
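The compatibility rule on slides 18-19 can be sketched as a one-line check; the shapes are the ones from the slides, and the helper name is illustrative:

```python
def can_multiply(shape_a, shape_b):
    """A (m x n) and B (n x p) are compatible iff columns of A == rows of B."""
    return shape_a[1] == shape_b[0]

print(can_multiply((2, 2), (2, 2)))  # True  -> product exists, size 2x2 (slide 18)
print(can_multiply((1, 2), (3, 2)))  # False -> columns of A != rows of B (slide 19)
```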

  20. Probability Example: Weather (Source: Griffiths, additional notes)
  • Starting on a Wednesday, the weather has an initial probability of 0.8 of being 'fine' and 0.2 of 'rain'. If the weather is fine on any day then the conditional probability that it will be fine the next day is 0.7, whereas if it rains on one day the conditional probability of it being fine the next is 0.4.
  • (1) Represent this information in a tree diagram.
  • (2) Determine P(fine on Thursday).
  • (3) Determine P(rain on Thursday).

  21. Probability: Using Tree Diagrams
  Wednesday: P(fineW) = 0.8, P(rainW) = 0.2
  Thursday branches: P(fineT | fineW) = 0.7, P(rainT | fineW) = 0.3, P(fineT | rainW) = 0.4, P(rainT | rainW) = 0.6
  The four paths give P(fineW and fineT), P(fineW and rainT), P(rainW and fineT), P(rainW and rainT).

  22. Using the definition of conditional probability
  Given P(fineW) and P(fineT | fineW), how do we find P(fineW and fineT)?
  P(fineT | fineW) = P(fineW and fineT) / P(fineW)
  Hence P(fineW) x P(fineT | fineW) = P(fineW and fineT).

  23. Probability: Using Tree Diagrams
  Wednesday -> Thursday:
  • P(fineW) = 0.8, P(fineT | fineW) = 0.7: P(fineW and fineT) = 0.8x0.7 = 0.56
  • P(fineW) = 0.8, P(rainT | fineW) = 0.3: P(fineW and rainT) = 0.8x0.3 = 0.24
  • P(rainW) = 0.2, P(fineT | rainW) = 0.4: P(rainW and fineT) = 0.2x0.4 = 0.08
  • P(rainW) = 0.2, P(rainT | rainW) = 0.6: P(rainW and rainT) = 0.2x0.6 = 0.12

  24. Probability: Using Tree Diagrams
  From Thursday's joint probabilities: P(fineW and fineT) = 0.56, P(fineW and rainT) = 0.24, P(rainW and fineT) = 0.08, P(rainW and rainT) = 0.12.
  • What is the probability that it will rain on Thursday? 0.24 + 0.12 = 0.36
  • What is the probability that it will be fine on Thursday? 0.56 + 0.08 = 0.64
  • What is the probability it will rain or be fine on Thursday? 1.00

  25. Probability: Law of Total Probability
  • What is the probability that it will rain on Thursday?
  P(rainT) = P(fineW) x P(rainT | fineW) + P(rainW) x P(rainT | rainW) = 0.24 + 0.12 = 0.36
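The law-of-total-probability calculation above can be sketched in plain Python (the variable names are mine, the numbers are the slides'):

```python
# Wednesday's marginal probabilities and the Thursday conditionals from the slides
p_fine_wed, p_rain_wed = 0.8, 0.2
p_fine_given_fine, p_rain_given_fine = 0.7, 0.3
p_fine_given_rain, p_rain_given_rain = 0.4, 0.6

# Law of total probability: sum over Wednesday's states
p_rain_thu = p_fine_wed * p_rain_given_fine + p_rain_wed * p_rain_given_rain
p_fine_thu = p_fine_wed * p_fine_given_fine + p_rain_wed * p_fine_given_rain

print(round(p_fine_thu, 2), round(p_rain_thu, 2))  # 0.64 0.36
```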

  26. Markov Chains: Context
  • In contrast to coin tossing, which is a sequence of independent events, there are processes that change over time. Stochastic processes (also called random or chance processes) can often be modelled by a sequence of dependent experiments. Here we will consider one special case of experimental dependence.

  27. Definition: Markov Chain
  A Markov Chain (or Markov Process) exists if the following conditions are satisfied:
  • There is a finite number of 'states' of the experimental system, and the system is in exactly one of these states after each repetition of the experiment. The different states are denoted by E1, E2, …, En, where each repetition of the experiment has to result in one of these states.
  • The state of the process after a repetition of the experiment depends (probabilistically) on only the state of the process immediately after the previous experiment, but not on the states after earlier experiments. That is, the process has no memory of the past beyond the previous experiment.
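The "no memory beyond the previous experiment" condition can be sketched as a simulation in which the next state is sampled from probabilities that depend only on the current state. A minimal sketch using the weather numbers from the examples in this lecture (the function and dictionary are illustrative, not from the slides):

```python
import random

# One-step transition probabilities P(next | current), from the weather example:
# fine -> fine 0.7, rain -> fine 0.4
transitions = {
    "fine": {"fine": 0.7, "rain": 0.3},
    "rain": {"fine": 0.4, "rain": 0.6},
}

def step(state, rng):
    """Sample the next state given only the current one (the Markov property)."""
    return "fine" if rng.random() < transitions[state]["fine"] else "rain"

def simulate(start, n_days, seed=0):
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_days):
        path.append(step(path[-1], rng))
    return path

print(simulate("fine", 5))  # one possible weather path, e.g. fine/rain over 5 days
```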

  28. Example 1: Markov Chain Source: Griffiths Weather example (Additional notes) • Starting on a Wednesday the weather has an initial probability of 0.8 of being 'fine' and 0.2 of 'rain'. If the weather is fine on any day then the conditional probability that it will be fine the next day is 0.7, whereas if it rains on one day the conditional probability of it being fine the next is 0.4. (1) What are the states of this system? S= {fine, rain}

  29. Example 2: Markov Chain Rules for Snakes and Ladders • If you land on the bottom of the ladder, automatically go to the top • If you land on the snake head automatically slide down to its tail • You must land exactly on square 7 to finish; if your move would take you beyond square 7, then you cannot take the move, so you remain on the same square. • (1) What is the state space? S={0,1,3,5,7}

  30. To describe a Markov Chain: Two sets of probabilities must be known. • the initial probability vector and • the transition probability matrix

  31. Initial probability vector
  • The initial probability vector p0 describes the initial state (S) of the process:
  p0 = [P(initial S is E1), P(initial S is E2), …, P(initial S is En)]
  • If the initial state is known, the initial vector will have one of the probabilities equal to 1 and the rest equal to 0.

  32. Example 1: Markov Chain (Source: Griffiths weather example, additional notes)
  • Starting on a Wednesday the weather has an initial probability of 0.8 of being 'fine' and 0.2 of 'rain'. If the weather is fine on any day then the conditional probability that it will be fine the next day is 0.7, whereas if it rains on one day the conditional probability of it being fine the next is 0.4.
  (1) What is the initial probability vector to start?
       Fine Rain
      [0.8  0.2]

  33. Example 2: Markov Chain Rules • If you land on the bottom of the ladder, automatically go to the top • If you land on the snake head automatically slide down to its tail • You must land exactly on square 7 to finish; if your move would take you beyond square 7, then you cannot take the move, so you remain on the same square. • (2) If we start on 0 in snakes and ladders what is the initial vector? States 0 1 3 5 7 [1 0 0 0 0 ]

  34. Transition probability matrix
  • The (conditional) probability that the process moves from state i to state j is called a (one-step) transition probability, and is denoted by pij; that is, pij = P(Ej next | Ei before).
  • It is usual to display the values in an m (rows) x m (columns) matrix, that is, a square matrix.

  35. Transition probability matrix
  Rows index the state before (1, 2, …, m); columns index the state after (1, 2, …, m). The (i,j) entry is pij = P(Ej next | Ei before).

  36. Example 1: Markov Chain (Source: Griffiths weather example, additional notes)
  • Starting on a Wednesday the weather has an initial probability of 0.8 of being 'fine' and 0.2 of 'rain'. If the weather is fine on any day then the conditional probability that it will be fine the next day is 0.7, whereas if it rains on one day the conditional probability of it being fine the next is 0.4.
  (1) What is the transition matrix? Entries are P(end | start):
              End: Fine  Rain
  Start Fine     [ 0.7   0.3 ]
        Rain     [ 0.4   0.6 ]

  37. Example 2: Markov Chains Rules for Snakes and Ladders • If you land on the bottom of the ladder, automatically go to the top • If you land on the snake head automatically slide down to its tail • You must land exactly on square 7 to finish; if your move would take you beyond square 7, then you cannot take the move, so you remain on the same square. • (3) Represent the conditional probabilities of end states given the starting states.

  38. Example 2: Markov Chains • Transition Matrix

  39. Example 1: Markov Chain Source: Griffiths Weather example (Additional notes) • Starting on a Wednesday the weather has an initial probability of 0.8 of being 'fine' and 0.2 of 'rain'. If the weather is fine on any day then the conditional probability that it will be fine the next day is 0.7, whereas if it rains on one day the conditional probability of it being fine the next is 0.4. • This can be represented in Matrix notation (we previously did it as a tree diagram). To do this we use the Law of Total Probability.

  40. Probability: Using Tree Diagrams
  Wednesday -> Thursday:
  • P(fine) = 0.8, P(fine|fine) = 0.7: P(fine and fine) = 0.8x0.7 = 0.56
  • P(fine) = 0.8, P(rain|fine) = 0.3: P(fine and rain) = 0.8x0.3 = 0.24
  • P(rain) = 0.2, P(fine|rain) = 0.4: P(rain and fine) = 0.2x0.4 = 0.08
  • P(rain) = 0.2, P(rain|rain) = 0.6: P(rain and rain) = 0.2x0.6 = 0.12
  Totals: P(fineT) = 0.64, P(rainT) = 0.36

  41. Probability: Law of Total Probability
  • What is the probability that it will be fine on Thursday? Wet on Thursday?
  Represented in matrix form: PB = PA . PB|A, where
  PA = [P(A1) P(A2) … P(Am)] is the row vector of probabilities for the first stage,
  PB|A is the matrix of conditional probabilities with (i,j) entry P(Bj | Ai), and
  PB = [P(B) P(Not B)] is the row vector of probabilities for the second stage.

  42. Probability: Law of Total Probability
  • What is the probability that it will be fine on Thursday? Wet on Thursday?
  • Initial probability vector:
       Fine Rain
      [0.8  0.2]
  • Transition matrix (rows Start, columns End):
            Fine  Rain
      Fine [ 0.7   0.3 ]
      Rain [ 0.4   0.6 ]

  43. Probability: Law of Total Probability
  Represented in matrix form, PB = PA . PB|A:
  [P(fineT) P(rainT)] = [0.8 0.2] x [0.7 0.3]
                                    [0.4 0.6]
                      = [0.64 0.36]
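The one-step update above is just a row vector times a matrix; a minimal plain-Python sketch (the function name is mine):

```python
def vec_mat(p, P):
    """Row vector p times transition matrix P: next[j] = sum over i of p[i]*P[i][j]."""
    return [sum(p[i] * P[i][j] for i in range(len(p))) for j in range(len(P[0]))]

p_wed = [0.8, 0.2]                   # [P(fineW), P(rainW)]
P = [[0.7, 0.3],                     # rows: start state (fine, rain)
     [0.4, 0.6]]                     # columns: end state (fine, rain)

p_thu = vec_mat(p_wed, P)
print([round(x, 2) for x in p_thu])  # [0.64, 0.36]
```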

  44. Now predict P(fine) and P(rain) on Friday
  • What was the probability of fine and rain on Thursday? [0.64 0.36]
  • What is the initial probability vector starting on Thursday? [0.64 0.36]
  • What else do we need? The transition matrix.
  So [P(fineF) P(rainF)] = [0.64 0.36] x [0.7 0.3]
                                         [0.4 0.6]

  45. Now predict P(fine) and P(rain) on Friday
  [P(fineF) P(rainF)] = [0.64 0.36] x [0.7 0.3]
                                      [0.4 0.6]
  The size of the resulting matrix will be 1x2, that is [P(fineF) P(rainF)]:
  [P(fineF) P(rainF)] = [0.64x0.7 + 0.36x0.4   …]

  46. Now predict P(fine) and P(rain) on Friday
  [P(fineF) P(rainF)] = [0.64 0.36] x [0.7 0.3]
                                      [0.4 0.6]
                      = [0.64x0.7 + 0.36x0.4   0.64x0.3 + 0.36x0.6]
                      = [0.592 0.408]
  The sum of these two values, P(fineF) and P(rainF), should equal 1.
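Iterating the same vector-matrix update gives the Friday prediction; a plain-Python sketch (the helper is illustrative):

```python
def step(p, P):
    """One-step update: row vector p times transition matrix P."""
    return [sum(p[i] * P[i][j] for i in range(len(p))) for j in range(len(P[0]))]

P = [[0.7, 0.3],
     [0.4, 0.6]]
p = [0.8, 0.2]                        # Wednesday's initial probability vector

for day in ("Thursday", "Friday"):
    p = step(p, P)
    print(day, [round(x, 3) for x in p])
# Thursday [0.64, 0.36]
# Friday [0.592, 0.408]
```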

  47. Probability: n-step transition
  Writing T for the transition matrix:
  [P(fineT) P(rainT)] = [0.8 0.2] x T = [0.64 0.36]
  [P(fineF) P(rainF)] = [0.64 0.36] x T = [0.8 0.2] x T^2

  48. Probability: Using Tree Diagrams
  Wednesday -> Thursday -> Friday: starting from P(fine) = 0.8 and P(rain) = 0.2, each day branches with the same conditionals (0.7 / 0.3 from fine, 0.4 / 0.6 from rain).
  And we would multiply through each branch, then add all the probabilities for fineF and then for rainF.

  49. Example 2: Markov Chain Rules for Snakes and Ladders • If you land on the bottom of the ladder, automatically go to the top • If you land on the snake head automatically slide down to its tail • You must land exactly on square 7 to finish; if your move would take you beyond square 7, then you cannot take the move, so you remain on the same square. • (1) What is the state space? S={0,1,3,5,7}

  50. Example 2: Markov Chain Rules • If you land on the bottom of the ladder, automatically go to the top • If you land on the snake head automatically slide down to its tail • You must land exactly on square 7 to finish; if your move would take you beyond square 7, then you cannot take the move, so you remain on the same square. • (2) If we start on 0 in snakes and ladders what is the initial vector? States 0 1 3 5 7 [1 0 0 0 0 ]
