Al-Imam Mohammad Ibn Saud University
CS433 Modeling and Simulation
Lecture 06 – Part 03: Discrete Markov Chains
Dr. Anis Koubâa
12 Apr 2009
Classification of States: 1
A path is a sequence of states, where each transition has a positive probability of occurring.
State j is reachable (or accessible) from state i (i → j) if there is a path from i to j; equivalently, Pij(n) > 0 for some n ≥ 0, i.e., the probability of going from i to j in n steps is greater than zero.
States i and j communicate (i ↔ j) if i is reachable from j and j is reachable from i. (Note: a state always communicates with itself.)
A set of states C is a communicating class if every pair of states in C communicates with each other, and no state in C communicates with any state not in C.
Classification of States: 1
A state i is said to be an absorbing state if pii = 1.
A subset S of the state space X is a closed set if no state outside of S is reachable from any state in S (like an absorbing state, but with multiple states); this means pij = 0 for every i ∈ S and j ∉ S.
A closed set S of states is irreducible if any state j ∈ S is reachable from every state i ∈ S.
A Markov chain is said to be irreducible if the state space X is irreducible.
Example
Irreducible Markov Chain: states 0, 1, 2, where every state is reachable from every other state (state-transition diagram with transitions p00, p01, p10, p12, p21, p22 omitted).
Reducible Markov Chain: states 0–4, where state 4 is an absorbing state and states 2 and 3 form a closed irreducible set (state-transition diagram with transitions p00, p01, p10, p12, p14, p23, p32, p33 omitted).
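To make the reachability and communication definitions concrete, here is a small sketch of my own (not from the lecture) that groups the states of a chain into communicating classes from its transition matrix. The matrix P_example and the helper names are assumptions chosen for illustration, loosely modeled on the reducible example above.

```python
import numpy as np

def reachable(P, i, j):
    """Check whether state j is reachable from state i (a path with positive probability)."""
    n = len(P)
    visited = {i}          # a state is reachable from itself in 0 steps
    stack = [i]
    while stack:
        s = stack.pop()
        for t in range(n):
            if P[s, t] > 0 and t not in visited:
                visited.add(t)
                stack.append(t)
    return j in visited

def communicating_classes(P):
    """Group states into communicating classes (i <-> j)."""
    n = len(P)
    classes, assigned = [], set()
    for i in range(n):
        if i in assigned:
            continue
        cls = {j for j in range(n) if reachable(P, i, j) and reachable(P, j, i)}
        classes.append(cls)
        assigned |= cls
    return classes

# Hypothetical reducible chain in the spirit of the lecture example:
# states 0, 1 transient, states 2, 3 a closed irreducible set, state 4 absorbing.
P_example = np.array([
    [0.2, 0.8, 0.0, 0.0, 0.0],
    [0.3, 0.0, 0.4, 0.0, 0.3],
    [0.0, 0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.5, 0.5, 0.0],
    [0.0, 0.0, 0.0, 0.0, 1.0],
])

print(communicating_classes(P_example))   # [{0, 1}, {2, 3}, {4}]
# A chain is irreducible exactly when there is a single communicating class
# containing all states; here there are several, so the chain is reducible.
```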
Classification of States: 2
A state i is a transient state if there exists a state j such that j is reachable from i but i is not reachable from j. A state that is not transient is recurrent. There are two types of recurrent states:
Positive recurrent: the expected time to return to the state is finite.
Null recurrent (less common): the expected time to return to the state is infinite (this requires an infinite number of states).
A state i is periodic with period k > 1 if k is the smallest number such that all paths leading from state i back to state i have a multiple of k transitions. A state is aperiodic if it has period k = 1.
A state is ergodic if it is positive recurrent and aperiodic.
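As a rough illustration of periodicity (my own sketch, not part of the lecture), the period of state i can be estimated as the gcd of the step counts n for which Pii(n) > 0. Checking n only up to a finite bound n_max is an assumption that is adequate for small chains; the example chain is hypothetical.

```python
import numpy as np
from math import gcd

def period(P, i, n_max=50):
    """Estimate the period of state i: gcd of all n <= n_max with P^n[i, i] > 0.
    Returns 0 if state i is never revisited within n_max steps."""
    d = 0
    Pn = np.eye(len(P))
    for n in range(1, n_max + 1):
        Pn = Pn @ P            # Pn now holds P^n
        if Pn[i, i] > 1e-12:
            d = gcd(d, n)
    return d

# Hypothetical 3-state chain: 0 -> 1 (prob 1), 1 -> 0 or 2 (prob 0.5 each), 2 -> 1 (prob 1).
P = np.array([[0.0, 1.0, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 1.0, 0.0]])
print(period(P, 0))   # 2: state 0 is periodic with period d = 2
```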
Classification of States: 2
Example from the book Introduction to Probability: Lecture Notes, D. Bertsekas and J. Tsitsiklis – Fall 200
Transient and Recurrent States
We define the hitting time Tij as the random variable that represents the time (number of transitions) to go from state i to state j, expressed as:
Tij = min{ k ≥ 1 : Xk = j given X0 = i }
where k is the number of transitions in a path from i to j; Tij is the minimum number of transitions in a path from i to j.
We define the recurrence time Tii as the first time that the Markov chain returns to state i, i.e., the time of the first visit back to i given X0 = i.
The probability that the first recurrence to state i occurs at the nth step is
fi(n) = Pr{ Tii = n }
The probability of recurrence to state i is
fi = Pr{ Tii < ∞ } = Σn≥1 fi(n)
Transient and Recurrent States
• The mean recurrence time is Mi = E[Tii] = Σn≥1 n fi(n)
• A state is recurrent if fi = 1
• If Mi < ∞ then it is said to be positive recurrent
• If Mi = ∞ then it is said to be null recurrent
• A state is transient if fi < 1
• If fi < 1, then 1 − fi is the probability of never returning to state i.
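To connect fi and Mi to something computable, here is a simulation sketch of my own (not from the lecture): it estimates the recurrence probability fi and the mean recurrence time Mi for a state of a small chain by sampling return times. The example chain P and the cut-off max_steps are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_recurrence(P, i, trials=20_000, max_steps=10_000):
    """Estimate fi = P(return to i) and Mi = E[return time | return] by simulation."""
    returns, total_time = 0, 0
    n = len(P)
    for _ in range(trials):
        state, steps = i, 0
        while steps < max_steps:
            state = rng.choice(n, p=P[state])   # one step of the chain
            steps += 1
            if state == i:
                returns += 1
                total_time += steps
                break
    f_i = returns / trials
    M_i = total_time / returns if returns else float("inf")
    return f_i, M_i

# Hypothetical 2-state chain; both states are positive recurrent.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
print(estimate_recurrence(P, 0))   # fi close to 1; Mi close to 1.2 (= 1/pi_0)
```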
Transient and Recurrent States
We define Ni as the number of visits to state i given X0 = i.
Theorem: If Ni is the number of visits to state i given X0 = i (counting the initial visit at time 0), then
E[Ni] = Σn≥0 Pii(n) = 1 / (1 − fi)
where Pii(n) is the transition probability from state i to state i after n steps.
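A quick numerical check of this theorem (my own sketch, using an assumed chain in which state 0 is transient): summing the diagonal entries of the matrix powers approximates E[Ni], which should match 1/(1 − fi).

```python
import numpy as np

def expected_visits(P, i, n_max=500):
    """Approximate E[Ni | X0 = i] = sum_{n>=0} P^n[i, i] by truncating at n_max."""
    total, Pn = 0.0, np.eye(len(P))
    for _ in range(n_max + 1):
        total += Pn[i, i]
        Pn = Pn @ P
    return total

# Hypothetical chain: state 0 is transient (it leaks into absorbing state 1),
# with fi = 0.5, so the expected number of visits should be 1/(1 - 0.5) = 2.
P = np.array([[0.5, 0.5],
              [0.0, 1.0]])
print(expected_visits(P, 0))   # approximately 2.0
```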
Transient and Recurrent States
The probability of reaching state j for the first time in n steps, starting from X0 = i, is
fij(n) = Pr{ Tij = n }
The probability of ever reaching j starting from state i is
fij = Pr{ Tij < ∞ } = Σn≥1 fij(n)
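The first-passage probabilities fij(n) can be computed with the standard recursion fij(1) = pij and fij(n) = Σ over m ≠ j of pim fmj(n−1); this recursion and the example chain below are my additions rather than formulas shown on the slide.

```python
import numpy as np

def first_passage(P, i, j, n_max=50):
    """Return [fij(1), ..., fij(n_max)], where fij(n) = P(first visit to j at step n | X0 = i)."""
    n_states = len(P)
    f_prev = P[:, j].copy()            # fkj(1) = pkj for every start state k
    out = [f_prev[i]]
    for _ in range(2, n_max + 1):
        f_next = np.zeros(n_states)
        for k in range(n_states):
            # paths must avoid j before the final step, so sum over intermediate states m != j
            f_next[k] = sum(P[k, m] * f_prev[m] for m in range(n_states) if m != j)
        f_prev = f_next
        out.append(f_prev[i])
    return out

# Hypothetical irreducible 3-state chain.
P = np.array([[0.5, 0.5, 0.0],
              [0.25, 0.5, 0.25],
              [0.0, 0.5, 0.5]])
fs = first_passage(P, 0, 2)
print(sum(fs))   # fij = sum_n fij(n); close to 1 here because state 2 is always eventually reached
```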
Three Theorems
• If a Markov chain has a finite state space, then at least one of the states is recurrent.
• If state i is recurrent and state j is reachable from state i, then state j is also recurrent.
• If S is a finite closed irreducible set of states, then every state in S is recurrent.
Positive and Null Recurrent States
Let Mi be the mean recurrence time of state i. A state is said to be positive recurrent if Mi < ∞. If Mi = ∞ then the state is said to be null recurrent.
Three Theorems
• If state i is positive recurrent and state j is reachable from state i, then state j is also positive recurrent.
• If S is a closed irreducible set of states, then every state in S is positive recurrent, or every state in S is null recurrent, or every state in S is transient.
• If S is a finite closed irreducible set of states, then every state in S is positive recurrent.
Example
In the reducible chain from the earlier example, states 0 and 1 are transient states, states 2 and 3 are positive recurrent states, and state 4 is a recurrent (absorbing) state (state-transition diagram with transitions p00, p01, p10, p12, p14, p23, p32, p33 omitted).
Periodic and Aperiodic States
Suppose that the structure of the Markov chain is such that state i is visited after a number of steps that is an integer multiple of an integer d > 1. Then the state is called periodic with period d. If no such integer exists (i.e., d = 1) then the state is called aperiodic.
Example: a three-state chain (states 0, 1, 2 with transition probabilities 1 and 0.5) whose states are periodic with period d = 2 (state-transition diagram omitted).
Steady State Analysis
Recall that the state probability, i.e., the probability of finding the MC at state j after the kth step, is given by:
πj(k) = Pr{ Xk = j }
• An interesting question is what happens in the "long run", i.e.,
πj = limk→∞ πj(k)
• This is referred to as the steady state, equilibrium, or stationary state probability.
• Questions:
  • Do these limits exist?
  • If they exist, do they converge to a legitimate probability distribution, i.e., Σj πj = 1?
  • How do we evaluate πj, for all j?
Steady State Analysis
Recall the recursive probability
π(k+1) = π(k) P
• If a steady state exists, then π(k+1) = π(k) = π, and therefore the steady state probabilities are given by the solution to the equations
π = π P   and   Σj πj = 1
• Even in an irreducible Markov chain, the presence of periodic states prevents the existence of a steady state probability.
• Example: periodic.m
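The recursion π(k+1) = π(k)P can be iterated directly to see whether the state probabilities settle down. The sketch below is my own illustration rather than the periodic.m script referenced on the slide: for an aperiodic chain the iteration converges, while for a period-2 chain it keeps oscillating, so no limit exists.

```python
import numpy as np

def iterate_distribution(P, pi0, k_max=100):
    """Apply pi(k+1) = pi(k) P repeatedly and return the resulting distribution."""
    pi = np.asarray(pi0, dtype=float)
    for _ in range(k_max):
        pi = pi @ P
    return pi

aperiodic = np.array([[0.5, 0.5],
                      [0.2, 0.8]])
periodic = np.array([[0.0, 1.0],
                     [1.0, 0.0]])

start = [1.0, 0.0]
print(iterate_distribution(aperiodic, start, 100))   # converges to the stationary vector [2/7, 5/7]
print(iterate_distribution(periodic, start, 100))    # [1, 0]
print(iterate_distribution(periodic, start, 101))    # [0, 1] -- oscillates, no steady state
```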
Steady State Analysis
• THEOREM: In an irreducible aperiodic Markov chain consisting of positive recurrent states, a unique stationary state probability vector π exists such that πj > 0 and
πj = limk→∞ πj(k) = 1 / Mj
where Mj is the mean recurrence time of state j.
• The steady state vector π is determined by solving
π = π P   and   Σj πj = 1
• Such a chain is called an ergodic Markov chain.
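For an ergodic chain, the stationary vector can also be obtained directly from π = πP together with Σj πj = 1, for example by replacing one balance equation with the normalization condition. A minimal sketch under that approach (my own, assuming numpy; the example chain is hypothetical):

```python
import numpy as np

def stationary_distribution(P):
    """Solve (P^T - I) pi = 0 with sum(pi) = 1 by replacing one equation with normalization."""
    n = len(P)
    A = P.T - np.eye(n)        # each row is one balance equation
    A[-1, :] = 1.0             # replace the last equation with sum_j pi_j = 1
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

P = np.array([[0.5, 0.5],
              [0.2, 0.8]])
pi = stationary_distribution(P)
print(pi)          # [2/7, 5/7]
print(1.0 / pi)    # mean recurrence times Mj = 1/pi_j, as in the theorem
```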
Discrete Birth-Death Example
Consider a discrete birth-death chain on states 0, 1, 2, …: from state i the chain moves up to state i+1 with probability 1−p and down to state i−1 with probability p, and from state 0 it stays at 0 with probability p (state-transition diagram omitted).
• Thus, to find the steady state vector π we need to solve
π = π P   and   Σj πj = 1
Discrete Birth-Death Example
• In other words
π0 = p π0 + p π1
πj = (1−p) πj−1 + p πj+1,  for j ≥ 1
• Solving these equations we get
π1 = ((1−p)/p) π0
• In general
πj = ((1−p)/p)^j π0
• Summing all terms we get
1 = Σj≥0 πj = π0 Σj≥0 ((1−p)/p)^j
Discrete Birth-Death Example
• Therefore, for all states j we get
πj = ((1−p)/p)^j (2p−1)/p,  valid when (1−p)/p < 1, i.e., p > 1/2
• If p < 1/2, then all states are transient.
• If p > 1/2, then all states are positive recurrent.
Discrete Birth-Death Example
• If p = 1/2, then all states are null recurrent.
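For p > 1/2, the closed-form πj = ((1−p)/p)^j (2p−1)/p can be checked against a numerically solved, truncated version of the chain. This sketch is my own; the truncation size and the way the boundary state is handled are assumptions.

```python
import numpy as np

def birth_death_matrix(p, n_states):
    """Truncated birth-death chain: up with prob 1-p, down with prob p; boundaries keep the lost mass."""
    P = np.zeros((n_states, n_states))
    P[0, 0], P[0, 1] = p, 1 - p
    for i in range(1, n_states - 1):
        P[i, i - 1], P[i, i + 1] = p, 1 - p
    P[-1, -2], P[-1, -1] = p, 1 - p      # truncation: the "up" probability stays at the last state
    return P

def stationary_distribution(P):
    n = len(P)
    A = P.T - np.eye(n)
    A[-1, :] = 1.0
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

p, n_states = 0.7, 60
pi_numeric = stationary_distribution(birth_death_matrix(p, n_states))
r = (1 - p) / p
pi_closed = np.array([(r ** j) * (2 * p - 1) / p for j in range(n_states)])
print(np.max(np.abs(pi_numeric - pi_closed)))   # tiny: only the truncation error remains
```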
Reducible Markov Chains
A reducible chain consists of a transient set T and one or more closed irreducible sets (here S1 and S2) (state diagram omitted).
• In steady state, we know that the Markov chain will eventually end up in one of the irreducible sets (or in an absorbing state) and remain there; within that set the previous analysis still holds.
• The only question that arises, in case there are two or more irreducible sets, is the probability that the chain ends up in each set.
Reducible Markov Chains
Consider a transient set T containing states i and r, and an irreducible set S containing states s1, …, sn (state diagram omitted).
• Suppose we start from state i. Then, there are two ways to go to S:
  • in one step, or
  • go to some state r ∈ T after k steps, and then to S.
• Define fi(S) as the probability that, starting from state i, the chain eventually enters S.
Reducible Markov Chains
• First consider the one-step transition:
Pr{ X1 ∈ S | X0 = i } = Σ over j ∈ S of pij
• Next consider the general case for k = 2, 3, …
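To complete the picture computationally, the probability of ending up in a given closed set S starting from a transient state can be obtained by first-step analysis: with Q the transition matrix restricted to transient states and R the transitions from transient states into S, the vector of probabilities f satisfies f = Qf + R·1. This formulation and the example chain are my additions, not the slide's own derivation (which continues beyond this excerpt).

```python
import numpy as np

def absorption_probability(P, transient, target_set):
    """Probability, for each transient state, of eventually entering target_set.

    transient  : list of transient state indices
    target_set : list of state indices forming a closed (irreducible or absorbing) set
    """
    Q = P[np.ix_(transient, transient)]      # transient -> transient block
    R = P[np.ix_(transient, target_set)]     # transient -> target block
    # f = Q f + R 1  =>  (I - Q) f = R 1
    rhs = R.sum(axis=1)
    return np.linalg.solve(np.eye(len(transient)) - Q, rhs)

# Hypothetical reducible chain: states 0, 1 transient, {2, 3} a closed set, state 4 absorbing.
P = np.array([
    [0.2, 0.8, 0.0, 0.0, 0.0],
    [0.3, 0.0, 0.4, 0.0, 0.3],
    [0.0, 0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.5, 0.5, 0.0],
    [0.0, 0.0, 0.0, 0.0, 1.0],
])
print(absorption_probability(P, transient=[0, 1], target_set=[2, 3]))
# The two entries are the probabilities of ending up in {2, 3} starting from states 0 and 1;
# the complements are the probabilities of being absorbed in state 4.
```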