
Discrete Time Markov Chains




  1. Discrete Time Markov Chains EE384X Review 2 Winter 2006

  2. Outline • Some examples • Definitions • Stationary Distributions References (on reserve in library): 1. Hoel, Port, and Stone: Introduction to Stochastic Processes 2. Wolff: Stochastic Modeling and the Theory of Queues

  3. Simple DTMCs • "States" can be labeled 0, 1, 2, 3, … • At every time slot a "jump" decision is made randomly based on the current state (sometimes the arrow pointing back to the same state is omitted) [Figures: a two-state chain with transition probabilities p, 1-p, q, 1-q, and a three-state chain with transition labels a–f]

  4. 1-D Random Walk • Time is slotted • The walker flips a coin every time slot to decide which way to go: one direction with probability p, the other with probability 1-p [Figure: position X(t) on a line]
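The walk above is easy to simulate. Below is a minimal Python sketch (the slides contain no code, so the language and the helper name `random_walk` are illustrative choices, not from the original):

```python
import random

def random_walk(p, steps, seed=0):
    """Simulate a slotted 1-D random walk: step +1 w.p. p, -1 w.p. 1-p."""
    rng = random.Random(seed)
    x = 0
    path = [x]  # X(t) for t = 0, 1, ..., steps
    for _ in range(steps):
        x += 1 if rng.random() < p else -1
        path.append(x)
    return path

path = random_walk(p=0.5, steps=1000)
```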

  5. Single Server Queue • Consider a queue at a supermarket • In every time slot: • A customer arrives with probability p • The HoL (head-of-line) customer leaves with probability q [Figure: Bernoulli(p) arrivals into a queue with Geom(q) service]

  6. Birth-Death Chain • Can be modeled by a birth-death chain (a.k.a. Geom/Geom/1 queue) • Want to know: • Queue size distribution • Average waiting time, etc. [Figure: chain on states 0, 1, 2, 3, …]
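The queue itself can also be simulated slot by slot. A sketch, assuming arrivals are processed before departures within a slot (the slides do not fix this ordering) and with an illustrative function name:

```python
import random

def simulate_queue(p, q, slots, seed=1):
    """Single-server slotted queue: arrival w.p. p, HoL departure w.p. q.
    Returns the queue length observed at the end of each slot."""
    rng = random.Random(seed)
    n = 0
    lengths = []
    for _ in range(slots):
        if rng.random() < p:            # Bernoulli(p) arrival
            n += 1
        if n > 0 and rng.random() < q:  # HoL customer departs w.p. q
            n -= 1
        lengths.append(n)
    return lengths

lengths = simulate_queue(p=0.3, q=0.5, slots=100000)
avg = sum(lengths) / len(lengths)  # long-run average queue size
```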

  7. Markov Property • “Future” is independent of “Past” given “Present” • In other words: Memoryless • We’ve seen memoryless distributions: Exponential and Geometric • Useful for modeling and analyzing real systems

  8. Discrete Time Markov Chains • A sequence of random variables {Xn} is called a Markov chain if it has the Markov property: P{Xn+1 = j | Xn = i, Xn-1 = in-1, …, X0 = i0} = P{Xn+1 = j | Xn = i} • States are usually labeled 0, 1, 2, … • State space can be finite or infinite
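A chain specified by a transition matrix can be sampled directly; because the next state depends only on the current one, a single state variable suffices. A minimal sketch (helper name `simulate_chain` is illustrative):

```python
import random

def simulate_chain(P, x0, steps, seed=0):
    """Simulate a finite DTMC given its transition matrix P (list of rows)."""
    rng = random.Random(seed)
    x = x0
    states = [x]
    for _ in range(steps):
        u, cum = rng.random(), 0.0
        for j, pij in enumerate(P[x]):  # sample next state from row P[x]
            cum += pij
            if u < cum:
                break
        x = j  # j is the sampled next state (last state if rounding falls short)
        states.append(x)
    return states

# Two-state chain in the style of the earlier slide, with p = 0.3, q = 0.4
P = [[0.7, 0.3],
     [0.4, 0.6]]
states = simulate_chain(P, x0=0, steps=1000)
```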

  9. Transition Probability • Probability to jump from state i to state j: pij = P{Xn+1 = j | Xn = i} • Assume stationary (time-homogeneous): pij is independent of n • Transition probability matrix: P = (pij) • Two-state MC: P = [ 1-p  p ; q  1-q ]

  10. Stationary Distribution • Define πk(i) = P{Xk = i} and the row vector πk = (πk(0), πk(1), …) • Then πk+1 = πk P • Stationary distribution: π = limk→∞ πk, if the limit exists • If π exists, we can solve for it by π = π P, Σi π(i) = 1
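For small chains, π = πP with Σi π(i) = 1 can be found numerically by simply iterating πk+1 = πk P until it stops changing (power iteration). This is a standard technique, not something the slides prescribe; a sketch:

```python
def stationary(P, tol=1e-12, max_iter=100000):
    """Find the stationary distribution of P by iterating pi <- pi * P."""
    n = len(P)
    pi = [1.0 / n] * n  # start from the uniform distribution
    for _ in range(max_iter):
        new = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
        if max(abs(a - b) for a, b in zip(pi, new)) < tol:
            return new
        pi = new
    return pi

# Two-state chain with p = 0.3, q = 0.4; exact answer is (q, p)/(p+q)
P = [[0.7, 0.3],
     [0.4, 0.6]]
pi = stationary(P)
```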

  11. Balance Equations • π = πP gives, for every state i: π(i) Σj≠i pij = Σj≠i π(j) pji • These are called balance equations • Transitions in and out of state i are balanced

  12. In General • If we partition all the states into two sets, then transitions between the two sets must be “balanced”. • Equivalent to a bi-section in the state transition graph • This can be easily derived from the Balance Equations

  13. Conditions for π to Exist (I) • Definitions: • State j is reachable from state i if pij(n) > 0 for some n • States i and j communicate if they are reachable from each other • The Markov chain is irreducible if all states communicate
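Irreducibility can be checked mechanically: every state must be able to reach every other state along edges with positive transition probability. A sketch using a graph search (function name is illustrative):

```python
def irreducible(P):
    """Check irreducibility of a finite DTMC: from every state, a search
    along positive-probability edges must reach all states."""
    n = len(P)
    for s in range(n):
        seen = {s}
        stack = [s]
        while stack:
            i = stack.pop()
            for j in range(n):
                if P[i][j] > 0 and j not in seen:
                    seen.add(j)
                    stack.append(j)
        if len(seen) < n:
            return False  # some state is unreachable from s
    return True

irreducible([[0.7, 0.3], [0.4, 0.6]])  # True: the two states communicate
irreducible([[1.0, 0.0], [0.5, 0.5]])  # False: state 0 is absorbing
```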

  14. Conditions for π to Exist (I) (cont'd) • Condition: the Markov chain is irreducible • Counter-examples: [Figures: two reducible chains, e.g. one whose state graph splits into groups that cannot reach each other, and one with an absorbing state (p = 1 self-loop)]

  15. Conditions for π to Exist (II) • Condition: the Markov chain is aperiodic • Counter-example: [Figure: a periodic chain whose transition probabilities are all 0 or 1, cycling deterministically through its states]

  16. Conditions for π to Exist (III) • Condition: the Markov chain is positive recurrent • State i is recurrent if the chain returns to i with probability 1 • Otherwise transient • If recurrent: • State i is positive recurrent if E(Ti) < ∞, where Ti is the time between visits to state i • Otherwise null recurrent

  17. Solving for π • Two-state chain: jump from 0 to 1 w.p. p, from 1 to 0 w.p. q • Balance across the cut between the two states: π(0) p = π(1) q • With π(0) + π(1) = 1: π(0) = q/(p+q), π(1) = p/(p+q)
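For the two-state chain, balance plus normalization gives the standard closed form π(0) = q/(p+q), π(1) = p/(p+q), which a few lines can sanity-check numerically (p and q values below are arbitrary illustrative choices):

```python
# Two-state chain: jump 0 -> 1 w.p. p, jump 1 -> 0 w.p. q
p, q = 0.3, 0.4
pi0 = q / (p + q)
pi1 = p / (p + q)

assert abs(pi0 * p - pi1 * q) < 1e-12  # balance across the cut {0} | {1}
assert abs(pi0 + pi1 - 1.0) < 1e-12    # normalization
```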

  18. Birth-Death Chain • Arrival w.p. p; departure w.p. q • Let u = p(1-q), d = q(1-p), r = u/d • Balance equations across each cut: π(0) u = π(1) d, π(1) u = π(2) d, … [Figure: chain on states 0, 1, 2, 3, … with up-probability u, down-probability d, and self-loop probability 1-u-d (1-u at state 0)]

  19. Birth-Death Chain (cont'd) • Continuing like this, we can derive: π(i-1) u = π(i) d • Equivalently, we can draw a bi-section between state i and state i-1 • Therefore, we have π(i) = r π(i-1) = r^i π(0)

  20. Birth-Death Chain (cont'd) • Normalizing Σi π(i) = 1 with π(i) = r^i π(0): π(0) Σi≥0 r^i = 1 • For r < 1, the geometric series gives π(0) = 1 - r, hence π(i) = (1 - r) r^i
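The resulting geometric distribution π(i) = (1 - r) r^i is straightforward to compute and check against the balance equation π(i-1) u = π(i) d. A sketch (function name and the state-space truncation are illustrative):

```python
def birth_death_pi(p, q, n_states=50):
    """Stationary distribution of the Geom/Geom/1 birth-death chain,
    truncated to n_states states: pi(i) = (1 - r) * r**i, r = u/d."""
    u = p * (1 - q)  # effective birth probability
    d = q * (1 - p)  # effective death probability
    r = u / d
    assert r < 1, "positive recurrent only for r < 1"
    return [(1 - r) * r ** i for i in range(n_states)]

pi = birth_death_pi(p=0.3, q=0.5)  # here u = 0.15, d = 0.35, r = 3/7
```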

  21. Any Problems? • What if r is greater than 1? • Then the stationary distribution does not exist • Which condition does it violate? (Positive recurrence: Σi r^i diverges, so π cannot be normalized)

  22. 2×2 Switch w/ HoL Blocking • Packets arrive as Bernoulli iid, destinations uniform • Packets are queued at the inputs • Only one packet can leave an output every time slot [Figure: two input queues feeding outputs 1 and 2]

  23. 2×2 Switch (cont'd) • If both HoL packets are destined to the same output: • Only one of them is served (chosen randomly) • The other output is idle, as its packets are blocked • This is called head-of-line (HoL) blocking • HoL blocking reduces throughput • Want to know: throughput of this switch

  24. 2×2 Switch - DTMC • States are the numbers of HoL packets destined to output 1 and output 2 • But states (0,2) and (2,0) are the same by symmetry • Can "collapse" them together [Figure: chain on states (2,0), (1,1), (0,2) with transition probabilities 0.25 and 0.5, collapsed into two states]

  25. 2×2 Switch - DTMC (cont'd) • Now P{(0,2)} = P{(1,1)} = 0.5 • Switch throughput = 0.5×1 + 0.5×2 = 1.5 packets/slot • Per-output throughput = 1.5/2 = 0.75 [Figure: two-state chain with transition probabilities 0.5]
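The 1.5 packets/slot figure can be cross-checked by simulating the saturated switch directly. The sketch below assumes both inputs always hold packets and a blocked HoL packet keeps its destination in the next slot, consistent with the HoL model above (function name is illustrative):

```python
import random

def switch_throughput(slots, seed=0):
    """Saturated 2x2 switch with HoL blocking: total packets served per slot."""
    rng = random.Random(seed)
    hol = [rng.randrange(2), rng.randrange(2)]  # destination of each HoL packet
    served = 0
    for _ in range(slots):
        if hol[0] != hol[1]:
            served += 2                   # both outputs busy; both redraw
            hol = [rng.randrange(2), rng.randrange(2)]
        else:
            served += 1                   # conflict: one served, one blocked
            winner = rng.randrange(2)     # served input redraws a destination
            hol[winner] = rng.randrange(2)
    return served / slots

tp = switch_throughput(200000)  # long-run total throughput, both outputs
```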

  26. Another Method to Find π • Sometimes the Markov chain is not easy to solve analytically • Can run the Markov chain for a long time; then {fraction of time spent in state i} → π(i)
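This empirical approach is easy to implement: run the chain and count visits. A sketch, using the two-state example with p = 0.3, q = 0.4, whose exact answer is π = (4/7, 3/7):

```python
import random

def empirical_pi(P, steps, seed=0):
    """Estimate the stationary distribution as the fraction of time
    the chain spends in each state over a long run."""
    rng = random.Random(seed)
    n = len(P)
    counts = [0] * n
    x = 0
    for _ in range(steps):
        counts[x] += 1
        u, cum = rng.random(), 0.0
        for j in range(n):  # sample next state from row P[x]
            cum += P[x][j]
            if u < cum:
                break
        x = j
    return [c / steps for c in counts]

P = [[0.7, 0.3],
     [0.4, 0.6]]
pi_hat = empirical_pi(P, steps=200000)
```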
