
Introduction to Concepts: Markov Chains and Processes



Presentation Transcript


  1. Introduction to Concepts: Markov Chains and Processes

  2. Why? • Core Statistical Processes • Appear in Nature • Exemplify Performance Measurement • Discrete and Continuous Time Versions • Fun!

  3. DEFINITION {Xn, n >= 0} hops around on state space {…, -2, -1, 0, 1, 2, …} according to probability transition matrix P: Pi,j = P[Xn+1 = j | Xn = i], ai = Prob[X0 = i]
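The definition above can be turned into a short simulation. This is a minimal sketch, not from the slides: it uses a finite state space for illustration (the slide's state space is infinite), and the function and variable names are mine.

```python
import random

def simulate_chain(P, a, n_steps, seed=0):
    """Simulate {Xn} on states 0..len(P)-1 with transition matrix P
    and initial distribution a: ai = Prob[X0 = i]."""
    rng = random.Random(seed)

    def draw(dist):
        # Sample a state index from a probability vector.
        u, cum = rng.random(), 0.0
        for state, prob in enumerate(dist):
            cum += prob
            if u < cum:
                return state
        return len(dist) - 1  # guard against float round-off

    x = draw(a)               # X0 ~ a
    path = [x]
    for _ in range(n_steps):
        x = draw(P[x])        # X(n+1) ~ row Xn of P
        path.append(x)
    return path
```

Each step only reads row `P[x]` for the current state `x`, which is exactly the Markov property at work.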

  4. DEFINITION • X’s future is completely characterized by its current state • X is “Markov” • P is a “stochastic matrix” • its rows sum to 1

  5. EXAMPLE • state 0: sunny • state 1: rainy • P[stay sunny] = .8, P[sunny to rainy] = .2 • P[rainy to sunny] = .3, P[stay rainy] = .7 Tomorrow’s weather depends ONLY on today’s weather.
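A quick sketch of the weather chain, assuming the transition probabilities read off the slide's diagram (stay sunny with probability .8, stay rainy with probability .7); the helper name `step` is mine.

```python
# Two-state weather chain: state 0 = sunny, state 1 = rainy.
# Assumed matrix (read off the slide's diagram).
P = [[0.8, 0.2],
     [0.3, 0.7]]

def step(dist, P):
    """One day of weather: push a distribution over states through P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

today = [1.0, 0.0]            # certainly sunny today
tomorrow = step(today, P)     # [0.8, 0.2]
day_after = step(tomorrow, P) # [0.70, 0.30]
```

Note that `day_after` depends on `today` only through `tomorrow`: the chain has no memory beyond its current state.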

  6. EXAMPLE: FROG ON THE ROAD • p = Prob[jump fwd] • q = Prob[jump back] • infinite state space • Prob[X11 = 5]? • Queuing! • state = number of customers in system • p = Prob[next event is arrival] • q = Prob[next event is service completion]
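The slide's question Prob[X11 = 5] has a closed form for a walk starting at 0: the frog must make 8 forward and 3 backward jumps in some order, so the probability is C(11,8) p^8 q^3. A sketch checking that against Monte Carlo (function names are mine, p + q = 1 assumed):

```python
import random
from math import comb

def estimate_p_x11_equals_5(p, trials=200_000, seed=1):
    """Monte Carlo estimate of Prob[X11 = 5] for a walk starting at X0 = 0."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x = 0
        for _ in range(11):
            x += 1 if rng.random() < p else -1  # fwd w.p. p, back w.p. q = 1-p
        hits += (x == 5)
    return hits / trials

def exact_p_x11_equals_5(p):
    """8 forward jumps and 3 back, in any order."""
    return comb(11, 8) * p**8 * (1 - p)**3
```

For the queueing reading of the same chain, the state would be reflected at 0 (an empty system cannot lose a customer), so this free walk is the frog version only.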

  7. MARKOV CHAINS IN NATURE • Neurology • Marketing • Gambling • Inventory • Manpower • Electronic Support Measures • Communications • Service Models • Air-to-Air Combat

  8. Chapman-Kolmogorov Equation • For any r < n: Pni,j = Σk Pri,k Pn-rk,j (the sum runs over all N states in the state space) Upshot: Pn = n-step transition probability matrix.
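The upshot is easy to check numerically: raising P to the n-th power gives the n-step transition matrix, and splitting the exponent any way (r and n-r) gives the same answer. A minimal sketch with pure-Python matrix helpers (names are mine):

```python
def mat_mul(A, B):
    """Matrix product, the discrete Chapman-Kolmogorov sum over states k."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def mat_pow(P, n):
    """P raised to the n-th power, starting from the identity."""
    result = [[float(i == j) for j in range(len(P))] for i in range(len(P))]
    for _ in range(n):
        result = mat_mul(result, P)
    return result
```

With the weather chain, `mat_pow(P, 5)` matches `mat_mul(mat_pow(P, 2), mat_pow(P, 3))` entry by entry, which is Chapman-Kolmogorov with n = 5, r = 2.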

  9. LIMITING DISTRIBUTION • If the limiting distribution pi exists (pij = lim n→∞ Pni,j, independent of the starting state i), the process must be aperiodic.

  10. LIMITING DISTRIBUTION …completely useless but interesting. Better think about it…
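One concrete way to see the limiting distribution: push any starting distribution through P repeatedly until it stops changing. A sketch (function name mine; assumes the chain is irreducible and aperiodic so the limit exists):

```python
def limiting_distribution(P, iters=200):
    """Power iteration: start uniform, apply P until the distribution settles."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi
```

For the weather chain the limit is pi = (0.6, 0.4): in the long run 60% of days are sunny, regardless of today's weather.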

  11. CONTINUOUS TIME MARKOV CHAINS {X(t), t >= 0} is a continuous-time process with > sojourn times S0, S1, S2, ... > embedded process Xn = X(Sn-1+) X is a CTMC if the sojourn time in state i is ~ Exp(qi), where i = Xn
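The definition suggests a direct simulation recipe: hold in the current state for an Exp(qi) sojourn, then jump according to the embedded chain. A sketch under those assumptions (names and the two-state example rates are mine):

```python
import random

def simulate_ctmc(q, P_embed, x0, t_max, seed=3):
    """Simulate a CTMC: exponential sojourns with rate q[x],
    jumps drawn from row x of the embedded chain P_embed."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    path = [(t, x)]                       # (jump time, state entered)
    while True:
        t += rng.expovariate(q[x])        # sojourn ~ Exp(q[x])
        if t >= t_max:
            break
        u, cum = rng.random(), 0.0        # jump via the embedded chain
        for j, pj in enumerate(P_embed[x]):
            cum += pj
            if u < cum:
                x = j
                break
        path.append((t, x))
    return path
```

With q = [1.0, 2.0] and an embedded chain that always flips states, the path alternates between the two states, spending (on average) twice as long in state 0 as in state 1.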

  12. MATRICES Probability Transition Matrix

  13. GENERATOR MATRIX • gives rise to the numerical methods involving raising a matrix to a power • row sums equal zero • diagonally dominant
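A sketch of building a generator matrix Q from the sojourn rates and the embedded chain, assuming the standard construction Qi,j = qi Pi,j off the diagonal and Qi,i = -qi (the function name is mine):

```python
def generator(q, P_embed):
    """Generator Q: off-diagonal Q[i][j] = q[i] * P_embed[i][j],
    diagonal Q[i][i] = -q[i], so every row sums to zero."""
    n = len(q)
    return [[-q[i] if i == j else q[i] * P_embed[i][j]
             for j in range(n)] for i in range(n)]
```

The two slide properties fall out by construction: each row sums to q[i]*(sum of the embedded row) - q[i] = 0, and |Q[i][i]| equals the sum of the off-diagonal entries in its row (diagonal dominance).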

  14. M/M/1 QUEUE • λ = rate of arrival (# per unit time) • μ = rate of service (1/μ = avg service time)
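The standard M/M/1 steady-state formulas follow from these two rates (requiring λ < μ for stability). A small sketch collecting them; the function name and the returned keys are mine:

```python
def mm1_metrics(lam, mu):
    """Textbook M/M/1 steady-state formulas; requires lam < mu."""
    assert lam < mu, "queue is unstable unless arrival rate < service rate"
    rho = lam / mu                 # utilization: fraction of time server busy
    L = rho / (1 - rho)            # mean number in system
    W = 1 / (mu - lam)             # mean time in system
    Lq = rho ** 2 / (1 - rho)      # mean number waiting in queue
    Wq = rho / (mu - lam)          # mean wait before service starts
    return {"rho": rho, "L": L, "W": W, "Lq": Lq, "Wq": Wq}
```

As a sanity check, the outputs satisfy Little's law L = λW.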

  15. FIRST PASSAGE TIME Want to know how soon X(t) reaches a special state: mi = E[min{t : X(t) is “special”} | X(0) = i]
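For a discrete-time chain, the mean first-passage times solve the linear system mi = 1 + Σ(j not special) Pi,j mj. A sketch solving that system with plain Gaussian elimination (the function name and target-set argument are mine):

```python
def first_passage_times(P, targets):
    """Solve m_i = 1 + sum_{j not in targets} P[i][j] * m_j
    for all non-target states i, via Gaussian elimination."""
    states = [i for i in range(len(P)) if i not in targets]
    n = len(states)
    # Build (I - P_restricted) m = 1.
    A = [[(1.0 if a == b else 0.0) - P[states[a]][states[b]]
          for b in range(n)] for a in range(n)]
    b = [1.0] * n
    for col in range(n):                      # forward elimination w/ pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    m = [0.0] * n                             # back substitution
    for r in range(n - 1, -1, -1):
        m[r] = (b[r] - sum(A[r][c] * m[c] for c in range(r + 1, n))) / A[r][r]
    return dict(zip(states, m))
```

For the weather chain with "rainy" as the special state, m_sunny = 1 + 0.8 m_sunny gives m_sunny = 5: starting from a sunny day, rain arrives after 5 days on average.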

  16. LIMITING DISTRIBUTION Corollary of the General Key Renewal Theorem
