
Markov Chains



Presentation Transcript


  1. Markov Chains Kevin Comer CSS 650 January 31, 2013

  2. Definitions • Markov Chain: A random process with the Markov Property • Markov Property: Given the present state, the future and past states are independent. • “Memorylessness” • Possible values of Xi form the state space

  3. Simple Example Source: http://en.wikipedia.org/wiki/File:MarkovChain1.png
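A two-state chain like the one in the linked figure can be simulated directly; a minimal sketch in Python, with transition probabilities assumed for illustration (the figure's exact values are not reproduced here):

```python
import random

# Transition probabilities for a two-state chain; these numbers are
# assumed for illustration, not read off the slide's figure.
P = {"E": {"E": 0.3, "A": 0.7},
     "A": {"E": 0.4, "A": 0.6}}

def step(state, rng):
    """Sample the next state from the row of P for the current state."""
    r = rng.random()
    cum = 0.0
    for nxt, p in P[state].items():
        cum += p
        if r < cum:
            return nxt
    return nxt  # guard against floating-point round-off

def simulate(start, n, seed=0):
    """Run the chain n steps; only the current state matters (Markov property)."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

path = simulate("E", 10)
```

Note that `step` looks only at the current state, never at the history: that is exactly the memorylessness from the definitions slide.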

  4. Reducibility • State j is accessible from state i if there is a non-zero probability of transitioning from i to j in some number of steps • State j communicates with state i if j is accessible from i and i is accessible from j • State i is essential if every state accessible from i also has i accessible from it • A Markov chain is irreducible if every state communicates with every other state.
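Accessibility and irreducibility translate directly into a reachability check on the transition graph; a sketch, with an assumed reducible example matrix:

```python
from collections import deque

def accessible(P, i):
    """States j reachable from i with non-zero probability in some number of steps."""
    seen = {i}
    queue = deque([i])
    while queue:
        s = queue.popleft()
        for t, p in enumerate(P[s]):
            if p > 0 and t not in seen:
                seen.add(t)
                queue.append(t)
    return seen

def irreducible(P):
    """True when every state communicates with every other state."""
    n = len(P)
    return all(accessible(P, i) == set(range(n)) for i in range(n))

# Assumed example: state 2 is absorbing, so nothing else is accessible
# from it and the chain is reducible.
P = [[0.5, 0.5, 0.0],
     [0.2, 0.3, 0.5],
     [0.0, 0.0, 1.0]]
```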

  5. Periodicity • A state i has period k if k is the greatest common divisor of the numbers of steps in which a return to i can occur • Example: if, starting in state i, one can return to i in {6, 8, 10, 12…} steps, state i has period k = 2 • If k = 1, the state is aperiodic, meaning returns to state i can occur at irregular times • A Markov chain is aperiodic if and only if every state in it is aperiodic.
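The period of a state can be computed as the gcd of the step counts at which a return has positive probability; a sketch using matrix powers, with assumed example chains:

```python
from math import gcd

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def period(P, i, max_steps=20):
    """gcd of step counts n <= max_steps with P^n[i][i] > 0 (0 if no return seen)."""
    g = 0
    M = P
    for n in range(1, max_steps + 1):
        if n > 1:
            M = matmul(M, P)
        if M[i][i] > 0:
            g = gcd(g, n)
    return g

flip = [[0.0, 1.0], [1.0, 0.0]]   # deterministic swap: returns only at even steps
lazy = [[0.5, 0.5], [0.5, 0.5]]   # can stay put, so returns at every step
```

Truncating at `max_steps` is a practical approximation; for small chains like these the gcd stabilizes after a few steps.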

  6. Recurrence • A state is transient if there is a non-zero probability that the chain will never return to it • If a state is not transient, it is recurrent • A recurrent state has a finite hitting time with probability 1 • The mean recurrence time Mi is the expected return time to state i • If Mi is finite, the state is positive recurrent • Otherwise, the state is null recurrent.

  7. Recurrence • Expected number of visits: a state i is recurrent if and only if the expected number of return visits over an infinite horizon is infinite, i.e. Σn Pii(n) = ∞, where Pii(n) is the n-step return probability • A state is absorbing if it is impossible to leave it: Pii = 1, and Pij = 0 for i ≠ j

  8. Ergodicity • A state is ergodic if it is aperiodic and positive recurrent. • If all states in an irreducible Markov chain are ergodic, then the chain is ergodic.

  9. Variations of Markov Chains • Time-homogeneous Markov chain • Probability of transition is independent of time n • Markov chain of order m • Future state depends on the past m states (given that n > m) • Can be reformulated as a first-order chain on a state space of ordered m-tuples of X values
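The order-m reformulation can be sketched for an assumed order-2 chain on {0, 1}: the first-order state is the ordered pair of the last two values.

```python
from itertools import product

# Assumed order-2 chain on {0, 1}: the probability that the next value is 1
# depends on the last two values (a, b). These numbers are illustrative.
p_next = {(0, 0): 0.1, (0, 1): 0.4, (1, 0): 0.6, (1, 1): 0.9}

states = list(product([0, 1], repeat=2))  # state space of ordered 2-tuples

# First-order transition matrix on pairs: from (a, b) the chain can only
# move to (b, 0) or (b, 1), since the new pair keeps the most recent value.
P = {(a, b): {(b, 1): p_next[(a, b)], (b, 0): 1.0 - p_next[(a, b)]}
     for (a, b) in states}
```

The pair-valued chain is an ordinary Markov chain: its next state depends only on the current pair, even though the original process looked two steps back.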

  10. Steady State Analysis • Markov process can be described by a time-independent transition matrix P = (pij) • Stationary distribution represented by vector π • Assuming the state space is finite, the stationary distribution satisfies πP = π with Σj πj = 1
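One way to find a vector with πP = π is power iteration: repeatedly multiply an initial distribution by P until it stops changing. A sketch with an assumed 2×2 matrix:

```python
# Assumed 2x2 transition matrix for illustration.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def stationary(P, iters=1000):
    """Approximate pi with pi P = pi by repeatedly multiplying a uniform start by P."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

pi = stationary(P)  # for this P, pi = (5/6, 1/6)
```

Power iteration converges for ergodic chains; a reducible or periodic chain may need the direct linear-system solve instead.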

  11. Model of Class Mobility • System of equations πj = Σi πi pij (since Σj πj = 1, one of these equations can be replaced by the normalization condition) • Solving this system gives the stationary probabilities π1, π2, π3

  12. Branching Process • Consider a population where each individual produces j ≥ 0 new offspring with probability Pj • The population at any given time n is Xn • State Xn = 0 is a recurrent state, all others are transient • Let π0 denote the probability that the population will eventually die out.
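Conditioning on the first generation, π0 is the smallest fixed point of π0 = Σj Pj π0^j; a sketch that finds it by fixed-point iteration from 0, with an assumed offspring distribution:

```python
# Assumed offspring distribution: P_j for j = 0, 1, 2 offspring.
P_offspring = [0.25, 0.25, 0.5]

def extinction_prob(P, iters=1000):
    """Smallest fixed point of x = sum_j P_j * x**j, found by iterating from 0."""
    x = 0.0
    for _ in range(iters):
        x = sum(p * x**j for j, p in enumerate(P))
    return x
```

Here the mean number of offspring is 1.25 > 1, so extinction is not certain: the iteration converges to π0 = 0.5 rather than to the other fixed point at 1.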

  13. Time-Reversible Markov Chains • Assume a stationary ergodic Markov chain that we want to reverse • The reversed chain has transition probabilities Qij = πjPji/πi • If Qij = Pij for all i, j, then the chain is considered time reversible • This can also be expressed as πiPij = πjPji for all i, j (detailed balance)
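The detailed-balance condition πiPij = πjPji can be checked numerically; a sketch with two assumed examples (any two-state chain is reversible, while a biased cycle is not):

```python
def reversible(P, pi, tol=1e-9):
    """Check detailed balance pi_i P_ij = pi_j P_ji for all pairs of states."""
    n = len(P)
    return all(abs(pi[i] * P[i][j] - pi[j] * P[j][i]) < tol
               for i in range(n) for j in range(n))

# Assumed examples: a two-state chain with its stationary distribution,
# and a biased 3-cycle (doubly stochastic, so its stationary law is uniform).
P2, pi2 = [[0.9, 0.1], [0.5, 0.5]], [5.0 / 6.0, 1.0 / 6.0]
P3 = [[0.0, 0.9, 0.1],
      [0.1, 0.0, 0.9],
      [0.9, 0.1, 0.0]]
pi3 = [1.0 / 3.0] * 3
```

The cycle fails because probability flows around it in one direction: watching the film backwards would reveal the bias reversed.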

  14. Continuous-Time Markov Process • Length of time in each state is exponentially distributed (i.e. “memoryless”) • Transition rate matrix Q; the rate of leaving state i is −qii • Given n systems in state i, they will transition to state j at a rate of nqij (provided large enough n) • Since the transition rates in each row sum to 0, qii = −Σj≠i qij
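The exponential holding times make simulation straightforward: wait an Exp(−qii)-distributed time, then jump with probabilities proportional to the off-diagonal rates of the current row. A sketch with an assumed two-state rate matrix:

```python
import random

# Assumed two-state rate matrix; each row sums to 0, so qii equals minus
# the sum of the off-diagonal rates in row i.
Q = [[-2.0, 2.0],
     [1.0, -1.0]]

def simulate(Q, start, t_end, seed=0):
    """Hold an Exp(-qii) time in each state, then jump to j with
    probability proportional to qij."""
    rng = random.Random(seed)
    t, state = 0.0, start
    path = [(0.0, start)]
    while True:
        t += rng.expovariate(-Q[state][state])  # exponential holding time
        if t >= t_end:
            return path
        weights = [Q[state][j] if j != state else 0.0 for j in range(len(Q))]
        r = rng.random() * sum(weights)
        for j, w in enumerate(weights):
            if r < w:
                state = j
                break
            r -= w
        path.append((t, state))

path = simulate(Q, 0, 10.0)
```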

  15. Process Example

  16. Social Science Applications • Economics • Macroeconomic business cycle • Economic development of countries • Mathematical Biology • Birth and Death Rates • Simple Example: Conway’s Game of Life • Epidemiology (SIR model) • Queuing Theory • Music
