Presentation Transcript
Markov Chains
  • Plan:
    • Introduce basics of Markov models
    • Define terminology for Markov chains
    • Discuss properties of Markov chains
    • Show examples of Markov chain analysis
      • On-Off traffic model
      • Markov-Modulated Poisson Process
      • Server farm model
      • Erlang B blocking formula
Definition: Markov Chain
  • A discrete-state Markov process
  • Has a set S of discrete states: |S| > 1
  • Changes randomly between states in a sequence of discrete steps
  • Here, time is continuous, even though the states themselves are discrete
  • Very general modeling technique used for system state, occupancy, traffic, queues, ...
  • Analogy: Finite State Machine (FSM) in CS
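
To make the definition concrete, here is a minimal simulation sketch (an illustration, not part of the slides) of a two-state continuous-time Markov chain. The ON/OFF state names, the rate values, and the function name are assumptions chosen for the example.

```python
import random

# Illustrative two-state continuous-time Markov chain (assumed example):
# each state i has an exponential sojourn time with rate q_i.
RATES = {"ON": 1.0, "OFF": 0.5}          # q_i: rate of leaving each state
NEXT_STATE = {"ON": "OFF", "OFF": "ON"}  # with two states, the jump is forced

def simulate(initial_state="OFF", horizon=10.0, seed=42):
    """Simulate state visits until the cumulative time exceeds the horizon."""
    random.seed(seed)
    t, state, path = 0.0, initial_state, []
    while t < horizon:
        sojourn = random.expovariate(RATES[state])   # exponential sojourn time
        path.append((state, t, min(t + sojourn, horizon)))
        t += sojourn
        state = NEXT_STATE[state]                    # memoryless jump to the next state
    return path

if __name__ == "__main__":
    for state, start, end in simulate():
        print(f"{state:>3}: {start:6.2f} -> {end:6.2f}")
```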
Some Terminology (1 of 3)
  • Markov property: the future behaviour of a Markov process depends only on its current state, not on its past history (i.e., how it got there, or when)
  • A manifestation of the memoryless property, from the underlying assumption of exponential distributions
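
The memoryless property can be checked empirically. This short sketch (an illustration, not from the slides; the rate and time values are assumed) estimates P(X > s + t | X > s) and P(X > t) for an exponential random variable and shows they agree.

```python
import random

# Empirical check of the memoryless property:
# for X ~ Exp(rate), P(X > s + t | X > s) should equal P(X > t).
random.seed(1)
rate, s, t, n = 2.0, 0.5, 1.0, 200_000
samples = [random.expovariate(rate) for _ in range(n)]

p_gt_t = sum(x > t for x in samples) / n
survivors = [x for x in samples if x > s]
p_cond = sum(x > s + t for x in survivors) / len(survivors)

print(f"P(X > t)           ~= {p_gt_t:.4f}")
print(f"P(X > s+t | X > s) ~= {p_cond:.4f}")   # should be close to the line above
```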
Some Terminology (2 of 3)
  • The time spent in a given state on a given visit is called the sojourn time
  • Sojourn times are exponentially distributed and independent
  • Each state i has a rate parameter q_i that characterizes its sojourn times (mean sojourn time 1/q_i)
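
A quick numerical illustration (the rate value is assumed, not from the slides): sampling exponential sojourn times with rate q_i recovers the mean 1/q_i.

```python
import random

# Illustrative check: with rate q_i, the mean sojourn time should be 1/q_i.
random.seed(0)
q_i = 2.5                                   # assumed rate for some state i
sojourns = [random.expovariate(q_i) for _ in range(100_000)]
print(f"empirical mean sojourn: {sum(sojourns) / len(sojourns):.4f}")
print(f"theoretical mean 1/q_i: {1.0 / q_i:.4f}")
```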
Some Terminology (3 of 3)
  • The probability of changing from state i to state j is denoted by p_ij
  • This is called the transition probability (in the continuous-time setting, transitions are instead characterized by transition rates)
  • Often expressed in matrix format
  • Important parameters that characterize the system behaviour
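
In matrix form, a transition probability matrix for a hypothetical three-state chain might look like the sketch below (the entries are assumed for illustration). Each row must sum to 1, and powers of the matrix give multi-step transition probabilities.

```python
import numpy as np

# Hypothetical 3-state transition probability matrix P; entry P[i, j] = p_ij.
P = np.array([
    [0.7, 0.2, 0.1],
    [0.3, 0.5, 0.2],
    [0.0, 0.4, 0.6],
])

assert np.allclose(P.sum(axis=1), 1.0)   # each row is a probability distribution

# n-step transition probabilities: P^n[i, j] = P(in state j after n steps | start in i)
P5 = np.linalg.matrix_power(P, 5)
print(np.round(P5, 3))
```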
Properties of Markov Chains
  • Irreducibility: every state is reachable from every other state (i.e., there are no useless, redundant, or dead-end states)
  • Ergodicity: a Markov chain is ergodic if it is irreducible, aperiodic, and positive recurrent (i.e., the chain returns to each state within a finite expected time, and the possible return paths do not share a common period)
  • Stationarity: the state probabilities converge to a time-invariant (stationary) distribution
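
Of these properties, irreducibility is the easiest to check mechanically: every state must be reachable from every other state in the transition graph. The sketch below (an illustration with an assumed transition matrix, not from the slides) does this with a breadth-first search.

```python
from collections import deque

def is_irreducible(P):
    """Return True if every state can reach every other state (P: list of rows)."""
    n = len(P)
    for start in range(n):
        seen, queue = {start}, deque([start])
        while queue:
            i = queue.popleft()
            for j in range(n):
                if P[i][j] > 0 and j not in seen:   # edge i -> j exists
                    seen.add(j)
                    queue.append(j)
        if len(seen) < n:                           # some state unreachable from 'start'
            return False
    return True

# Assumed example: state 2 is absorbing, so the chain is not irreducible.
P_bad = [[0.5, 0.5, 0.0],
         [0.2, 0.3, 0.5],
         [0.0, 0.0, 1.0]]
print(is_irreducible(P_bad))   # False
```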
Analysis of Markov Chains
  • The analysis of Markov chains focuses on steady-state behaviour of the system
  • Also called equilibrium, or long-run, behaviour: the limit as time t approaches infinity
  • Well-defined steady-state probabilities p_i exist (non-negative, one per mutually exclusive state, summing to 1)
  • Flow balance equations can be applied to solve for these probabilities (see the sketch below)
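
For a discrete-time chain with transition matrix P, the balance equations pi P = pi together with the normalization sum_i p_i = 1 determine the steady-state probabilities. The sketch below (an assumed two-state numerical example, not from the slides) solves them directly.

```python
import numpy as np

def steady_state(P):
    """Solve pi @ P = pi with sum(pi) = 1 (P is an n x n transition matrix)."""
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n),     # balance equations: (P^T - I) pi = 0
                   np.ones((1, n))])    # normalization: sum(pi) = 1
    b = np.append(np.zeros(n), 1.0)
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])              # assumed two-state example
print(steady_state(P))                  # ~ [0.8333, 0.1667]
```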
Examples of Markov Chains
  • Traffic modeling: On-Off process
  • Interrupted Poisson Process (IPP)
  • Markov-Modulated Poisson Process
  • Computer repair models (server farm)
  • Erlang B blocking formula
  • Birth-Death processes
  • M/M/1 Queueing Analysis
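
As a concrete instance of one item in this list, the Erlang B blocking probability for m servers and an offered load of a Erlangs satisfies the standard recursion B(0, a) = 1 and B(m, a) = a*B(m-1, a) / (m + a*B(m-1, a)). The sketch below is a generic implementation of that recursion, not necessarily the derivation the presentation follows; the example inputs are assumed.

```python
def erlang_b(servers: int, offered_load: float) -> float:
    """Blocking probability for an M/M/m/m loss system (Erlang B recursion)."""
    b = 1.0                                     # B(0, a) = 1
    for m in range(1, servers + 1):
        b = offered_load * b / (m + offered_load * b)
    return b

# Example: 10 servers offered 7 Erlangs of traffic.
print(f"blocking probability: {erlang_b(10, 7.0):.4f}")
```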