Agenda for This Week


Presentation Transcript


  1. Agenda for This Week

  2. Chapter 17 Markov Processes – Part 3

  3. Review • A Markov Process describes a situation where a system is in one state at a time • Switching between states is probabilistic • The state of the system depends ONLY on the previous state of the system • Steady State probabilities: the long-term probability of being in a particular state, no matter which state you begin in
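
  To make the steady-state idea concrete, here is a minimal Python sketch; the 2-state transition matrix P is hypothetical, chosen only for illustration:

```python
# Sketch: steady-state probabilities by repeated transitions.
# The 2-state transition matrix P below is hypothetical (rows sum to 1).
import numpy as np

P = np.array([[0.9, 0.1],
              [0.3, 0.7]])

state = np.array([1.0, 0.0])   # start entirely in state 1
for _ in range(1000):          # repeated transitions converge
    state = state @ P

print(state)  # ~[0.75, 0.25]
```

  Starting from [0.0, 1.0] instead gives the same result, which is exactly the "no matter which state you begin in" property.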

  4. Absorbing States • Markov chains can also be used to analyze the properties of a system in which some states are “absorbing”, that is, where once the system reaches that state, it never leaves that state. • In an absorbing state, the probability that the process remains in that state once it enters the state is 1. • States that are not absorbing are called “transient states”.

  5. Absorbing State Examples • Account aging in Accounts Receivable departments • Movement of biological populations to extinction • Depletion of non-renewable resources (such as oil, gas, etc.)

  6. Absorbing States • Provided that all states “communicate” with each other, the system will eventually end up in one of the absorbing states. • In other words, as long as there is a pathway from each transient state to an absorbing state, the system will eventually end up in an absorbing state.

  7. Absorbing States • If a Markov chain has both absorbing and nonabsorbing states, the states may be rearranged so that the transition matrix can be written as the following composition of four submatrices: I, 0, R, and Q:

  P = [ I  0 ]
      [ R  Q ]

  where I is an identity matrix (each absorbing state returns to itself with probability 1), 0 is a matrix of zeros, R contains the transition probabilities from nonabsorbing to absorbing states, and Q contains the transition probabilities among the nonabsorbing states.
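
  As a sketch of that rearrangement, the snippet below (using a hypothetical 4-state matrix, not one from the slides) detects the absorbing states and reorders rows and columns into the I/0/R/Q form:

```python
# Sketch: reorder a transition matrix into the [[I, 0], [R, Q]] form.
# The 4-state matrix P here is hypothetical, chosen only for illustration.
import numpy as np

P = np.array([[0.5, 0.2, 0.1, 0.2],   # transient state
              [0.0, 1.0, 0.0, 0.0],   # absorbing (returns to itself)
              [0.3, 0.1, 0.5, 0.1],   # transient state
              [0.0, 0.0, 0.0, 1.0]])  # absorbing (returns to itself)

absorbing = [i for i in range(len(P)) if P[i, i] == 1.0]
transient = [i for i in range(len(P)) if P[i, i] < 1.0]
order = absorbing + transient          # absorbing states first

canonical = P[np.ix_(order, order)]    # reorder rows and columns together
print(canonical)                       # top-left 2x2 block is I, top-right is 0
```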

  8. Absorbing States (the figure on this slide is not preserved in the transcript)

  9. Fundamental Matrix • The computation of absorbing state probabilities requires the determination and use of what is called the fundamental matrix. • The fundamental matrix, N, is the inverse of the difference between the identity matrix and the Q matrix. Note: I and Q must be the same size, e.g., 2×2 or 3×3.

  N = (I − Q)^-1
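
  As a quick sketch, here is the computation of N with numpy, using the Q matrix from the tree-farm example later in the deck:

```python
# Sketch: fundamental matrix N = (I - Q)^-1 with numpy.
# Q is the transient-to-transient block from the tree-farm example below.
import numpy as np

Q = np.array([[0.5, 0.2],    # Too small -> (Too small, Available)
              [0.0, 0.5]])   # Available -> (Too small, Available)

N = np.linalg.inv(np.eye(2) - Q)   # I and Q must be the same size
print(N)                           # [[2.0, 0.8], [0.0, 2.0]]
```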

  10. NR Matrix • The NR matrix is the product of the fundamental (N) matrix and the (R) matrix. • It gives the probabilities of eventually moving from each nonabsorbing state to each absorbing state. • Multiplying any vector of initial nonabsorbing state probabilities by NR gives the vector of probabilities for the process eventually reaching each of the absorbing states.
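
  Continuing the sketch above, NR is a single matrix product; N and R here are the tree-farm values worked out on the following slides:

```python
# Sketch: absorption probabilities via NR (values from the tree-farm example).
import numpy as np

N = np.array([[2.0, 0.8],
              [0.0, 2.0]])
R = np.array([[0.1, 0.2],    # Too small -> (Sold, Lost)
              [0.4, 0.1]])   # Available -> (Sold, Lost)

NR = N @ R     # row i: probability of eventually ending in each absorbing state
print(NR)      # [[0.52, 0.48], [0.80, 0.20]]
```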

  11. Calculating Inverse Matrices • Use the following equations to calculate the inverse (I − Q)^-1 of a 2×2 matrix:

  For a matrix [ a11  a12 ]
               [ a21  a22 ]

  the determinant is d = (a11)(a22) − (a21)(a12)

  and the inverse is (1/d) [  a22  −a12 ]
                           [ −a21   a11 ]
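
  A quick check of these hand formulas against numpy, using the (I − Q) entries from the example on the next slides:

```python
# Sketch: the 2x2 determinant/inverse formulas, checked against numpy.
import numpy as np

a11, a12 = 0.5, -0.2                    # entries of I - Q from the example below
a21, a22 = 0.0, 0.5

d = a11 * a22 - a21 * a12               # determinant: 0.25
inverse = np.array([[ a22, -a12],       # swap the diagonal entries,
                    [-a21,  a11]]) / d  # negate the off-diagonal entries

print(inverse)   # [[2.0, 0.8], [0.0, 2.0]]
print(np.linalg.inv(np.array([[a11, a12], [a21, a22]])))  # same result
```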

  12. Absorbing State Example #12 • A Christmas tree farm has 5000 trees: 1500 are classified as protected (too small for cutting) and 3500 are available for cutting. • Even if a tree is available, it may not be sold. • Most trees that are not cut live to the next year, but some diseased trees are lost each year. • State 1: Cut and Sold • State 2: Lost to disease • State 3: Too small for cutting • State 4: Available but not cut and sold

  P = [ 1    0    0    0  ]   (Cut and Sold)
      [ 0    1    0    0  ]   (Lost to disease)
      [ .1   .2   .5   .2 ]   (Too small)
      [ .4   .1   0    .5 ]   (Available)

  13. Transition Matrix • With the absorbing states listed first, P already has the I, 0, R, Q partition:

  P = [ I  0 ]      I = [ 1  0 ]     0 = [ 0  0 ]
      [ R  Q ]          [ 0  1 ]         [ 0  0 ]

                    R = [ .1  .2 ]   Q = [ .5  .2 ]
                        [ .4  .1 ]       [ 0   .5 ]

  14. #12 Continued

  N = (I − Q)^-1

  I − Q = [ 1  0 ] − [ .5  .2 ] = [ .5  −.2 ]
          [ 0  1 ]   [ 0   .5 ]   [ 0    .5 ]

  d = (.5)(.5) − (0)(−.2) = .25

  N = (I − Q)^-1 = (1/.25) [ .5   .2 ] = [ 2   .8 ]
                           [ 0    .5 ]   [ 0   2  ]

  15. #12 Continued

  • NR = [ 2   .8 ] [ .1  .2 ] = [ .52  .48 ]
         [ 0   2  ] [ .4  .1 ]   [ .80  .20 ]

  • If we have 5000 trees (1500 protected and 3500 available), we can multiply this initial vector by the NR matrix to find how many trees will eventually be sold and how many will be lost:

  [ 1500  3500 ] × NR = [ 3580  1420 ]

  • So 3580 trees are eventually cut and sold, and 1420 are lost to disease.
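
  Putting the whole example together, here is a short sketch that reproduces the 3580/1420 split:

```python
# Sketch: the full example end to end, reproducing the 3580/1420 split.
import numpy as np

Q = np.array([[0.5, 0.2],    # Too small -> (Too small, Available)
              [0.0, 0.5]])   # Available -> (Too small, Available)
R = np.array([[0.1, 0.2],    # Too small -> (Sold, Lost)
              [0.4, 0.1]])   # Available -> (Sold, Lost)

N = np.linalg.inv(np.eye(2) - Q)   # fundamental matrix
NR = N @ R                         # absorption probabilities

initial = np.array([1500, 3500])   # protected (too small), available
print(initial @ NR)                # [3580. 1420.] -> sold, lost
```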

  16. For Next Class • Do HWs 1-4 • Try # 13 • Look for Case 3 on Class Website
