
Day 3 Markov Chains


Presentation Transcript


  1. Day 3 Markov Chains For some interesting demonstrations of this topic visit: http://ocw.mit.edu/OcwWeb/Mathematics/18-06Spring-2005/Tools/index.htm

  2. Equations of the form uk+1 = Auk are called discrete equations because they model the system only at whole-number time increments. A difference equation is an equation involving differences. We can view a difference equation from at least three points of view: as a sequence of numbers, as a discrete dynamical system, and as an iterated function. It is the same thing looked at from different angles.
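As a concrete illustration, here is a minimal sketch (my own, assuming Python with NumPy, which the slides do not specify) of iterating a difference equation uk+1 = Auk, using the Fibonacci rule that is linked on the last slide:

    import numpy as np

    # Fibonacci written as a discrete (difference) equation u_{k+1} = A u_k
    A = np.array([[1, 1],
                  [1, 0]])
    u = np.array([1, 0])      # u_0 = (F_1, F_0)

    for k in range(8):
        u = A @ u             # step the system one whole-number time increment
        print(u[1])           # prints F_1, F_2, ... = 1, 1, 2, 3, 5, 8, 13, 21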

  3. Difference Equations vs Differential Equations. Dynamical systems come with many different names. The dynamical systems we are particularly interested in are those whose state depends on the input history. In discrete time we call such a system a difference equation (the counterpart of a differential equation in continuous time).

  4. Markov Matrices. Consider the matrix
      A = [ .1  .01  .3
            .2  .99  .3
            .7  0    .4 ]
Properties of Markov matrices: all entries are ≥ 0, and all columns add up to one. Each column represents probabilities. Note: the powers of the matrix will maintain these properties.
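A small NumPy check (my own sketch, not part of the slides) that this A satisfies the two Markov properties and that its powers keep them:

    import numpy as np

    A = np.array([[0.1, 0.01, 0.3],
                  [0.2, 0.99, 0.3],
                  [0.7, 0.00, 0.4]])

    print((A >= 0).all())              # all entries are >= 0 -> True
    print(A.sum(axis=0))               # each column sums to 1 -> [1. 1. 1.]

    A5 = np.linalg.matrix_power(A, 5)  # powers of A stay Markov
    print((A5 >= 0).all(), A5.sum(axis=0))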

  5. Markov Matrices (same matrix A as above). 1 is an eigenvalue of every Markov matrix. Why? Subtract 1 from each entry on the diagonal. Each column of A - I then adds to zero, which means the rows are dependent, which means the matrix A - I is singular.
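The same argument can be seen numerically (a sketch, assuming NumPy): the columns of A - I sum to zero, so A - I is singular and λ = 1 is an eigenvalue.

    import numpy as np

    A = np.array([[0.1, 0.01, 0.3],
                  [0.2, 0.99, 0.3],
                  [0.7, 0.00, 0.4]])
    I = np.eye(3)

    print((A - I).sum(axis=0))      # each column of A - I sums to 0
    print(np.linalg.det(A - I))     # determinant is (numerically) 0, so A - I is singular
    print(np.linalg.eigvals(A))     # one of the eigenvalues is 1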

  6. Markov Matrices (same matrix A as above). One eigenvalue is one; all other eigenvalues have absolute value ≤ 1. We are interested in raising A to some power. If 1 is an eigenvalue and all other eigenvalues are less than 1 in absolute value, then the steady state is the eigenvector for λ = 1. Note: this requires n independent eigenvectors.
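A sketch of what this means for powers of A (assuming NumPy and the specific matrix above): the λ = 1 direction survives and the other directions die out.

    import numpy as np

    A = np.array([[0.1, 0.01, 0.3],
                  [0.2, 0.99, 0.3],
                  [0.7, 0.00, 0.4]])

    print(np.abs(np.linalg.eigvals(A)))     # one eigenvalue is 1, the rest have |lambda| <= 1

    u0 = np.array([1.0, 0.0, 0.0])          # an arbitrary starting vector
    uk = np.linalg.matrix_power(A, 200) @ u0
    print(uk)                               # approaches a multiple of the lambda = 1 eigenvector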

  7. Shortcuts for finding eigenvectors. To find the eigenvector that corresponds to λ = 1, use det(A - 1I) = 0:
      A - I = [ -.9   .01   .3
                 .2  -.01   .3
                 .7   0    -.6 ]
Use < .6, ?, .7 > to make the last row give zero, then use the top row to get the missing middle value: -.9(.6) + .01(x) + .3(.7) = 0 gives x = 33. The eigenvector is < .6, 33, .7 >.
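That hand shortcut can be double-checked numerically (again a sketch assuming NumPy):

    import numpy as np

    A = np.array([[0.1, 0.01, 0.3],
                  [0.2, 0.99, 0.3],
                  [0.7, 0.00, 0.4]])

    x = np.array([0.6, 33.0, 0.7])   # candidate eigenvector from the slide
    print((A - np.eye(3)) @ x)       # ~[0, 0, 0], so A x = x and lambda = 1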

  8. Applications of Markov Matrices. Markov matrices are used when the probability of an event depends on its current state. For this model, the probabilities must remain constant over time, and the total population is not changing over time. Markov matrices have applications in electrical engineering.

  9. Applications of Markov Matrices: uk+1 = Auk. Suppose we have two cities, Suzhou (S) and Hangzhou (H), with initial condition at k = 0 of S = 0 and H = 1000. We would like to describe the movement of population between these two cities:
      [ uS ]        [ .9  .2 ] [ uS ]
      [ uH ]k+1  =  [ .1  .8 ] [ uH ]k
The left side is the population of Suzhou and Hangzhou at time k+1; the vector on the right is the population at time k. Column 1: .9 of the people in S stay there and .1 move to H. Column 2: .8 of the people in H stay there and .2 move to S.
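A short sketch of this two-city model (assuming NumPy; the ordering (S, H) follows the slide):

    import numpy as np

    A = np.array([[0.9, 0.2],    # column 1: 90% stay in S, 10% move to H
                  [0.1, 0.8]])   # column 2: 80% stay in H, 20% move to S
    u = np.array([0.0, 1000.0])  # k = 0: S = 0, H = 1000

    for k in range(3):           # populations after the first few steps
        u = A @ u
        print(k + 1, u)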

  10. Applications of Markov Matrices: uk+1 = Auk with
      A = [ .9  .2
            .1  .8 ]
Find the eigenvalues and eigenvectors.

  11. Applications of Markov Matrices: uk+1 = Auk. The eigenvalues are 1 and .7 (from the properties of Markov matrices and the trace). The eigenvectors are Ker(A - I) and Ker(A - .7I):
      A - I = [ -.1   .2
                 .1  -.2 ]    Ker = <2, 1>
      A - .7I = [ .2  .2
                  .1  .1 ]    Ker = <1, -1>
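The hand computation can be verified numerically (a sketch, assuming NumPy; eig returns unit-length columns, so they come out as scalar multiples of the vectors above):

    import numpy as np

    A = np.array([[0.9, 0.2],
                  [0.1, 0.8]])

    vals, vecs = np.linalg.eig(A)
    print(vals)    # eigenvalues: 1.0 and 0.7 (order may vary)
    print(vecs)    # columns proportional to <2, 1> and <1, -1>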

  12. Applications: uk+1 = Auk with the matrix above. Eigenvalue 1 has eigenvector <2, 1>; eigenvalue .7 has eigenvector <-1, 1>. This tells us about the behavior as t → ∞: λ = 1 gives a steady state, while the λ = .7 part disappears as t → ∞. The eigenvector tells us that the steady state has the ratio 2:1. The total population is still 1000, so the final populations will be 1000·(2/3) and 1000·(1/3).

  13. Applications. Initial condition at k = 0: S = 0 and H = 1000. To find the amounts after a finite number of steps:
      A^k u0 = c1 (1)^k [ 2 ]  +  c2 (.7)^k [ -1 ]
                        [ 1 ]               [  1 ]
Use the initial condition to solve for the constants:
      [   0  ] = c1 [ 2 ] + c2 [ -1 ]   gives   c1 = 1000/3,  c2 = 2000/3
      [ 1000 ]      [ 1 ]      [  1 ]
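The constants can also be found by solving the small linear system numerically (a sketch assuming NumPy; V holds the eigenvectors as columns):

    import numpy as np

    V = np.array([[2.0, -1.0],    # eigenvector columns: <2,1> and <-1,1>
                  [1.0,  1.0]])
    u0 = np.array([0.0, 1000.0])  # k = 0: S = 0, H = 1000

    c = np.linalg.solve(V, u0)    # c1 = 1000/3, c2 = 2000/3
    print(c)

    k = 10                        # populations after k steps, from the closed form
    uk = c[0] * 1.0**k * V[:, 0] + c[1] * 0.7**k * V[:, 1]
    print(uk)                     # (S, H) at time k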

  14. Steady state for Markov Matrices. Every Markov chain settles to a steady state. The steady state is the eigenvector for the eigenvalue λ = 1.
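A sketch (assuming NumPy) of computing the steady state directly: take the λ = 1 eigenvector and scale it so the entries sum to the total population.

    import numpy as np

    A = np.array([[0.9, 0.2],
                  [0.1, 0.8]])

    vals, vecs = np.linalg.eig(A)
    v = vecs[:, np.argmin(np.abs(vals - 1.0))]   # eigenvector for lambda = 1
    steady = 1000 * v / v.sum()                  # rescale to the total population
    print(steady)                                # about [666.7, 333.3]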

  15. Applications of Markov matrices. Airlines – Markov matrices are used in creating networks for airlines to determine the routes of planes. The sum of the probabilities in each column is 1 because all planes in a given location go somewhere (including possibly not moving). Airlines want to create flight plans so they do not end up with too many planes in one part of the world and not enough in another.

  16. More applications of Markov matrices. Game theory – setting house rules for casinos that ensure the casino comes out ahead. Economics – economic mobility over generations. http://www.facstaff.bucknell.edu/ap030/Math345LAApplications/MarkovProcesses.html

  17. Homework (diff 3): review worksheet 8.1 #3-6, 8, 9, 13; eigenvalue review worksheet #1-5. "Genius is one per cent inspiration, ninety-nine per cent perspiration." – Thomas Alva Edison

  18. For More information visit: http://people.revoledu.com/kardi/tutorial/DifferenceEquation/WhatIsDifferenceEquation.htm http://www.math.duke.edu/education/ccp/materials/linalg/diffeqs/diffeq2.html Fibonacci via matrices http://www.maths.leeds.ac.uk/applied/0380/fibonacci03.pdf
