## An Introduction to Markov Chains

Homer and Marge repeatedly play a gambling game. Each time they play, the probability that Homer wins is 0.4, and the probability that Homer loses is 0.6.

### A "Drunkard's Walk"

[Figure: a random walk on the states 0, 1, 2, 3, 4, with transition probabilities P(Homer wins) = 0.4 and P(Homer loses) = 0.6]
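The state diagram suggests a gambler's-ruin walk on the states 0 through 4: Homer's fortune moves up by 1 with probability 0.4 and down by 1 with probability 0.6, stopping at either barrier. A minimal simulation sketch under those assumptions (the starting state of 2 and the absorbing barriers at 0 and 4 are not stated in the slides):

```python
import random

def drunkards_walk(start=2, p_win=0.4, lower=0, upper=4):
    """Simulate one gambler's-ruin walk on the states lower..upper.

    Homer's fortune moves +1 with probability p_win and -1 otherwise;
    the walk stops when it hits either absorbing barrier.
    Returns the absorbing state reached (lower or upper).
    """
    state = start
    while lower < state < upper:
        state += 1 if random.random() < p_win else -1
    return state

# Estimate P(Homer reaches 4 before going broke), starting from 2.
trials = 100_000
wins = sum(drunkards_walk(start=2) == 4 for _ in range(trials))
print(f"Estimated P(Homer reaches 4 first): {wins / trials:.3f}")
```

For comparison, the classical gambler's-ruin formula gives the exact probability of reaching N before 0 from state i as (1 - (q/p)^i) / (1 - (q/p)^N); with p = 0.4, i = 2, N = 4 this is about 0.308, which the simulation should approximate.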
