
A Markov Chain Model of Baseball

Eric Kuennen

Department of Mathematics

University of Wisconsin Oshkosh

kuennene@uwosh.edu

Used as a project for an undergraduate Stochastic Modeling course

Presented at: Joint Mathematics Meetings

Washington, D.C.

January 6, 2009



Markov Chain Model for Baseball

  • View an inning of baseball as a stochastic process with 25 possible states.

  • There are 8 different arrangements of runners on the bases (bases empty, runner on 1st, runner on 2nd, runner on 3rd, runners on 1st and 2nd, runners on 1st and 3rd, runners on 2nd and 3rd, bases loaded) and three possibilities for the number of outs (0 outs, 1 out, 2 outs), for a total of 24 non-absorbing states.

  • The 25th state (3 outs) is an absorbing state for the inning.
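
To make the state space concrete, here is a minimal Python sketch (my own illustration; the original course materials use Maple and Minitab) that enumerates the 24 non-absorbing states as (runner on 1st, runner on 2nd, runner on 3rd, outs) tuples, with the 3-out absorbing state kept separate.

```python
from itertools import product

# 24 non-absorbing states: a 0/1 flag for a runner on 1st, 2nd, and 3rd,
# plus the number of outs (0, 1, or 2).
STATES = [(on1, on2, on3, outs)
          for outs in range(3)
          for on1, on2, on3 in product((0, 1), repeat=3)]
assert len(STATES) == 24

THREE_OUTS = None  # the 25th, absorbing state (3 outs) ends the inning
```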



Transition Probabilities

  • A Markov Chain is a stochastic process in which the next state depends only on the present state; in other words, given the present state, the future is independent of the past.

  • Let Pij denote the probability that the next state is j, given that the current state is i.

  • Form the Transition Matrix T = [Pij].

w = probability of a walk

s = probability of a single

d = probability of a double

t = probability of a triple

h = probability of a home run

out = probability of an out



Transition Matrix



Run Matrix
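
The transition matrix and run matrix themselves were shown as slide images. As a hedged illustration of how they can be assembled from w, s, d, t, h, and out, here is a Python/NumPy sketch; the base-running rules in next_state (a single advances every runner exactly one base, a double two bases, a walk moves only forced runners, an out never advances runners) are simplifying assumptions of mine and may differ from the rules used in the actual project. Rather than storing the full run matrix, the sketch accumulates r[i], the expected runs scored on one plate appearance from state i, which is the quantity the expected-run-value calculation needs.

```python
from itertools import product
import numpy as np

# Plate-appearance probabilities (the 2005 MLB values quoted later in the
# talk); they sum to 0.995 after rounding, so normalize so that each row of
# the transition matrix sums to 1.
PROBS = {"walk": 0.094, "single": 0.157, "double": 0.049,
         "triple": 0.005, "homer": 0.029, "out": 0.661}
total = sum(PROBS.values())
PROBS = {event: p / total for event, p in PROBS.items()}

STATES = [(on1, on2, on3, outs)
          for outs in range(3)
          for on1, on2, on3 in product((0, 1), repeat=3)]
INDEX = {s: i for i, s in enumerate(STATES)}
ABSORBING = 24  # index of the 3-outs state

def next_state(state, event):
    """Return (new_state, runs_scored) under simplified base-running."""
    on1, on2, on3, outs = state
    if event == "out":
        return (None, 0) if outs == 2 else ((on1, on2, on3, outs + 1), 0)
    if event == "walk":                      # only forced runners advance
        runs = 1 if (on1 and on2 and on3) else 0
        new3 = 1 if (on1 and on2) else on3
        new2 = 1 if on1 else on2
        return ((1, new2, new3, outs), runs)
    if event == "single":                    # every runner advances one base
        return ((1, on1, on2, outs), on3)
    if event == "double":                    # every runner advances two bases
        return ((0, 1, on1, outs), on2 + on3)
    if event == "triple":
        return ((0, 0, 1, outs), on1 + on2 + on3)
    if event == "homer":
        return ((0, 0, 0, outs), 1 + on1 + on2 + on3)
    raise ValueError(event)

# Transition matrix T = [Pij] (25 x 25) and r[i] = expected runs scored on a
# single transition out of state i (a compact stand-in for the run matrix).
T = np.zeros((25, 25))
r = np.zeros(24)
for s in STATES:
    i = INDEX[s]
    for event, p in PROBS.items():
        new, runs = next_state(s, event)
        j = ABSORBING if new is None else INDEX[new]
        T[i, j] += p
        r[i] += p * runs
T[ABSORBING, ABSORBING] = 1.0  # three outs is absorbing
```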



Methods of Analysis

Theoretical Calculations with Maple

  • Expected Run Values for each state

  • Steady State Probability Vector

  • Expected Value of a given play in a given state or in general



Expected Run Values

  • Let vi be the expected number of runs scored starting in state i

  • Conditioning on the outcome of the next plate appearance gives the linear system vi = ri + Σj Pij vj, where ri is the expected number of runs scored on a single transition out of state i; students use Maple’s linear algebra package to solve this system for the vector v
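
A NumPy sketch of this solve, assuming the T matrix and r vector assembled as in the earlier transition-matrix sketch (the original project does the calculation in Maple):

```python
import numpy as np

def expected_run_values(T, r):
    """Solve (I - Q) v = r for the expected runs v[i] from each of the 24
    non-absorbing states, where Q is the transient-to-transient block of the
    25 x 25 transition matrix T and r holds the expected runs per transition."""
    Q = T[:24, :24]
    return np.linalg.solve(np.eye(24) - Q, r)

# v = expected_run_values(T, r)
# v[0] is then the expected number of runs in an inning that starts with the
# bases empty and no outs (state (0, 0, 0, 0) in the earlier indexing).
```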



Expected Run Values

From 2005 MLB:

w = .094   s = .157   d = .049
t = .005   h = .029   out = .661



Sacrifice Bunting

Is it ever advantageous to sacrifice bunt?
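
One simple way to attack this question with the expected run values (a sketch of mine, using the v vector and INDEX dictionary from the earlier sketches): compare the expected runs before the bunt with the expected runs in the state an idealized, always-successful sacrifice produces.

```python
def bunt_raises_expected_runs(v, INDEX, outs):
    """With a runner on 1st and `outs` outs, does an idealized sacrifice bunt
    (runner to 2nd, batter out, no other effects) raise the expected runs?
    v and INDEX are as built in the earlier sketches."""
    if outs == 2:
        return False                         # the bunt would end the inning
    return v[INDEX[(0, 1, 0, outs + 1)]] > v[INDEX[(1, 0, 0, outs)]]
```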



Stealing Bases

How successful does a base-stealer need to be, on average, for it to be worthwhile to attempt to steal second base with a runner on first and no outs?
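
One way to frame the break-even calculation in expected runs (a sketch; the presentation may instead answer this by simulation or in terms of the probability of scoring): a steal attempt from (runner on 1st, no outs) either succeeds into (runner on 2nd, no outs) or fails into (bases empty, one out), and the break-even success rate makes the gamble equal in value to staying put.

```python
def break_even_steal_rate(v_stay, v_success, v_caught):
    """Smallest success probability p at which the attempt is worthwhile in
    expected runs:  p * v_success + (1 - p) * v_caught >= v_stay."""
    return (v_stay - v_caught) / (v_success - v_caught)

# With v and INDEX from the earlier sketches:
#   v_stay    = v[INDEX[(1, 0, 0, 0)]]   # runner on 1st, no outs
#   v_success = v[INDEX[(0, 1, 0, 0)]]   # runner on 2nd, no outs
#   v_caught  = v[INDEX[(0, 0, 0, 1)]]   # bases empty, one out
```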



Methods of Analysis

Experimental Simulations with Minitab

  • Students write a Minitab macro that uses a random number generator to simulate the step-by-step evolution of the Markov Chain

  • Large-scale simulations are used to estimate Expected Run Values and perform situational strategy analyses
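
The course implements this as a Minitab macro; as a rough Python analogue (a sketch reusing the next_state function and PROBS dictionary from the earlier transition-matrix sketch, so it inherits the same simplified base-running assumptions), a Monte Carlo half-inning simulator might look like:

```python
import random

def simulate_inning(next_state, probs, start=(0, 0, 0, 0)):
    """Simulate one half-inning of the chain and return the runs scored.
    next_state and probs are as in the earlier sketch; start defaults to
    bases empty with no outs."""
    events, weights = list(probs), list(probs.values())
    state, runs = start, 0
    while state is not None:                 # None marks the 3-out state
        event = random.choices(events, weights=weights)[0]
        state, scored = next_state(state, event)
        runs += scored
    return runs

def estimate_run_value(next_state, probs, start=(0, 0, 0, 0), n=100_000):
    """Monte Carlo estimate of the expected runs scored from a given state."""
    return sum(simulate_inning(next_state, probs, start) for _ in range(n)) / n
```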



Two Simulated Innings

First Inning: 1. Single   2. Out   3. Double   4. Single   5. Out   6. Single   7. Out

Second Inning: 8. Single   9. Home run   10. Out   11. Out   12. Single   13. Out


Sacrificing with the game on the line

In the ninth inning, your team needs one run to win or tie. Suppose the first batter reaches first. Should you bunt?

Swinging away (runner on 1st, no outs):
Mean number of runs scored: 0.909
Probability of scoring at least one run: 0.390

After a successful sacrifice bunt (runner on 2nd, one out):
Mean number of runs scored: 0.665
Probability of scoring at least one run: 0.406
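
The deciding quantity here is not the expected number of runs but the probability of scoring at least one run, which the same kind of simulation can estimate (a sketch reusing simulate_inning, next_state, and PROBS from the earlier sketches; the two starting states below correspond to the swing-away and post-bunt situations):

```python
def prob_at_least_one_run(next_state, probs, start, n=100_000):
    """Monte Carlo estimate of P(at least one run scores) from `start`,
    using simulate_inning from the earlier sketch."""
    return sum(simulate_inning(next_state, probs, start) >= 1
               for _ in range(n)) / n

# prob_at_least_one_run(next_state, PROBS, (1, 0, 0, 0))  # runner on 1st, no outs
# prob_at_least_one_run(next_state, PROBS, (0, 1, 0, 1))  # runner on 2nd, one out
```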



Reference

  • Sokol, J. S. (2004). “An Intuitive Markov Chain Lesson from Baseball,” INFORMS Transactions on Education, 5, pp. 47–55.



Please contact me for:

  • Sample Maple Worksheet

  • Sample Minitab Macro

  • Project Assignment Handout

kuennene@uwosh.edu

