MCMC for Stochastic Epidemic Models

  1. MCMC for Stochastic Epidemic Models Philip D. O’Neill School of Mathematical Sciences University of Nottingham

  2. This includes joint work with… • Tom Britton (Stockholm) • Niels Becker (ANU, Canberra) • Gareth Roberts (Lancaster) • Peter Marks (NHS, Derbyshire PCT)

  3. Contents • 1. MCMC: overview and basics • 2. Example: Vaccine efficacy • 3. Data augmentation • 4. Example: SIR epidemic model • 5. Model choice • 6. Example: Norovirus outbreak • 7. Other topics

  4. Markov chain Monte Carlo (MCMC) Overview and basics • The key problem is to explore a density function π known up to proportionality. • The output of an MCMC algorithm is a sequence of samples from the correctly normalised π. • These samples can be used to estimate summaries of π, e.g. its mean, variance.

  5. How MCMC works • Key idea is to construct a discrete time Markov chain X1, X2, X3, … on state space S whose stationary distribution is π. • If P(dx, dy) is the transition kernel of the chain, this means that ∫_S π(dx) P(dx, dy) = π(dy).

  6. How MCMC works (2) • Subject to some technical conditions, the distribution of Xn → π as n → ∞. • Thus to obtain samples from π we simulate the chain and sample from it after a “long time”.

  7. Example: π(x) ∝ x e^(-2x)

  8. Example: π(x) ∝ x e^(-2x). Chain values X1, X2, …; XN = 1.2662

  9. Example: π(x) ∝ x e^(-2x). XN = 1.2662, XN+1 = 1.7840

  10. Example: π(x) ∝ x e^(-2x). XN = 1.2662, XN+1 = 1.7840, XN+2 = 0.6629

  11. Example: π(x) ∝ x e^(-2x) • Suppose the Markov chain output is …, XN = 1.2662, XN+1 = 1.7840, XN+2 = 0.6629, …, XN+M = 1.0312 (i.e. discard the initial N values as burn-in)

  12. How to build the Markov chain • Surprisingly, there are many ways to construct a Markov chain with stationary distribution π. • Perhaps the simplest is the Metropolis-Hastings algorithm.

  13. Metropolis-Hastings algorithm • Set an initial value X1. • If the chain is currently at Xn = x, randomly propose a new position Xn+1 = y according to a proposal density q(y | x). • Accept the proposed jump with probability α(x, y) = min{ 1, π(y) q(x | y) / [π(x) q(y | x)] }. • If not accepted, Xn+1 = x.
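As a concrete illustration (not from the slides), here is a minimal random-walk Metropolis-Hastings sketch in Python for the earlier example target π(x) ∝ x e^(-2x); the proposal scale, chain length, starting value and seed are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    # log of the unnormalised target pi(x) proportional to x * exp(-2x), for x > 0
    return np.log(x) - 2.0 * x if x > 0 else -np.inf

def metropolis_hastings(n_iter=20000, x0=1.0, scale=0.5):
    chain = np.empty(n_iter)
    x = x0
    for n in range(n_iter):
        y = x + rng.normal(0.0, scale)                     # symmetric Gaussian proposal
        if np.log(rng.uniform()) < log_target(y) - log_target(x):
            x = y                                          # accept with prob min(1, pi(y)/pi(x))
        chain[n] = x                                       # otherwise the chain stays at x
    return chain

chain = metropolis_hastings()
print(chain[5000:].mean())   # after discarding burn-in; this target is Gamma(2, 2), so the true mean is 1
```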

  14. Why the M-H algorithm works • Let P(dx,dy) denote the transition kernel of the chain. • Then P(dx,dy) is approximately the probability that the chain jumps from a region dx to a region dy. • We can calculate P(dx,dy) as follows:

  15. Why M-H works (2)

  16. Why M-H works (3) • The resulting kernel satisfies the detailed balance equation π(dx) P(dx, dy) = π(dy) P(dy, dx); integrating this over x shows that π is a stationary distribution for the Markov chain.
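In outline, the standard calculation behind slides 14-16 runs as follows (a sketch written with densities, for a proposed move from x to y with x ≠ y):

```latex
\pi(x)\,P(x,y)
  = \pi(x)\, q(y \mid x)\, \min\!\left\{ 1,\;
      \frac{\pi(y)\, q(x \mid y)}{\pi(x)\, q(y \mid x)} \right\}
  = \min\bigl\{ \pi(x)\, q(y \mid x),\; \pi(y)\, q(x \mid y) \bigr\}
```

The right-hand side is symmetric in x and y, so π(x)P(x,y) = π(y)P(y,x), and integrating both sides over x gives ∫ π(x)P(x,y) dx = π(y).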

  17. Comments on M-H algorithm (1) • The choice of proposal q(y | x) is fairly arbitrary. • Popular choices include: q(y | x) = q(y) (independence sampler); y ~ N(x, σ²) given x (Gaussian proposal); q(y | x) = q(|y - x|) (symmetric proposal).

  18. Comments on M-H (2) • In practice, MCMC is almost always used for multi-dimensional problems. Given a target density π(x1, x2, … ,xn) it is possible to update each component separately, or even in blocks, using different M-H schemes.

  19. Comments on M-H (3) • A popular multi-dimensional scheme is the Gibbs sampler, in which the proposal for a component xi is its full conditional density π(xi | x1, …, xi-1, xi+1, …, xn). • The M-H acceptance probability is equal to one in this case.
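A toy illustration of a Gibbs sampler (not from the slides): for a bivariate normal target with unit variances and correlation ρ, each full conditional is xi | xj ~ N(ρ xj, 1 - ρ²), so the sampler simply alternates draws from these conditionals. The value ρ = 0.8, the chain length and the seed are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
rho, n_iter = 0.8, 20000
x1, x2 = 0.0, 0.0
samples = np.empty((n_iter, 2))
for n in range(n_iter):
    x1 = rng.normal(rho * x2, np.sqrt(1 - rho**2))  # draw from pi(x1 | x2)
    x2 = rng.normal(rho * x1, np.sqrt(1 - rho**2))  # draw from pi(x2 | x1)
    samples[n] = (x1, x2)

# marginal means should be near 0 and the sample correlation near rho
print(samples.mean(axis=0), np.corrcoef(samples.T)[0, 1])
```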

  20. General comments on MCMC • How to check convergence? There is no guaranteed way. Visual inspection of trace plots; diagnostic tools (e.g. looking at autocorrelation). • Starting values – try a range • Acceptance rates – not too large/small • Mixing – how fast does the chain move around?
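As one example of such a diagnostic, the sample autocorrelation of a one-dimensional chain can be computed as below; this is only a basic sketch, and in practice packaged diagnostics are often used instead.

```python
import numpy as np

def autocorrelation(chain, max_lag=50):
    """Sample autocorrelation function of a 1-d chain; slow decay suggests poor mixing."""
    x = np.asarray(chain, dtype=float)
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    return acf[:max_lag + 1] / acf[0]

# e.g. autocorrelation(chain)[:5] for the chain from the earlier M-H sketch
```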

  21. Contents • 1. MCMC: overview and basics • 2. Example: Vaccine efficacy • 3. Data augmentation • 4. Example: SIR epidemic model • 5. Model choice • 6. Example: Norovirus outbreak • 7. Other topics

  22. 2. Example: Vaccine Efficacy • Outbreak of Variola Minor, Brazil 1956 • Data on cases in households (size 1 to 12) • 338 households: 126 had no cases • 1542 individuals: 809 vaccinated (85 cases) and 733 unvaccinated (425 cases) • Objective: estimate vaccine efficacy

  23. Disease transmission model • Population divided into separate households. • Transmission is divided into community-acquired and within-household. • q = P(individual avoids outside infection) • φ = P(one individual fails to infect another in the same household)

  24. Disease transmission model (diagram of community and within-household transmission, with escape probabilities q and φ)

  25. Vaccine response model • For a vaccinated individual, three responses can occur: complete protection; vaccine failure; or partial protection and infectivity reduction. • c = P(complete protection) • f = P(vaccine failure) • a = proportionate susceptibility reduction • b = proportionate infectivity reduction

  26. Vaccine response: (A,B) • A convenient way of summarising the random response is to suppose that an individual’s susceptibility and infectivity reduction is given by a bivariate random variable (A,B). • Thus P[(A,B) = (0, -)] = c, P[(A,B) = (1,1)] = f, P[(A,B) = (a,b)] = 1-c-f.

  27. Efficacy Measures • Furthermore it is sensible to define measures of vaccine efficacy using (A,B). • VES = 1- E[A] = 1 - f - a(1-f-c) is a protective measure • VEI = 1 - E[AB] / E[A] = 1 - [f + ab(1-f-c)] / [f + a(1-f-c)] is a measure of infectivity reduction • Note both are functions of basic model parameters
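Since both measures are simple functions of (a, b, c, f), they can be computed directly from each MCMC sample of the model parameters. A small helper (the function name is just for illustration):

```python
def vaccine_efficacies(a, b, c, f):
    """Return (VES, VEI) for the (A, B) vaccine response model."""
    EA = f + a * (1 - f - c)          # E[A]
    EAB = f + a * b * (1 - f - c)     # E[AB]
    return 1 - EA, 1 - EAB / EA
```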

  28. Bayesian inference • The object of inference is the posterior density π(θ | n) = π(a,b,c,f,q,φ | n), where n is the data set. • By Bayes’ Theorem, π(θ | n) ∝ π(n | θ) π(θ), where π(n | θ) is the likelihood and π(θ) is the prior density for θ.

  29. MCMC details • There are six parameters: a, b, c, f, q, φ • Each parameter has range [0,1] • Update each parameter separately using a Metropolis-Hastings step with a Gaussian proposal centred on the current value

  30. MCMC pseudocode • Initialise parameters (e.g. a = 0.5, b = 0.5, …) • User input: burn-in (B), sample size (S), thinning gap (T) • LOOP: counter from -B to (S x T): update a, update b, …, update φ; IF (counter > 0) AND (counter/T is an integer) THEN store current values • END LOOP
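A runnable Python skeleton following this pseudocode is sketched below. The household-model likelihood is not reproduced on the slides, so log_posterior is a placeholder to be replaced by the actual log likelihood plus log prior; the parameter names (with "phi" standing for φ), the proposal scale and the immediate rejection of proposals outside [0, 1] are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
PARAMS = ["a", "b", "c", "f", "q", "phi"]

def log_posterior(theta):
    # Placeholder: return the log likelihood of the household data plus the log prior.
    # With Uniform(0, 1) priors the prior contribution is constant.
    return 0.0

def update(theta, name, scale=0.1):
    prop = dict(theta)
    prop[name] = theta[name] + rng.normal(0.0, scale)      # Gaussian proposal centred on current value
    if not 0.0 < prop[name] < 1.0:
        return theta                                       # outside [0, 1]: reject immediately
    log_alpha = log_posterior(prop) - log_posterior(theta)
    return prop if np.log(rng.uniform()) < log_alpha else theta

def run_chain(B=1000, S=5000, T=10):
    theta = {name: 0.5 for name in PARAMS}                 # initial values
    stored = []
    for counter in range(-B, S * T + 1):
        for name in PARAMS:                                # update a, b, ..., phi in turn
            theta = update(theta, name)
        if counter > 0 and counter % T == 0:               # keep every T-th post-burn-in state
            stored.append(dict(theta))
    return stored
```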

  31. Updating details for a • Propose ã ~ N(a, σ²) • Accept with probability min{ 1, π(ã,b,c,f,q,φ | n) / π(a,b,c,f,q,φ | n) } • Note that the (symmetric) proposal cancels out of this ratio • The other parameters are updated similarly

  32. Trace plot for a

  33. Density estimate for a

  34. Scatterplot of a versus c

  35. Results for VES • Posterior mean: VES = 1 - E(A) = 0.84 • Posterior standard deviation: 0.03 • These results are easily obtained using the raw Markov chain output for the model parameters.
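For instance, assuming the run_chain and vaccine_efficacies sketches from earlier (with the placeholder log_posterior replaced by the real household-model posterior), the posterior summaries for VES follow directly from the stored samples:

```python
import numpy as np

stored = run_chain()   # samples of (a, b, c, f, q, phi)
ves = np.array([vaccine_efficacies(s["a"], s["b"], s["c"], s["f"])[0] for s in stored])
print(ves.mean(), ves.std())
```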

  36. Contents • 1. MCMC: overview and basics • 2. Example: Vaccine efficacy • 3. Data augmentation • 4. Example: SIR epidemic model • 5. Model choice • 6. Example: Norovirus outbreak • 7. Other topics

  37. 3. Data augmentation • Suppose we have a model with unknown parameter vector θ = (θ1, θ2, …, θn). • The available data are y = (y1, y2, …, ym). • If the likelihood π(y | θ) is intractable… • …one solution is to introduce extra parameters (“missing data”) x = (x1, x2, …, xp) such that π(y, x | θ) is tractable.

  38. Data augmentation (2) • The extra parameters x = (x1, x2, …, xp) are simply treated as unknown model parameters, as before. • To obtain samples from π(θ | y), take samples from the joint posterior π(θ, x | y) and simply ignore x. • Such a scheme is often easy to implement using MCMC.

  39. Data augmentation (3) • Can also add parameters to improve the mixing of the Markov chain (auxiliary variables). • Choosing how to augment data is not always obvious!

  40. Contents • 1. MCMC: overview and basics • 2. Example: Vaccine efficacy • 3. Data augmentation • 4. Example: SIR epidemic model • 5. Model choice • 6. Example: Norovirus outbreak • 7. Other topics

  41. 4. SIR Epidemic Model • Suppose we observe daily numbers of cases during an epidemic outbreak in some fixed population. • Objective is to say something about infection rates and infectious period duration of the disease.

  42. Epidemic curve (SARS in Canada)

  43. Model definition • Population of N individuals • At time t there are: St susceptibles It infectives Rt recovered/immune individuals Thus St + It + Rt = N for all t. Initially (S0, I0 ,R0 ) = (N-1,1,0).

  44. Model definition (2) • Each infectious individual remains so for a length of time TI ~ Exponential(γ). • During this time, infectious contacts occur with each susceptible according to a Poisson process of rate β/N. • Thus the overall infection rate is β St It / N. • Two model parameters, β and γ.
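For concreteness, a minimal Gillespie-style simulation of this model is sketched below (β, γ and N are arbitrary illustrative values); it returns the removal times that play the role of the observed data in what follows.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_sir(N=200, beta=1.5, gamma=1.0):
    """Simulate the Markovian SIR model with (S0, I0, R0) = (N-1, 1, 0)."""
    S, I, t = N - 1, 1, 0.0
    removal_times = []
    while I > 0:
        rate_inf = beta * S * I / N            # total infection rate
        rate_rem = gamma * I                   # total removal rate
        t += rng.exponential(1.0 / (rate_inf + rate_rem))   # time to next event
        if rng.uniform() < rate_inf / (rate_inf + rate_rem):
            S, I = S - 1, I + 1                # infection event
        else:
            I -= 1                             # removal event
            removal_times.append(t)
    return removal_times

r = simulate_sir()
print(len(r), "removals")
```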

  45. Data, likelihood, augmentation • Suppose we observe removals at times 0 ≤ r1 ≤ r2 ≤ … ≤ rn ≤ T. • Define r = (r1, r2, …, rn). • The likelihood of the data, π(r | β, γ), is practically intractable. • However, given the (unknown) infection times i = (i1, i2, …, in), the augmented likelihood π(i, r | β, γ) is tractable.

  46. MCMC algorithm • Specifically, the augmented likelihood π(i, r | β, γ) factorises so that β and γ each enter through a Gamma-type kernel. • It follows that if the prior π(β) is a Gamma distribution, then the full conditional π(β | …) is a Gamma distribution also. • The same is true for γ. • So β and γ can each be updated using a Gibbs step.
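A hedged sketch of these Gibbs draws, assuming independent Gamma(ν, λ) priors for β and γ and an epidemic observed to completion (the ν and λ values below are arbitrary). Here total_pressure stands for (1/N) ∫ St It dt and total_infectious for Σj (rj - ij), both computed from the current infection times; the shape n - 1 for β reflects that the initial infection is not attributed to β.

```python
import numpy as np

rng = np.random.default_rng(4)

def draw_rates(n, total_pressure, total_infectious,
               nu_beta=1.0, lam_beta=1e-3, nu_gamma=1.0, lam_gamma=1e-3):
    """One Gibbs update of (beta, gamma) given the current infection times."""
    beta = rng.gamma(nu_beta + n - 1, 1.0 / (lam_beta + total_pressure))      # (shape, scale = 1/rate)
    gamma = rng.gamma(nu_gamma + n, 1.0 / (lam_gamma + total_infectious))
    return beta, gamma
```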

  47. MCMC algorithm – infection times • It remains to update the infection times i = (i1, i2, …, in). • There are various ways of doing this. • A simple way is to use an M-H scheme to randomly move the times. • For example, propose a new ik by picking a new time uniformly at random in (0, T).

  48. Updating infection times • Updating i2: the acceptance probability is π(i*, r | β, γ) / π(i, r | β, γ), where i* is i with i2 replaced by the proposed time i2*. (Diagram: timeline of infection and removal times with i2 moved to its proposed position i2*.)
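A sketch of this move in Python; log_aug_lik is a placeholder for log π(i, r | β, γ) and T is the end of the observation period, both assumed available from the rest of the sampler. Because the uniform proposal on (0, T) is symmetric, the acceptance probability reduces to the likelihood ratio shown above.

```python
import numpy as np

rng = np.random.default_rng(5)

def update_infection_time(i_times, k, r_times, beta, gamma, T, log_aug_lik):
    """M-H move on the k-th infection time, proposing a new time uniformly on (0, T)."""
    proposal = list(i_times)
    proposal[k] = rng.uniform(0.0, T)
    log_alpha = (log_aug_lik(proposal, r_times, beta, gamma)
                 - log_aug_lik(i_times, r_times, beta, gamma))
    return proposal if np.log(rng.uniform()) < log_alpha else list(i_times)
```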

  49. Extensions • Epidemic not known to be finished by time T • Non-exponential infectious periods • Multi-group models (e.g. age-stratified data) • More sophisticated updates of infection times • Inclusion of latent periods

  50. Contents • 1. MCMC: overview and basics • 2. Example: Vaccine efficacy • 3. Data augmentation • 4. Example: SIR epidemic model • 5. Model choice • 6. Example: Norovirus outbreak • 7. Other topics
