
Sequential Bayesian Prediction in the Presence of Changepoints

Presentation Transcript


  1. Sequential Bayesian Prediction in the Presence of Changepoints. Roman Garnett, Michael Osborne, Stephen Roberts; Pattern Analysis Research Group, Department of Engineering, University of Oxford.

  2. We employ Gaussian processes to perform Bayesian prediction when our data include changepoints. The data may also be missing, correlated, or delayed.

  3. The Gaussian distribution allows us to produce distributions for variables conditioned on any other observed variables.

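As a concrete illustration of this conditioning property (not from the talk itself), the sketch below conditions a small multivariate Gaussian on one observed variable; the particular mean, covariance, and observed value are made-up examples.

```python
import numpy as np

# Minimal sketch of Gaussian conditioning: given a joint Gaussian over several
# variables, compute the distribution of the unobserved variables conditioned
# on the observed ones. The numbers below are illustrative only.

def gaussian_condition(mu, K, obs_idx, obs_val):
    """Mean and covariance of the unobserved block, given the observations."""
    free_idx = np.setdiff1d(np.arange(len(mu)), obs_idx)
    K_ff = K[np.ix_(free_idx, free_idx)]
    K_fo = K[np.ix_(free_idx, obs_idx)]
    K_oo = K[np.ix_(obs_idx, obs_idx)]
    cond_mean = mu[free_idx] + K_fo @ np.linalg.solve(K_oo, obs_val - mu[obs_idx])
    cond_cov = K_ff - K_fo @ np.linalg.solve(K_oo, K_fo.T)
    return cond_mean, cond_cov

# Example: three jointly Gaussian variables; condition on the third being 1.2.
mu = np.zeros(3)
K = np.array([[1.0, 0.8, 0.5],
              [0.8, 1.0, 0.8],
              [0.5, 0.8, 1.0]])
mean, cov = gaussian_condition(mu, K, obs_idx=np.array([2]), obs_val=np.array([1.2]))
print(mean, cov)
```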

  5. A Gaussian process is the generalisation of a multivariate Gaussian distribution to a potentially infinite number of variables.

  6. A Gaussian process represents a powerful way to perform Bayesian inference about functions. We want to consider functions of time x(t).

  7. A Gaussian process produces a mean estimate.

  8. A Gaussian process produces a mean estimate along with an indication of the uncertainty in it.
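
To make slides 7 and 8 concrete, here is a minimal GP regression sketch (with an assumed squared-exponential covariance) that produces a posterior mean estimate together with its variance; the hyperparameter and noise values are illustrative, not the talk's.

```python
import numpy as np

# Minimal GP regression over functions of time x(t): posterior mean and
# variance at test times, under a squared-exponential covariance.
# Length scale, amplitude, and noise level are illustrative assumptions.

def sq_exp(t1, t2, length_scale=1.0, amplitude=1.0):
    d = t1[:, None] - t2[None, :]
    return amplitude**2 * np.exp(-0.5 * (d / length_scale)**2)

def gp_predict(t_obs, x_obs, t_star, noise=0.1):
    K = sq_exp(t_obs, t_obs) + noise**2 * np.eye(len(t_obs))
    K_star = sq_exp(t_star, t_obs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, x_obs))
    mean = K_star @ alpha                                    # mean estimate
    v = np.linalg.solve(L, K_star.T)
    var = np.diag(sq_exp(t_star, t_star)) - np.sum(v**2, axis=0)
    return mean, var                                         # and its uncertainty

t_obs = np.linspace(0, 10, 25)
x_obs = np.sin(t_obs) + 0.1 * np.random.randn(25)
t_star = np.linspace(0, 12, 100)
mean, var = gp_predict(t_obs, x_obs, t_star)
```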

  9. Gaussian processes can be used for tracking.

  16. Covariance functions inform the Gaussian process of how we expect the function to vary with its input, e.g. time. We commonly possess prior expectations that the function should be smooth.

  17. Covariance functions are very flexible. They allow us to express knowledge of periodicity, correlated sensors, and long-term drifts.
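
The sketch below illustrates, under assumed functional forms and made-up hyperparameter values, how such beliefs can be encoded: a periodic covariance, a long-length-scale covariance for slow drifts, and their combination by summation.

```python
import numpy as np

# Illustrative covariance functions over time; the forms and hyperparameter
# values here are assumptions for demonstration, not the talk's settings.

def periodic_cov(t1, t2, period=24.0, roughness=1.0, amplitude=1.0):
    """Expresses exact periodicity with the given period."""
    d = np.abs(t1[:, None] - t2[None, :])
    return amplitude**2 * np.exp(-2.0 * np.sin(np.pi * d / period)**2 / roughness**2)

def drift_cov(t1, t2, length_scale=100.0, amplitude=1.0):
    """A very long length scale expresses a slow long-term drift."""
    d = t1[:, None] - t2[None, :]
    return amplitude**2 * np.exp(-0.5 * (d / length_scale)**2)

def combined_cov(t1, t2):
    # Sums and products of covariances are again valid covariances, so prior
    # beliefs can be combined, e.g. a daily cycle superimposed on a slow drift.
    return periodic_cov(t1, t2) + drift_cov(t1, t2)
```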

  18. We have covariance functions for changepoints.

  20. Associated with our covariance function are a number of hyperparameters φ, such as periods, correlations, and amplitudes.

  21. Similarly, we require hyperparameters for changepoint covariance functions: the pre-changepoint input scale, the changepoint location, and the post-changepoint input scale.
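
As a concrete (and deliberately simplified) example of such a covariance, the sketch below implements a "drastic change" form: smooth behaviour with one input scale before the changepoint, an independent smooth function with another input scale after it, and zero correlation across the changepoint. The paper defines several changepoint covariances; this one is only for illustration.

```python
import numpy as np

# Simplified changepoint covariance with three hyperparameters:
#   t_c        - changepoint location
#   scale_pre  - input scale before the changepoint
#   scale_post - input scale after the changepoint
# Values on opposite sides of t_c are modelled as uncorrelated.

def sq_exp(d, input_scale):
    return np.exp(-0.5 * (d / input_scale)**2)

def changepoint_cov(t1, t2, t_c=5.0, scale_pre=1.0, scale_post=0.3):
    t1 = np.asarray(t1, dtype=float)[:, None]
    t2 = np.asarray(t2, dtype=float)[None, :]
    d = t1 - t2
    both_pre = (t1 < t_c) & (t2 < t_c)
    both_post = (t1 >= t_c) & (t2 >= t_c)
    K = np.zeros_like(d)
    K[both_pre] = sq_exp(d[both_pre], scale_pre)
    K[both_post] = sq_exp(d[both_post], scale_post)
    return K
```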

  22. We assign a prior distribution to each hyperparameter and then marginalise them:

$$ p(x_\star \mid x_D, I) = \frac{\int p(x_\star \mid x_D, \phi, I)\, p(x_D \mid \phi, I)\, p(\phi \mid I)\, \mathrm{d}\phi}{\int p(x_D \mid \phi, I)\, p(\phi \mid I)\, \mathrm{d}\phi} $$

where x_⋆ denotes the predictants, x_D the predictors (the data), and I the context. Unfortunately, as this integral is nonanalytic, we are forced to use numerical integration.
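
A simple numerical version of this marginalisation (plain grid quadrature with likelihood-times-prior weights, before the Bayesian Monte Carlo refinement of the next slides) might look like the following sketch; the covariance, grid, and flat prior are illustrative assumptions.

```python
import numpy as np

# Marginalise a single covariance hyperparameter (an input scale, standing in
# for phi) numerically: weight each hyperparameter sample by its marginal
# likelihood (flat prior assumed) and mix the per-sample predictions.

def sq_exp(t1, t2, scale):
    return np.exp(-0.5 * ((t1[:, None] - t2[None, :]) / scale)**2)

def log_marginal_likelihood(t, x, scale, noise=0.1):
    K = sq_exp(t, t, scale) + noise**2 * np.eye(len(t))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, x))
    return -0.5 * x @ alpha - np.log(np.diag(L)).sum() - 0.5 * len(t) * np.log(2 * np.pi)

def predictive_mean(t, x, t_star, scale, noise=0.1):
    K = sq_exp(t, t, scale) + noise**2 * np.eye(len(t))
    return sq_exp(t_star, t, scale) @ np.linalg.solve(K, x)

def marginalised_mean(t, x, t_star, scales):
    log_w = np.array([log_marginal_likelihood(t, x, s) for s in scales])
    w = np.exp(log_w - log_w.max())
    w /= w.sum()                        # normalised weights over the sample set
    means = np.array([predictive_mean(t, x, t_star, s) for s in scales])
    return w @ means                    # weighted mixture of predictions

t = np.linspace(0, 10, 30)
x = np.sin(t) + 0.1 * np.random.randn(30)
t_star = np.linspace(0, 12, 50)
x_star = marginalised_mean(t, x, t_star, scales=np.linspace(0.2, 3.0, 20))
```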

  23. We evaluate our predictions for each of a sample set of hyperparameters φ_S, e.g. φ = the period of x(t).

  24. Bayesian Monte Carlo assigns another Gaussian process to the integrand as a function of the covariance hyperparameters φ.
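
A minimal sketch of that idea for a one-dimensional hyperparameter is given below: with a squared-exponential covariance on the integrand and a Gaussian prior over φ, the quadrature weights have a closed form. The kernel width, prior, and sample locations are assumptions made for the example.

```python
import numpy as np

# Bayesian Monte Carlo / Bayesian quadrature sketch: model the integrand f(phi)
# with a GP (squared-exponential covariance) and integrate that GP analytically
# against a Gaussian prior N(prior_mean, prior_var) over phi. The estimate of
# the integral is then a weighted sum of the integrand samples.

def bmc_weights(phi_samples, kernel_width, prior_mean, prior_var, jitter=1e-8):
    d = phi_samples[:, None] - phi_samples[None, :]
    K = np.exp(-0.5 * (d / kernel_width)**2) + jitter * np.eye(len(phi_samples))
    s2 = kernel_width**2 + prior_var
    # z[i] = integral of k(phi, phi_i) against the Gaussian prior (closed form).
    z = np.sqrt(kernel_width**2 / s2) * np.exp(-0.5 * (phi_samples - prior_mean)**2 / s2)
    return np.linalg.solve(K, z)        # integral estimate = weights @ f(phi_samples)

# Example: estimate E[phi**2] under phi ~ N(0, 1); the true value is 1.
phi_s = np.linspace(-3.0, 3.0, 15)
w = bmc_weights(phi_s, kernel_width=0.8, prior_mean=0.0, prior_var=1.0)
print(w @ phi_s**2)                     # close to 1
```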

  25. We propagate through time one Gaussian process for each of our sample set φ_S, adjusting the weights according to the data as we go.
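
A sketch of that weight propagation (using a plain squared-exponential GP as a stand-in for the iteratively updated GPs of the talk, and made-up candidate hyperparameters) is shown below: each sample's log-weight is incremented by the log predictive density it assigned to the newest observation.

```python
import numpy as np
from scipy.stats import norm

# Sequentially re-weight a set of hyperparameter samples (candidate input
# scales) as observations arrive. The weights track how well each sample has
# explained the data so far; the GP predictive below is a simple illustrative
# stand-in, recomputed from scratch rather than updated iteratively.

def sq_exp(t1, t2, scale):
    return np.exp(-0.5 * ((t1[:, None] - t2[None, :]) / scale)**2)

def one_step_logpdf(t_obs, x_obs, t_new, x_new, scale, noise=0.1):
    """Log density of the newest point under the GP conditioned on past data."""
    if len(t_obs) == 0:
        return norm.logpdf(x_new, 0.0, np.sqrt(1.0 + noise**2))
    K = sq_exp(t_obs, t_obs, scale) + noise**2 * np.eye(len(t_obs))
    k = sq_exp(np.array([t_new]), t_obs, scale)[0]
    mean = k @ np.linalg.solve(K, x_obs)
    var = 1.0 + noise**2 - k @ np.linalg.solve(K, k)
    return norm.logpdf(x_new, mean, np.sqrt(var))

phi_samples = np.array([0.3, 1.0, 3.0])        # candidate input scales (phi_S)
log_w = np.zeros_like(phi_samples)             # start from a flat prior

t_hist = np.array([])
x_hist = np.array([])
for t_new in np.linspace(0.0, 5.0, 20):
    x_new = np.sin(t_new) + 0.1 * np.random.randn()
    log_w += [one_step_logpdf(t_hist, x_hist, t_new, x_new, s) for s in phi_samples]
    t_hist = np.append(t_hist, t_new)
    x_hist = np.append(x_hist, x_new)

w = np.exp(log_w - log_w.max())
w /= w.sum()                                   # posterior weights over phi_S
```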

  26. We use similar techniques to determine posterior distributions for any hyperparameters of interest.

  27. We use an iterative formulation of our Gaussian processes that allows us to efficiently update our predictions. Similarly, we discard old data when the algorithm judges it sufficiently uninformative.
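
One common way to realise such efficient updates (a sketch, not necessarily the paper's exact formulation) is to extend the Cholesky factor of the covariance matrix when a new observation arrives, which costs O(n^2) instead of the O(n^3) of refactorising; discarding old data then corresponds to removing leading rows and columns.

```python
import numpy as np
from scipy.linalg import solve_triangular

# Extend a Cholesky factor by one row/column when a new observation arrives.
# Given L with L @ L.T = K, return the factor of [[K, k], [k.T, kappa]].

def cholesky_append(L, k_new, kappa_new):
    l = solve_triangular(L, k_new, lower=True)
    n = L.shape[0]
    L_new = np.zeros((n + 1, n + 1))
    L_new[:n, :n] = L
    L_new[n, :n] = l
    L_new[n, n] = np.sqrt(kappa_new - l @ l)
    return L_new

# Example with a squared-exponential covariance over a growing set of times.
def sq_exp(t1, t2, scale=1.0):
    return np.exp(-0.5 * ((t1[:, None] - t2[None, :]) / scale)**2)

t = np.array([0.0, 0.5, 1.0])
L = np.linalg.cholesky(sq_exp(t, t) + 1e-6 * np.eye(3))
t_new = 1.5
k = sq_exp(np.array([t_new]), t)[0]
L = cholesky_append(L, k, 1.0 + 1e-6)
t = np.append(t, t_new)
print(np.allclose(L @ L.T, sq_exp(t, t) + 1e-6 * np.eye(4)))   # True
```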

  28. A GP with a changepoint covariance is able to effectively track data with changepoints.

  29. We can also determine the posterior distribution for the location of changepoints.
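
In the spirit of the sketches above, one simple way to obtain such a posterior is to treat the changepoint location as a hyperparameter: evaluate the marginal likelihood of the data under a changepoint covariance for a grid of candidate locations and normalise. Everything below (covariance form, grid, synthetic data) is illustrative.

```python
import numpy as np

# Posterior over the changepoint location: grid the candidate locations,
# score each by the GP marginal likelihood under a simple changepoint
# covariance (zero correlation across the candidate location), and normalise.

def changepoint_cov(t1, t2, t_c, scale=1.0):
    t1 = np.asarray(t1, float)[:, None]
    t2 = np.asarray(t2, float)[None, :]
    same_side = (t1 < t_c) == (t2 < t_c)
    return np.where(same_side, np.exp(-0.5 * ((t1 - t2) / scale)**2), 0.0)

def log_marginal_likelihood(t, x, t_c, noise=0.1):
    K = changepoint_cov(t, t, t_c) + noise**2 * np.eye(len(t))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, x))
    return -0.5 * x @ alpha - np.log(np.diag(L)).sum() - 0.5 * len(t) * np.log(2 * np.pi)

# Synthetic data with an abrupt shift at t = 6.
t = np.linspace(0, 10, 60)
x = np.sin(t) + 3.0 * (t >= 6.0) + 0.1 * np.random.randn(60)

candidates = np.linspace(1.0, 9.0, 33)
log_p = np.array([log_marginal_likelihood(t, x, c) for c in candidates])
posterior = np.exp(log_p - log_p.max())
posterior /= posterior.sum()             # posterior over changepoint location
print(candidates[np.argmax(posterior)])  # peaks near the true changepoint at 6
```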

  30. Bramblemet is a network of wireless weather sensors deployed in the Solent.

  31. Bramblemet is used by port authorities and recreational sailors. We would like to use its local, recent observations to produce accurate predictions.

  32. The posterior distribution for the changepoint location clearly identifies the fault.

  33. We have introduced a sequential Gaussian process algorithm to perform prediction in the presence of changepoints.

  34. Our covariance is a product of a term over sensor label and a term over time:

$$ K\big([l, t], [l', t']\big) = K_{\mathrm{label}}(l, l')\, K_{\mathrm{time}}(t - d_l,\ t' - d_{l'}) $$

Here the d_l are the sensor delays, and K_time is typically of the Matérn class, with appropriate length scales. K_label is given by the spherical parameterisation: it is built from an upper-triangular matrix whose unit-norm columns make $K_{\mathrm{label}} = S^{\top} S$ a correlation matrix,

$$ S = \begin{bmatrix} 1 & \cos\theta_1 & \cos\theta_2 & \cos\theta_4 \\ 0 & \sin\theta_1 & \sin\theta_2\cos\theta_3 & \sin\theta_4\cos\theta_5 \\ 0 & 0 & \sin\theta_2\sin\theta_3 & \sin\theta_4\sin\theta_5\cos\theta_6 \\ 0 & 0 & 0 & \sin\theta_4\sin\theta_5\sin\theta_6 \end{bmatrix} $$
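
The sketch below constructs such a product covariance under stated assumptions: a label correlation matrix built from the spherical parameterisation (angles θ filling an upper-triangular S with unit-norm columns), multiplied by a Matérn-3/2 term over sensor-delayed times. The angle values, delays, length scale, and the exact angle ordering are illustrative, not the paper's settings.

```python
import numpy as np

# Product covariance over (sensor label, time): spherical-parameterisation
# correlation over labels times a Matern-3/2 covariance over delayed times.

def label_correlation(thetas, n_sensors):
    """Upper-triangular S with unit-norm columns; returns S.T @ S."""
    S = np.zeros((n_sensors, n_sensors))
    S[0, 0] = 1.0
    idx = 0
    for j in range(1, n_sensors):
        remainder = 1.0
        for i in range(j):
            S[i, j] = remainder * np.cos(thetas[idx])
            remainder *= np.sin(thetas[idx])
            idx += 1
        S[j, j] = remainder
    return S.T @ S

def matern32(d, length_scale):
    r = np.sqrt(3.0) * np.abs(d) / length_scale
    return (1.0 + r) * np.exp(-r)

def multi_sensor_cov(labels, times, thetas, delays, length_scale):
    K_label = label_correlation(thetas, len(delays))
    shifted = times - delays[labels]    # each reading delayed by its sensor's delay
    d = shifted[:, None] - shifted[None, :]
    return K_label[np.ix_(labels, labels)] * matern32(d, length_scale)

# Example: 4 sensors require 6 angles; four readings from sensors 0 and 2.
thetas = np.array([0.3, 0.5, 0.4, 0.6, 0.2, 0.7])
delays = np.array([0.0, 0.1, 0.25, 0.05])
labels = np.array([0, 2, 2, 0])
times = np.array([0.0, 0.0, 1.0, 1.5])
print(multi_sensor_cov(labels, times, thetas, delays, length_scale=1.0))
```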
