
Gaussian Processes for Regression CKI Williams and CE Rasmussen





  1. Gaussian Processes for Regression, CKI Williams and CE Rasmussen. Summarized by Joon Shik Kim, 12.05.10 (Fri), Computational Models of Intelligence

  2. Introduction • In the Bayesian approach to neural networks, a prior distribution over the weights induces a prior distribution over functions. This prior is combined with a noise model, which specifies the probability of observing the target t given the function value y, to yield a posterior over functions, which can then be used for predictions.
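The weight-space-to-function-space idea can be made concrete: drawing weight vectors from a Gaussian prior and evaluating the resulting networks yields random functions, i.e. a prior over functions. A minimal sketch with a one-hidden-layer tanh network (the architecture and prior variances here are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
xs = np.linspace(-3.0, 3.0, 100)  # 1-d inputs at which to evaluate the functions

def sample_network_function(hidden_units=50):
    # A Gaussian prior over all weights and biases induces a random function.
    w_in = rng.standard_normal(hidden_units)            # input-to-hidden weights
    b_in = rng.standard_normal(hidden_units)            # hidden-unit biases
    w_out = rng.standard_normal(hidden_units) / np.sqrt(hidden_units)
    hidden = np.tanh(np.outer(xs, w_in) + b_in)         # shape (100, hidden_units)
    return hidden @ w_out                               # function values at xs

# Each draw of the weights is one sample from the induced prior over functions.
prior_samples = np.stack([sample_network_function() for _ in range(5)])
```

Scaling the output weights by 1/sqrt(hidden_units) keeps the function variance bounded as the hidden layer grows, which is the regime in which such network priors converge to Gaussian processes.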

  3. Prediction with Gaussian Processes (1/3) • A stochastic process is a collection of random variables {Y(x) | x ∈ X} indexed by a set X. In our case X will be the input space with dimension d, the number of inputs. The stochastic process is specified by giving the probability distribution for every finite subset of variables Y(x(1)), …, Y(x(k)) in a consistent manner. A Gaussian process is a stochastic process which can be fully specified by its mean function μ(x) = E[Y(x)] and its covariance function C(x, x′) = E[(Y(x) − μ(x))(Y(x′) − μ(x′))]. We consider Gaussian processes which have μ(x) = 0.
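Since a zero-mean Gaussian process is fully determined by its covariance function, the prior can be visualized by drawing sample functions from the finite-dimensional marginal at a grid of inputs. A minimal sketch, assuming a squared-exponential covariance (the slide treats C(x, x′) generically; this particular kernel and its parameters are illustrative assumptions):

```python
import numpy as np

def se_cov(xa, xb, length_scale=1.0, signal_var=1.0):
    # Squared-exponential covariance C(x, x') -- an assumed, illustrative kernel.
    diff = xa[:, None] - xb[None, :]
    return signal_var * np.exp(-0.5 * (diff / length_scale) ** 2)

# Inputs at which to evaluate the prior (d = 1 here for simplicity).
xs = np.linspace(-5.0, 5.0, 100)

# Covariance matrix of the finite-dimensional marginal Y(x(1)), ..., Y(x(k)).
K = se_cov(xs, xs)

# Draw three sample functions from the zero-mean GP prior.
rng = np.random.default_rng(0)
samples = rng.multivariate_normal(
    mean=np.zeros(len(xs)),
    cov=K + 1e-9 * np.eye(len(xs)),  # tiny jitter for numerical stability
    size=3,
)
```

The "consistent manner" requirement from the slide shows up here as the fact that every such finite marginal is a multivariate Gaussian built from the same mean and covariance functions.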

  4. Prediction with Gaussian Processes (2/3) • The training data consist of n pairs of inputs and targets {(x(i), t(i)) | i = 1, …, n}. The input vector for a test case is denoted x (with no superscript). The inputs are d-dimensional, x1, …, xd, and the targets are scalar.

  5. Prediction with Gaussian Processes (3/3)
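The equations on this slide were not preserved in the transcript. In the paper, the predictive distribution at a test input x is Gaussian with mean ŷ(x) = k(x)ᵀK⁻¹t and variance σ²(x) = C(x, x) − k(x)ᵀK⁻¹k(x), where K is the n×n covariance matrix of the training inputs (with the noise variance added on the diagonal) and k(x) is the vector of covariances between x and the training inputs. A sketch of these formulas, again assuming a squared-exponential covariance and a hypothetical noise level:

```python
import numpy as np

def se_cov(xa, xb, length_scale=1.0, signal_var=1.0):
    # Squared-exponential covariance (illustrative assumption, as before).
    diff = xa[:, None] - xb[None, :]
    return signal_var * np.exp(-0.5 * (diff / length_scale) ** 2)

def gp_predict(x_train, t_train, x_test, noise_var=0.01):
    """Predictive mean and variance of a zero-mean GP regressor."""
    n = len(x_train)
    K = se_cov(x_train, x_train) + noise_var * np.eye(n)  # noisy-target covariance
    k_star = se_cov(x_train, x_test)                      # n x m cross-covariances
    alpha = np.linalg.solve(K, t_train)                   # K^{-1} t
    mean = k_star.T @ alpha                               # yhat(x) = k(x)^T K^{-1} t
    v = np.linalg.solve(K, k_star)                        # K^{-1} k(x), column-wise
    var = se_cov(x_test, x_test).diagonal() - np.einsum("ij,ij->j", k_star, v)
    return mean, var

# Toy 1-d example: noisy observations of sin(x).
rng = np.random.default_rng(1)
x_train = np.linspace(-3.0, 3.0, 20)
t_train = np.sin(x_train) + 0.1 * rng.standard_normal(20)
x_test = np.linspace(-3.0, 3.0, 50)
mean, var = gp_predict(x_train, t_train, x_test)
```

Solving the linear system with `np.linalg.solve` rather than forming K⁻¹ explicitly is the standard numerically stable way to evaluate both formulas.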

  6. Illustration of Prediction using GP

  7. Proof of Prediction Model (1/3)

  8. Proof of Prediction Model (2/3)

  9. Proof of Prediction Model (3/3)
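The equations on these three proof slides were lost in the transcript. A sketch of the standard argument (the usual Gaussian-conditioning derivation, reconstructed rather than transcribed from the slides): since the process is a GP, the training targets t and the test value y(x) are jointly Gaussian, and conditioning that joint Gaussian on t yields the predictive mean and variance quoted above.

```latex
% Joint distribution of the training targets t and the test value y(x):
\begin{pmatrix} \mathbf{t} \\ y(\mathbf{x}) \end{pmatrix}
  \sim \mathcal{N}\!\left( \mathbf{0},\;
  \begin{pmatrix} K & \mathbf{k}(\mathbf{x}) \\
                  \mathbf{k}(\mathbf{x})^{\top} & C(\mathbf{x},\mathbf{x})
  \end{pmatrix} \right)

% Conditioning the joint Gaussian on t gives the predictive distribution:
y(\mathbf{x}) \mid \mathbf{t}
  \sim \mathcal{N}\!\left( \mathbf{k}(\mathbf{x})^{\top} K^{-1}\mathbf{t},\;
  C(\mathbf{x},\mathbf{x}) - \mathbf{k}(\mathbf{x})^{\top} K^{-1}\mathbf{k}(\mathbf{x}) \right)
```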
