
Gaussian Process Latent Variable Model (GPLVM)


Presentation Transcript


  1. Gaussian Process Latent Variable Model (GPLVM): Principle, Variants and Applications. 19/07/2018

  2. Table of Contents 1. Gaussian Process (GP) 2. Latent Variable Model (LVM) 3. GPLVM 4. Gaussian Process Dynamical Model (GPDM) 5. Some Applications 6. Relationship with Our Work 7. References

  3. 1.GP • Essence: regression with the kernel method. • Works in function space rather than parameter space. • The covariance between labels is the similarity between the corresponding inputs: cov(y1, y2) = k(x1, x2)
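To make cov(y1, y2) = k(x1, x2) concrete, here is a minimal NumPy sketch (names are illustrative; an RBF kernel is assumed) that builds a kernel matrix from the inputs and draws one random function from the resulting GP prior:

    import numpy as np

    def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
        # k(x1, x2) = variance * exp(-||x1 - x2||^2 / (2 * lengthscale^2))
        sqdist = (np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :]
                  - 2.0 * X1 @ X2.T)
        return variance * np.exp(-0.5 * sqdist / lengthscale**2)

    # cov(y_i, y_j) = k(x_i, x_j): output covariance = input similarity.
    X = np.linspace(-3.0, 3.0, 50)[:, None]
    K = rbf_kernel(X, X)
    # One sample from the GP prior N(0, K); the jitter keeps K well-conditioned.
    f = np.random.multivariate_normal(np.zeros(len(X)), K + 1e-8 * np.eye(len(X)))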

  4. 1.GP

  5. 1.GP • The prior over function space: • Depends on the kernel. • The likelihood: • Given by the data. • The posterior: • Rules out the functions inconsistent with the observed data.

  6. 1.GP

  7. 1.GP

  8. 1.GP Using GP for prediction:
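The prediction equations on this slide did not survive into the transcript. For reference, the standard GP regression posterior at test inputs has mean K_*^T (K + sigma^2 I)^{-1} y and covariance K_** - K_*^T (K + sigma^2 I)^{-1} K_*; a sketch continuing the snippet above:

    def gp_predict(X_train, y_train, X_test, noise=0.1):
        # Posterior mean and covariance of GP regression with an RBF kernel.
        K = rbf_kernel(X_train, X_train) + noise**2 * np.eye(len(X_train))
        K_s = rbf_kernel(X_train, X_test)
        K_ss = rbf_kernel(X_test, X_test)
        alpha = np.linalg.solve(K, y_train)              # (K + s^2 I)^{-1} y
        mean = K_s.T @ alpha
        cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)
        return mean, cov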

  9. 1.GP • One thing to remember about GP: • Variables in one space follow a joint Gaussian distribution whose covariance matrix is determined by the similarity (kernel function) between the corresponding elements of another space.

  10. 1.GP Learning non-linear patterns:

  11. 1.GP

  12. 1.GP • Some extra notes: • Positive (semi-)definite kernels. • Reproducing Kernel Hilbert Space (RKHS). • Mercer's theorem. • Kernel parameter learning (type II MLE). • etc...
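On the last point: type II MLE chooses kernel parameters by maximizing the log marginal likelihood log p(y | X, theta) = -1/2 y^T K^{-1} y - 1/2 log|K| - (n/2) log 2pi. A minimal sketch continuing the snippets above (the log-parameterization is an illustrative choice):

    from scipy.optimize import minimize

    def neg_log_marginal_likelihood(log_params, X, y):
        # -log p(y | X, theta) for an RBF kernel plus observation noise.
        lengthscale, variance, noise_var = np.exp(log_params)  # stay positive
        K = rbf_kernel(X, X, lengthscale, variance) + noise_var * np.eye(len(X))
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))    # K^{-1} y
        return (0.5 * y @ alpha + np.sum(np.log(np.diag(L)))
                + 0.5 * len(X) * np.log(2.0 * np.pi))

    # Optimize the kernel parameters, not the function values themselves.
    y = np.sin(X).ravel() + 0.1 * np.random.randn(len(X))      # toy targets
    result = minimize(neg_log_marginal_likelihood, np.zeros(3), args=(X, y))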

  13. 2.LVM:PPCA From PCA to Probabilistic PCA (PPCA)
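The PPCA equations on the following slides were figures that did not carry over. For reference, Tipping and Bishop's model is a linear-Gaussian latent variable model:

    y = Wx + \mu + \eta, \qquad x \sim \mathcal{N}(0, I_q), \quad \eta \sim \mathcal{N}(0, \sigma^2 I_p)
    \Rightarrow \quad p(y \mid W) = \mathcal{N}\left(y \mid \mu, \; WW^\top + \sigma^2 I_p\right)

Maximum-likelihood estimation of W recovers the PCA principal subspace, which is what makes PPCA a probabilistic counterpart of PCA.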

  14. 2.LVM:PPCA

  15. 2.LVM:PPCA

  16. 2.LVM:PPCA

  17. 2.LVM:PPCA

  18. 2.LVM:PPCA

  19. 2.LVM:Dual PPCA • Dual PPCA • PPCA marginalizes out X • Why not marginalize out W?
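Concretely: placing a unit Gaussian prior on the rows of W and integrating it out gives an independent Gaussian for each of the p data dimensions, with a covariance built from the latent positions (Y is the n x p centered data matrix, X the n x q latent matrix):

    p(Y \mid X) = \prod_{j=1}^{p} \mathcal{N}\left(Y_{:,j} \mid 0, \; XX^\top + \sigma^2 I_n\right)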

  20. 2.LVM:Dual PPCA

  21. 2.LVM:Dual PPCA

  22. 3.GPLVM • Dual PPCA = GPLVM • Why? Variables in one space (the data space) follow a Gaussian distribution whose covariance matrix is determined by the similarity/kernel function between the corresponding elements of another space (the latent space).

  23. 3.GPLVM Altogether: p independent GP regressions sharing one kernel matrix.
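Written out, the likelihood replaces the linear covariance XX^T of dual PPCA with a kernel matrix K_X on the latent positions:

    p(Y \mid X) = \prod_{j=1}^{p} \mathcal{N}\left(Y_{:,j} \mid 0, \; K_X + \sigma^2 I_n\right), \qquad (K_X)_{ij} = k(x_i, x_j)

With the linear kernel k(x_i, x_j) = x_i^T x_j this is exactly dual PPCA, which answers the "why" of the previous slide; a nonlinear kernel turns it into a nonlinear latent variable model.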

  24. 3.GPLVM • Learn parameters: • Kernel parameters: type II MLE. • Latent vectors: MAP. • Somewhat arbitrary...
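A minimal sketch of this joint optimization, continuing the snippets above and assuming an RBF kernel, a PCA-style initialization, and numerical gradients for brevity (Lawrence's FGPLVM toolbox, linked in the references, is a proper implementation):

    def gplvm_objective(flat, Y, n, q):
        # -log p(Y | X, theta): p independent GPs sharing one kernel matrix.
        X = flat[:n * q].reshape(n, q)
        lengthscale, variance, noise_var = np.exp(flat[n * q:])
        K = rbf_kernel(X, X, lengthscale, variance) + noise_var * np.eye(n)
        L = np.linalg.cholesky(K)
        logdet = 2.0 * np.sum(np.log(np.diag(L)))
        quad = np.sum(Y * np.linalg.solve(L.T, np.linalg.solve(L, Y)))  # tr(K^{-1} Y Y^T)
        # Adding 0.5 * np.sum(X**2), i.e. a unit Gaussian prior on X, makes this MAP.
        return 0.5 * (Y.shape[1] * logdet + quad)

    Y = np.random.randn(30, 5)                          # placeholder data
    n, q = Y.shape[0], 2
    X0 = np.linalg.svd(Y - Y.mean(0), full_matrices=False)[0][:, :q]  # PCA-style init
    result = minimize(gplvm_objective, np.concatenate([X0.ravel(), np.zeros(3)]),
                      args=(Y, n, q))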

  25. 3.GPLVM:Examples

  26. 3.GPLVM • Advantages: • Analytical, with a clean mathematical formulation. • More tractable than neural networks for parameter tuning. • Combines prior intuition with iterative optimization. • Disadvantages: • Relies on the choice of kernel. • No universal toolbox. (MATLAB? :P)

  27. 4.GPDM Introducing dynamical relationships in the latent space (cf. HMM)

  28. 4.GPDM Given latent variables:

  29. 4.GPDM • Prior of latent variables: • Markov property:
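Schematically, following Wang et al.: given the dynamics function f, the prior factorizes by the Markov property as p(X | f) = p(x_1) prod_{t=2..N} p(x_t | x_{t-1}, f); marginalizing f under a GP prior couples the transitions and yields a closed-form prior over whole latent trajectories,

    p(X \mid \bar{\alpha}) = \frac{p(x_1)}{\sqrt{(2\pi)^{(N-1)q} \, |K_X|^{q}}} \exp\left(-\tfrac{1}{2} \, \mathrm{tr}\left(K_X^{-1} X_{2:N} X_{2:N}^\top\right)\right)

where K_X is the dynamics kernel matrix built on x_1, ..., x_{N-1} and q is the latent dimension.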

  30. 4.GPDM • Parameter tuning: • Minimize the negative log posterior:
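The slide's equation is missing here. Schematically, dropping hyperparameter priors, scale factors, and constants, GPDM minimizes the following over X and both sets of kernel parameters:

    \mathcal{L} = \frac{p}{2} \ln|K_Y| + \frac{1}{2} \mathrm{tr}\left(K_Y^{-1} Y Y^\top\right)
                + \frac{q}{2} \ln|K_X| + \frac{1}{2} \mathrm{tr}\left(K_X^{-1} X_{2:N} X_{2:N}^\top\right)

where K_Y is the observation kernel on all N latent positions and K_X the dynamics kernel on the first N-1.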

  31. 4.GPDM • Some extra notes: • Multiple sequences. • Higher-order Markov chains.

  32. 5.Application • Recognizing patterns from walking pose • Observation space: Y = R^62

  33. 5.Application • Recognizing patterns from walking pose • Latent space: X = R^3

  34. 6.Relationship with Our Work

  35. 6.Relationship with Our Work

  36. 6.Relationship with Our Work This paper presents a variant of PCA using the Laplace approximation. The proposed method is fundamental and important, and there is some novelty. The most impressive part is the technical soundness of the mathematical derivation. Overall, this manuscript is strong work for this conference. I have several minor suggestions, listed below. 1. It would be better if the authors could provide some practical applications of the proposed method in the experiment section. 2. Please review another probabilistic solution based on PPCA, named "Gaussian Process Latent Variable Model", and reference its applications, such as: 1) M. Ding, G. Fan, "Multilayer Joint Gait-Pose Manifolds for Human Gait Motion Modeling", IEEE Transactions on Cybernetics, 2015. 2) X. Zhang, M. Ding, G. Fan, "Video-based Human Walking Estimation by Using Joint Gait and Pose Manifolds", IEEE Transactions on Circuits and Systems for Video Technology, 2016. 3) M. Ding, G. Fan, "Multi-layer Joint Gait-Pose Manifold for Human Motion Modeling", IEEE International Conference on Automatic Face and Gesture Recognition (FG), 2013.

  37. 6.Relationship with Our Work • Recognizing dynamic patterns in lip-reading: • Similarities: • Biometric; needs CNN-based segmentation. • Can be parameterized by a few parameters. • Jobs to be done: • How to combine pose with semantic content? • How to measure the characteristics?

  38. 7.References • Referenced papers: • Gaussian Process Latent Variable Models for Visualisation of High Dimensional Data, Neil D. Lawrence • Gaussian Process Dynamical Models, Jack M. Wang • Gaussian Process Latent Variable Model (presentation), Ahmad Ashar • URLs: • https://solour-lfq.github.io/papers/PRML.pdf • https://solour-lfq.github.io/papers/GPML.pdf • https://solour-lfq.github.io/papers/lepca.pdf • http://www.dgp.toronto.edu/~jmwang/gpdm/ • https://github.com/lawrennd/fgplvm
