Bayesian Gaussian Process Latent Variable Model.

Paper published in 2010 by Michalis K. Titsias and Neil D. Lawrence
This paper was not found in any repository; the policy of its publisher is unknown or unclear.

Full text: Unavailable

Preprint: policy unknown
Postprint: policy unknown
Published version: policy unknown

Abstract

We introduce a variational inference framework for training the Gaussian process latent variable model and thus performing Bayesian nonlinear dimensionality reduction. This method allows us to variationally integrate out the input variables of the Gaussian process and compute a lower bound on the exact marginal likelihood of the nonlinear latent variable model. The maximization of the variational lower bound provides a Bayesian training procedure that is robust to overfitting and can automatically select the dimensionality of the nonlinear latent space. We demonstrate our method on real world datasets. The focus in this paper is on dimensionality reduction problems, but the methodology is more general. For example, our algorithm is immediately applicable for training Gaussian process models in the presence of missing or uncertain inputs.
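The variational bound described in the abstract builds on the collapsed sparse-GP lower bound of Titsias (2009), which this paper extends by also integrating out the latent inputs. As a minimal sketch of that building block (with fixed inputs, an RBF kernel with unit variance and lengthscale, and arbitrary illustrative inducing points `Z`; all function names here are hypothetical, not the authors' code), the bound is the log density of the data under the Nyström approximation `Qnn = Knm Kmm^{-1} Kmn` plus Gaussian noise, minus a trace penalty `tr(Knn - Qnn) / (2σ²)`:

```python
import numpy as np

def rbf(A, B, lengthscale=1.0):
    # Squared-exponential kernel with unit signal variance.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def titsias_bound(y, X, Z, noise=0.1, jitter=1e-8):
    # Collapsed variational lower bound on log p(y) for sparse GP
    # regression with inducing inputs Z (Titsias, 2009).
    n = X.shape[0]
    Kmm = rbf(Z, Z) + jitter * np.eye(len(Z))
    Knm = rbf(X, Z)
    L = np.linalg.cholesky(Kmm)
    A = np.linalg.solve(L, Knm.T)          # so that Qnn = A.T @ A
    Qnn = A.T @ A
    cov = Qnn + noise * np.eye(n)
    Lc = np.linalg.cholesky(cov)
    alpha = np.linalg.solve(Lc, y)
    logdet = 2.0 * np.sum(np.log(np.diag(Lc)))
    log_marg = -0.5 * (n * np.log(2 * np.pi) + logdet + alpha @ alpha)
    # Trace penalty: unit-variance RBF has diag(Knn) = 1.
    trace_term = 0.5 / noise * (n - np.trace(Qnn))
    return log_marg - trace_term

def exact_log_marginal(y, X, noise=0.1):
    # Exact GP log marginal likelihood, for comparison with the bound.
    n = len(X)
    K = rbf(X, X) + noise * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L, y)
    return -0.5 * (n * np.log(2 * np.pi)
                   + 2.0 * np.sum(np.log(np.diag(L)))
                   + alpha @ alpha)
```

For any choice of `Z` the bound sits below the exact log marginal likelihood, and it becomes tight when the inducing inputs coincide with the training inputs. The paper's contribution is to make this tractable when `X` itself is uncertain, by taking expectations of the kernel matrices under a Gaussian variational distribution over the latent inputs, so that an ARD kernel can switch off unneeded latent dimensions.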