Published in

Proceedings of the 23rd International Conference on Machine Learning - ICML '06

DOI: 10.1145/1143844.1143909

Local distance preservation in the GP-LVM through back constraints

Proceedings article published in 2006 by Neil D. Lawrence and Joaquin Quiñonero-Candela
This paper is available in a repository.


Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving forbidden
Data provided by SHERPA/RoMEO

Abstract

The Gaussian process latent variable model (GP-LVM) is a generative approach to non-linear low-dimensional embedding that provides a smooth probabilistic mapping from latent to data space. It is also a non-linear generalization of probabilistic PCA (PPCA) (Tipping & Bishop, 1999). While most non-linear dimensionality reduction methods focus on preserving local distances in data space, the GP-LVM focuses on exactly the opposite: being a smooth mapping from latent to data space, it keeps points apart in latent space that are far apart in data space. In this paper we first provide an overview of dimensionality reduction techniques, placing the emphasis on the kind of distance relation preserved. We then show how the GP-LVM can be generalized, through back constraints, to additionally preserve local distances. We give illustrative experiments on common data sets.
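
To make the back-constraint idea concrete, the sketch below shows one possible setup: instead of optimising the latent coordinates freely, they are constrained to be a smooth, kernel-based function of the data, X = K_y A, so points that are close in data space are forced to remain close in latent space. This is a minimal NumPy/SciPy illustration under assumptions made for brevity (RBF kernels, fixed hyperparameters, a generic L-BFGS optimiser), not the authors' implementation.

```python
# Minimal sketch of a GP-LVM with kernel-based back constraints.
# Assumptions (not from the paper's code): RBF kernels with fixed
# hyperparameters, fixed noise variance, scipy's L-BFGS-B optimiser.
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import cdist

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    """RBF kernel matrix between the rows of X1 and X2."""
    sq = cdist(X1, X2, "sqeuclidean")
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def neg_log_lik(A_flat, Y, K_y, q, noise=0.1):
    """Negative GP-LVM log marginal likelihood with back-constrained latents."""
    N, D = Y.shape
    A = A_flat.reshape(N, q)
    X = K_y @ A                      # back constraint: latents are a smooth map of the data
    K = rbf(X, X) + noise * np.eye(N)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, Y))
    logdet = 2.0 * np.sum(np.log(np.diag(L)))
    return 0.5 * D * logdet + 0.5 * np.sum(Y * alpha) + 0.5 * N * D * np.log(2 * np.pi)

# Toy usage: embed 30 ten-dimensional points into a 2-D latent space.
rng = np.random.default_rng(0)
Y = rng.standard_normal((30, 10))
Y -= Y.mean(axis=0)
q = 2
K_y = rbf(Y, Y, lengthscale=np.median(cdist(Y, Y)))   # kernel on the data defining the constraint
A0 = 0.01 * rng.standard_normal(30 * q)
res = minimize(neg_log_lik, A0, args=(Y, K_y, q), method="L-BFGS-B")
X_latent = K_y @ res.x.reshape(30, q)                  # the back-constrained embedding
```

Because only the mapping weights A are optimised, the embedding inherits the smoothness of the data-space kernel; this is what lets the model preserve local distances in addition to the GP-LVM's usual guarantee of keeping dissimilar points apart.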