1. Rosman, G., Bronstein, M.M., Bronstein, A.M., Kimmel, R.: Nonlinear dimensionality reduction by topologically constrained isometric embedding. International Journal of Computer Vision 89 (2010) 56–68
2. Bishop, C.: Pattern Recognition and Machine Learning. Information Science and
Statistics. Springer (2006)
3. Mardia, K.V., Kent, J.T., Bibby, J.M.: Multivariate analysis. Academic Press,
London (1979)
4. Tenenbaum, J.B., de Silva, V., Langford, J.C.: A global geometric framework for nonlinear dimensionality reduction. Science 290 (2000) 2319–2323
5. Roweis, S.T., Saul, L.K.: Nonlinear dimensionality reduction by locally linear embedding. Science 290 (2000) 2323–2326
6. Schölkopf, B., Smola, A., Müller, K.R.: Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation 10(5) (1998) 1299–1319
7. Lawrence, N.: Probabilistic non-linear principal component analysis with Gaussian process latent variable models. Journal of Machine Learning Research 6 (2005) 1783–1816
8. Gao, J., Zhang, J., Tien, D.: Relevance units latent variable model and nonlinear dimensionality reduction. IEEE Transactions on Neural Networks 21 (2010) 123–135
9. Jiang, X., Gao, J., Shi, D., Wang, T.: Thin plate spline latent variable models for dimensionality reduction. In: International Joint Conference on Neural Networks (IJCNN). (2012) 1–8
10. Tipping, M.E., Bishop, C.M.: Probabilistic principal component analysis. Journal of the Royal Statistical Society, Series B 61 (1999) 611–622
11. Wang, J.M., Fleet, D.J., Hertzmann, A.: Gaussian process dynamical models. In: Advances in Neural Information Processing Systems (NIPS). (2005) 1441–1448
12. Lawrence, N.D., Quiñonero-Candela, J.: Local distance preservation in the GP-LVM through back constraints. In: International Conference on Machine Learning (ICML), ACM Press (2006) 513–520
13. Urtasun, R., Darrell, T.: Discriminative Gaussian process latent variable model for classification. In: International Conference on Machine Learning (ICML), ACM (2007) 927–934
14. Jiang, X., Gao, J., Wang, T., Zheng, L.: Supervised latent linear Gaussian process latent variable model for dimensionality reduction. IEEE Transactions on Systems, Man, and Cybernetics - Part B: Cybernetics 42(6) (2012) 1620–1632
15. Jiang, X., Gao, J., Hong, X., Cai, Z., Wang, T.: Double Gaussian process latent variable model for dimensionality reduction. Submitted (2013)
16. Ek, C.H.: Shared Gaussian Process Latent Variable Models. PhD thesis, Oxford Brookes University (2009)
17. Titsias, M.K., Lawrence, N.D.: Bayesian Gaussian process latent variable model. In: International Conference on Artificial Intelligence and Statistics (AISTATS). (2010)
18. Cottrell, G.W., Munro, P., Zipser, D.: Learning internal representations from
grayscale images: An example of extensional programming. In: Conference of the
Cognitive Science Society. (1987)
19. Bengio, Y., Lamblin, P., Popovici, D., Larochelle, H.: Greedy layer-wise training
of deep networks. In: Advances in Neural Information Processing Systems. (2007)
20. Neal, R.: Bayesian learning for neural networks. Lecture Notes in Statistics 118
(1996)
21. Rasmussen, C.E., Williams, C.K.I.: Gaussian Processes for Machine Learning. The
MIT Press (2006)
22. Snoek, J., Adams, R.P., Larochelle, H.: Nonparametric guidance of autoencoder representations using label information. Journal of Machine Learning Research 13 (2012) 2567–2588
23. Palm, R.B.: Prediction as a candidate for learning deep hierarchical models of
data. Master's thesis, Technical University of Denmark (2012)
24. Frank, A., Asuncion, A.: UCI machine learning repository (2010)