Fast identification algorithms for Gaussian process model

Hong, X., Gao, J., Jiang, X. and Harris, C. J. (2014) Fast identification algorithms for Gaussian process model. Neurocomputing, 133. pp. 25-31. ISSN 0925-2312

Full text not archived in this repository. It is advisable to refer to the publisher's version if you intend to cite from this work.

DOI: 10.1016/j.neucom.2013.11.035

Abstract/Summary

A class of identification algorithms is introduced for Gaussian process (GP) models. The fundamental approach is to propose a new kernel function which leads to a covariance matrix with low rank, a property that is consequently exploited for computational efficiency in both model parameter estimation and model prediction. The objective of either maximizing the marginal likelihood or the Kullback–Leibler (K–L) divergence between the estimated output probability density function (pdf) and the true pdf is used as the respective cost function. For each cost function, an efficient coordinate descent algorithm is proposed to estimate the kernel parameters using a one-dimensional derivative-free search, and the noise variance using a fast gradient descent algorithm. Numerical examples are included to demonstrate the effectiveness of the new identification approaches.
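As an illustration of the central computational idea, the sketch below shows how a covariance matrix of the form K = ΦΦᵀ + σ²I (rank m ≪ n plus noise) can be exploited for GP prediction via the Woodbury identity, reducing the cost from O(n³) to O(nm²). The specific kernel function and the coordinate descent estimation procedure proposed in the paper are not reproduced here; the Gaussian basis features and all parameter values are hypothetical placeholders chosen only to make the example self-contained.

```python
import numpy as np

def phi(X, centres, width=1.0):
    """Hypothetical low-rank feature map (Gaussian basis functions).
    Any feature map giving K ~= Phi @ Phi.T works for this sketch."""
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def gp_predict_lowrank(X, y, X_star, centres, noise_var, width=1.0):
    """GP posterior mean with K = Phi Phi^T + noise_var * I.
    The Woodbury identity replaces the n x n inverse with an m x m solve,
    so the cost is O(n m^2) instead of O(n^3)."""
    Phi = phi(X, centres, width)          # n x m design matrix
    Phi_s = phi(X_star, centres, width)   # n* x m features at test inputs
    m = Phi.shape[1]
    # K^-1 y = (y - Phi A^-1 Phi^T y) / noise_var, with A = noise_var*I_m + Phi^T Phi
    A = noise_var * np.eye(m) + Phi.T @ Phi
    alpha = (y - Phi @ np.linalg.solve(A, Phi.T @ y)) / noise_var
    # predictive mean: k(X*, X) K^-1 y, where k(X*, X) = Phi_s Phi^T
    return Phi_s @ (Phi.T @ alpha)

# Usage on synthetic data (illustrative only)
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
centres = np.linspace(-3, 3, 15)[:, None]   # m = 15 basis centres
mean = gp_predict_lowrank(X, y, np.array([[0.5]]), centres, noise_var=0.01)
```

In this sketch the model parameters (basis centres, width, noise variance) are fixed by hand; in the paper they are estimated by the proposed coordinate descent and gradient descent procedures under the marginal likelihood or K–L divergence cost functions.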

Item Type: Article
Refereed: Yes
Divisions: Science > School of Mathematical, Physical and Computational Sciences > Department of Computer Science
ID Code: 36487
Uncontrolled Keywords: Gaussian process; Optimization; Kullback–Leibler divergence
Publisher: Elsevier
