A forward-constrained regression algorithm for sparse kernel density estimation

Hong, X. ORCID: https://orcid.org/0000-0002-6832-2298, Chen, S. and Harris, C. J. (2008) A forward-constrained regression algorithm for sparse kernel density estimation. IEEE Transactions on Neural Networks, 19 (1). pp. 193-198. ISSN 1045-9227

Full text not archived in this repository. It is advisable to refer to the publisher's version if you intend to cite from this work.

DOI: 10.1109/TNN.2007.908645

Abstract/Summary

Using the classical Parzen window (PW) estimate as the target function, the sparse kernel density estimator is constructed in a forward-constrained regression (FCR) manner. The proposed algorithm selects significant kernels one at a time, while the leave-one-out (LOO) test score is minimized subject to a simple positivity constraint at each forward stage. The model parameter estimate at each forward stage is simply the solution of the jackknife parameter estimator for a single parameter, subject to the same positivity constraint check. For each selected kernel, the associated kernel width is updated via the Gauss-Newton method with the model parameter estimate fixed. The proposed approach is simple to implement and the associated computational cost is very low. Numerical examples are employed to demonstrate the efficacy of the proposed approach.
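To illustrate the general idea of fitting a sparse kernel density estimator against a PW target by greedy forward selection, the Python sketch below adds one kernel per stage with a non-negative weight. It is a simplified stand-in, not the authors' FCR algorithm: kernel widths are held fixed (no Gauss-Newton update), and a plain squared-error reduction criterion replaces the LOO test score and jackknife estimator; all function names are illustrative.

    import numpy as np

    def gaussian_kernel(x, centres, width):
        # Isotropic Gaussian kernels evaluated at points x for each centre.
        diff = x[:, None, :] - centres[None, :, :]
        sq = np.sum(diff ** 2, axis=-1)
        dim = x.shape[1]
        norm = (2.0 * np.pi * width ** 2) ** (dim / 2.0)
        return np.exp(-sq / (2.0 * width ** 2)) / norm

    def parzen_window(x, data, width):
        # Classical PW estimate: equal-weight mixture over all data points.
        return gaussian_kernel(x, data, width).mean(axis=1)

    def sparse_kde_forward(data, width, max_kernels=10):
        # Greedy forward selection of kernels against the PW target values.
        target = parzen_window(data, data, width)     # PW estimate at the data
        Phi = gaussian_kernel(data, data, width)      # candidate kernel columns
        residual = target.copy()
        selected, weights = [], []
        for _ in range(max_kernels):
            # Single-parameter least-squares weight for every candidate,
            # clipped at zero to enforce the positivity constraint.
            num = Phi.T @ residual
            den = np.sum(Phi ** 2, axis=0)
            w = np.maximum(num / den, 0.0)
            gain = 2 * w * num - w ** 2 * den         # squared-error reduction
            gain[selected] = -np.inf                  # skip already-chosen kernels
            j = int(np.argmax(gain))
            if gain[j] <= 0:
                break
            selected.append(j)
            weights.append(w[j])
            residual = residual - w[j] * Phi[:, j]
        weights = np.array(weights)
        if weights.sum() > 0:                         # renormalise to a density
            weights = weights / weights.sum()
        return data[selected], weights

    # Usage: a one-dimensional bimodal sample.
    rng = np.random.default_rng(0)
    sample = np.concatenate([rng.normal(-2, 0.5, 100),
                             rng.normal(2, 0.8, 100)])[:, None]
    centres, weights = sparse_kde_forward(sample, width=0.4, max_kernels=8)
    print(len(centres), "kernels retained, weights:", np.round(weights, 3))

The sketch typically retains only a handful of the 200 candidate centres, which is the point of the sparse construction: the full PW estimate keeps every data point as a kernel centre, whereas the forward procedure stops once adding further kernels no longer improves the fit to the target.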

Item Type: Article
Refereed: Yes
Divisions: Science > School of Mathematical, Physical and Computational Sciences > Department of Computer Science
ID Code: 15275
Uncontrolled Keywords: cross validation, jackknife parameter estimator, Parzen window (PW), probability density function (pdf), sparse modeling, regularization
