Sparse kernel regression modeling using combined locally regularized orthogonal least squares and D-optimality experimental design

Chen, S., Hong, X. and Harris, C. J. (2003) Sparse kernel regression modeling using combined locally regularized orthogonal least squares and D-optimality experimental design. IEEE Transactions on Automatic Control, 48 (6). pp. 1029-1036. ISSN 0018-9286


DOI: 10.1109/tac.2003.812790

Abstract

This note proposes an efficient nonlinear identification algorithm that combines locally regularized orthogonal least squares (LROLS) model selection with a D-optimality experimental design. The algorithm pursues model robustness and sparsity through two effective and complementary approaches. The LROLS method alone can produce a very parsimonious model with excellent generalization performance; the D-optimality design criterion further enhances model efficiency and robustness. An added advantage is that the user only needs to specify a weighting for the D-optimality cost in the combined model selection criterion, after which the entire model construction procedure is automatic. The value of this weighting does not critically influence the model selection procedure and can be chosen easily from a wide range of values.
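
The combined criterion described above lends itself to a greedy orthogonal forward selection. The Python sketch below is a simplified, hypothetical illustration of that idea rather than the authors' exact LROLS procedure: a single global regularization parameter lam stands in for the individually optimized local regularizers, beta plays the role of the user-specified D-optimality weighting, and selection stops once no candidate regressor yields a positive combined score.

import numpy as np


def ols_doptimality_select(Phi, y, beta=1e-6, lam=1e-3, max_terms=None):
    """Greedy orthogonal forward selection with a D-optimality term (sketch).

    Candidate columns of Phi are kept orthogonal to the already-selected
    ones (modified Gram-Schmidt). At each step the column maximizing a
    regularized error-reduction ratio plus beta * log(kappa) is chosen,
    where kappa is the squared norm of the orthogonalized column (its
    contribution to the log-determinant of the design matrix). lam is a
    single global regularizer standing in for the locally optimized
    regularization parameters of LROLS.
    """
    N, M = Phi.shape
    if max_terms is None:
        max_terms = M
    W = Phi.astype(float)            # columns are orthogonalized in place
    yy = float(y @ y)
    selected = []
    for _ in range(max_terms):
        best_k, best_score = None, 0.0
        for k in range(M):
            if k in selected:
                continue
            w = W[:, k]
            kappa = float(w @ w)
            if kappa <= 1e-12:       # numerically dependent column: skip
                continue
            g = float(w @ y) / (kappa + lam)       # regularized coefficient
            rerr = (kappa + lam) * g * g / yy      # error-reduction ratio
            score = rerr + beta * np.log(kappa)    # add D-optimality cost
            if score > best_score:
                best_k, best_score = k, score
        if best_k is None:           # no positive combined score: stop
            break
        selected.append(best_k)
        w_sel = W[:, best_k].copy()
        norm2 = float(w_sel @ w_sel)
        for k in range(M):           # orthogonalize remaining candidates
            if k not in selected:
                W[:, k] -= (w_sel @ W[:, k]) / norm2 * w_sel
    return selected

In the kernel regression setting of the title, Phi would be the matrix of kernel evaluations at the training inputs, so each selected column corresponds to one retained kernel centre and the returned index list defines the sparse model.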

Item Type: Article
Refereed: Yes
Divisions: Science > School of Mathematical, Physical and Computational Sciences > Department of Computer Science
ID Code: 15165
Uncontrolled Keywords: Bayesian learning, D-optimality, optimal experimental design, orthogonal least squares, regularization, sparse modeling, basis function networks, non-linear systems, identification, algorithm, selection
