Elastic-net prefiltering for two-class classification

Hong, X., Chen, S. and Harris, C. J. (2013) Elastic-net prefiltering for two-class classification. IEEE Transactions on Cybernetics, 43 (1). pp. 286-295. ISSN 2168-2267

Full text not archived in this repository.

It is advisable to refer to the publisher's version if you intend to cite from this work.

DOI: 10.1109/TSMCB.2012.2205677

Abstract/Summary

A two-stage linear-in-the-parameter model construction algorithm is proposed for noisy two-class classification problems. The first stage produces a prefiltered signal that serves as the desired output for the second stage, which constructs a sparse linear-in-the-parameter classifier. The prefiltering stage is a two-level process aimed at maximizing the model's generalization capability: at the lower level, a new elastic-net model identification algorithm based on singular value decomposition is employed; at the upper level, the two regularization parameters are optimized with a particle swarm optimization algorithm by minimizing the leave-one-out (LOO) misclassification rate. It is shown that the LOO misclassification rate based on the resultant prefiltered signal can be computed analytically without splitting the data set, and that the associated computational cost is minimal owing to orthogonality. The second stage constructs the sparse classifier using orthogonal forward regression with the D-optimality criterion. Extensive simulations on noisy data sets illustrate the competitiveness of the proposed approach.
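The abstract's key computational claim is that the LOO misclassification rate can be obtained analytically, without refitting the model n times. A minimal NumPy sketch of that idea is given below for an L2-regularized linear-in-the-parameters model (the special case of the elastic net with the L1 weight set to zero): for a fixed regularization parameter, the LOO residual of sample i equals the full-data residual divided by 1 - h_ii, where h_ii is the i-th diagonal entry of the hat matrix. The data, the parameter value lam, and the plain matrix-inverse formulation are illustrative assumptions; the paper's full elastic-net/SVD/PSO machinery is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 40, 5
X = rng.standard_normal((n, p))
# Synthetic two-class labels in {-1, +1} (illustrative data, not from the paper)
y = np.sign(X @ rng.standard_normal(p) + 0.3 * rng.standard_normal(n))

lam = 0.5  # ridge regularization parameter (arbitrary illustrative value)

# Full-data regularized least-squares fit: theta = (X'X + lam*I)^{-1} X'y
A = X.T @ X + lam * np.eye(p)
theta = np.linalg.solve(A, X.T @ y)

# Hat matrix H = X (X'X + lam*I)^{-1} X'; LOO residual_i = residual_i / (1 - h_ii)
H = X @ np.linalg.solve(A, X.T)
resid = y - X @ theta
loo_resid = resid / (1.0 - np.diag(H))
loo_pred = y - loo_resid  # analytic leave-one-out predictions

# LOO misclassification rate, computed without ever splitting the data
loo_error = np.mean(np.sign(loo_pred) != y)

# Sanity check: the analytic predictions match explicit leave-one-out refits
explicit = np.empty(n)
for i in range(n):
    mask = np.arange(n) != i
    Ai = X[mask].T @ X[mask] + lam * np.eye(p)
    explicit[i] = X[i] @ np.linalg.solve(Ai, X[mask].T @ y[mask])
assert np.allclose(explicit, loo_pred)

print(loo_error)
```

Because the LOO rate is a cheap, closed-form function of the regularization parameters, an outer-level search (PSO in the paper, here it could be a simple grid over lam) can minimize it directly, which is the structure of the two-level prefiltering stage described above.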

Item Type: Article
Refereed: Yes
Divisions: Faculty of Science > School of Mathematical, Physical and Computational Sciences > Department of Computer Science
ID Code: 31726
Uncontrolled Keywords: Cross-validation (CV), elastic net (EN), forward regression, leave-one-out (LOO) errors, linear-in-the-parameter model, regularization
Publisher: IEEE
