Simplex basis function based sparse least squares support vector regression

Hong, X., Mitchell, R. and Di Fatta, G. (2019) Simplex basis function based sparse least squares support vector regression. Neurocomputing, 330. pp. 394-402. ISSN 0925-2312

Text - Accepted Version · Available under a Creative Commons Attribution Non-commercial No Derivatives licence.

DOI: 10.1016/j.neucom.2018.11.025

Abstract/Summary

In this paper, a novel sparse least squares support vector regression algorithm, referred to as LSSVR-SBF, is introduced. It uses a new low-rank kernel based on simplex basis functions, which carry a set of nonlinear parameters. It is shown that the proposed model can be represented as a sparse linear regression model based on simplex basis functions. We propose a fast algorithm that obtains the least squares support vector regression solution at a cost of O(N) by avoiding direct kernel matrix inversion. An iterative estimation algorithm is proposed to optimize the nonlinear parameters associated with the simplex basis functions, with the aim of minimizing the model mean square error via gradient descent. The proposed fast least squares solution and the gradient descent algorithm are applied alternately. Finally, it is shown that the model has a dual representation as a piecewise linear model with respect to the system input. Numerical experiments are carried out to demonstrate the effectiveness of the proposed approaches.
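For illustration only, the sketch below shows one way the alternating scheme summarised above could look in Python/NumPy. The particular simplex basis function used here (a piecewise-linear hinge max(0, 1 - ||x - c_j||_1 / d_j)), the parameter names, and the finite-difference gradients are assumptions made for readability; the paper itself defines the exact basis, the O(N) least squares solution and the analytic gradients.

    # Minimal sketch of the alternating scheme described in the abstract.
    # Assumed (not from the paper): SBF form phi_j(x) = max(0, 1 - ||x - c_j||_1 / d_j),
    # ridge-style regularisation, and finite-difference gradients for brevity.
    import numpy as np

    def sbf_features(X, centers, widths):
        """Piecewise-linear basis outputs, shape (N, M)."""
        # L1 distance between every sample and every centre
        dist = np.abs(X[:, None, :] - centers[None, :, :]).sum(axis=2)  # (N, M)
        return np.maximum(0.0, 1.0 - dist / widths[None, :])

    def fit_lssvr_sbf(X, y, centers, widths, lam=1e-3, lr=1e-2, iters=50):
        """Alternate a regularised least-squares solve for the linear weights with
        gradient-descent updates of the nonlinear SBF parameters (centres, widths)."""
        centers = centers.astype(float).copy()
        widths = widths.astype(float).copy()
        for _ in range(iters):
            # Step 1: linear weights by regularised least squares on the M basis outputs.
            # Because the kernel has rank M << N, only an (M x M) system is solved,
            # never the full (N x N) kernel matrix.
            Phi = sbf_features(X, centers, widths)                 # (N, M)
            A = Phi.T @ Phi + lam * np.eye(Phi.shape[1])
            w = np.linalg.solve(A, Phi.T @ y)

            # Step 2: gradient descent on centres and widths to reduce the MSE.
            # (Numerical gradients here purely for brevity.)
            def mse(c, d):
                return np.mean((sbf_features(X, c, d) @ w - y) ** 2)
            eps = 1e-5
            g_c = np.zeros_like(centers)
            g_d = np.zeros_like(widths)
            for j in range(centers.shape[0]):
                for k in range(centers.shape[1]):
                    cp = centers.copy(); cp[j, k] += eps
                    g_c[j, k] = (mse(cp, widths) - mse(centers, widths)) / eps
                dp = widths.copy(); dp[j] += eps
                g_d[j] = (mse(centers, dp) - mse(centers, widths)) / eps
            centers -= lr * g_c
            widths = np.maximum(widths - lr * g_d, 1e-3)
        return w, centers, widths

In this sketch the centres could be initialised from a subset of the training points and the number of basis functions M kept much smaller than N, which is what keeps the per-iteration cost linear in N rather than requiring an N-by-N kernel inversion.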

Item Type: Article
Refereed: Yes
Divisions: Science > School of Mathematical, Physical and Computational Sciences > Department of Computer Science
ID Code: 80789
Publisher: Elsevier
