Sparse model identification using orthogonal forward regression with basis pursuit and D-optimality

Hong, X. (ORCID: https://orcid.org/0000-0002-6832-2298), Brown, M., Chen, S. and Harris, C. J. (2004) Sparse model identification using orthogonal forward regression with basis pursuit and D-optimality. IEE Proceedings - Control Theory and Applications, 151 (4). pp. 491-498. ISSN 1350-2379

DOI: 10.1049/ip-cta:20040693

Full text not archived in this repository. It is advisable to refer to the publisher's version if you intend to cite from this work.

Abstract/Summary

An efficient model identification algorithm for a large class of linear-in-the-parameters models is introduced that simultaneously optimises the model's approximation ability, sparsity and robustness. The model parameters derived in each forward regression step are first estimated via orthogonal least squares (OLS) and then tuned with a new gradient-descent learning algorithm, based on basis pursuit, that minimises the l1 norm of the parameter estimate vector. The model subset selection cost function includes a D-optimality design criterion that maximises the determinant of the design matrix of the selected subset, ensuring model robustness and allowing the selection procedure to terminate automatically at a sparse model. The proposed approach builds on the forward OLS algorithm with the modified Gram-Schmidt procedure. Both the basis-pursuit parameter tuning and the D-optimality based model selection criterion are integrated with the forward regression, so the inherent computational efficiency of the conventional forward OLS approach is retained. Examples demonstrate the effectiveness of the new approach.
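To make the structure of such an algorithm concrete, the sketch below shows a simplified forward OLS loop with modified Gram-Schmidt orthogonalisation, a log-determinant (D-optimality style) term in the selection score, and a few subgradient steps that shrink the l1 norm of the parameter estimates. It is not the authors' implementation: the weighting beta, the l1 weight lam, the step size and the stopping rule are illustrative assumptions.

```python
import numpy as np

def forward_ols_dopt(P, y, beta=1e-3, lam=0.1, l1_step=1e-2, l1_iters=50):
    """P: (N, M) candidate regressor matrix, y: (N,) target vector."""
    N, M = P.shape
    remaining = list(range(M))
    selected, W, g = [], [], []        # chosen indices, orthogonal columns, coefficients

    while remaining:
        best = None
        for j in remaining:
            w = P[:, j].astype(float).copy()
            # Modified Gram-Schmidt: remove components along already selected basis.
            for wk in W:
                w -= (wk @ w) / (wk @ wk) * wk
            ww = w @ w
            if ww < 1e-12:
                continue
            gj = (w @ y) / ww
            err_reduction = gj * gj * ww              # decrease in squared error
            score = err_reduction + beta * np.log(ww)  # plus D-optimality contribution
            if best is None or score > best[0]:
                best = (score, j, w, gj)
        # Terminate automatically once no candidate improves the combined cost
        # (assumed stopping rule for this sketch).
        if best is None or best[0] <= 0.0:
            break
        _, j, w, gj = best
        selected.append(j); W.append(w); g.append(gj)
        remaining.remove(j)

    # Recover parameters of the original regressors by back-substitution through
    # the unit upper-triangular Gram-Schmidt factor A (P_selected = W A).
    k = len(selected)
    A = np.eye(k)
    for i in range(k):
        for r in range(i):
            A[r, i] = (W[r] @ P[:, selected[i]]) / (W[r] @ W[r])
    theta = np.linalg.solve(A, np.array(g))

    # Basis-pursuit flavoured tuning: subgradient steps trading a small increase
    # in squared error for a smaller l1 norm of the parameter vector.
    Ps = P[:, selected]
    for _ in range(l1_iters):
        grad = 2.0 * Ps.T @ (Ps @ theta - y) + lam * np.sign(theta)
        theta -= l1_step * grad / N
    return selected, theta
```

A typical call would pass a candidate regressor matrix built from lagged inputs/outputs or basis functions, e.g. `terms, theta = forward_ols_dopt(P, y)`, and inspect `terms` for the selected sparse model structure.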