Convergence and evaluation-complexity analysis of a regularized tensor-Newton method for solving nonlinear least-squares problems
Gould, N. I. M., Rees, T. and Scott, J.
DOI: 10.1007/s10589-019-00064-2

Abstract/Summary
Given a twice-continuously differentiable vector-valued function r(x), a local minimizer of ‖r(x)‖₂ is sought. We propose and analyse tensor-Newton methods, in which r(x) is replaced locally by its second-order Taylor approximation. Convergence is controlled by regularization of various orders. We establish global convergence to a first-order critical point of ‖r(x)‖₂, and provide function-evaluation bounds that agree with the best-known bounds for methods using second derivatives. Numerical experiments comparing tensor-Newton methods with regularized Gauss-Newton and Newton methods demonstrate the practical performance of the newly proposed method.
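The tensor-Newton model referred to in the abstract replaces r(x + s) locally by the second-order Taylor approximation r(x) + J(x)s + ½H(x)[s, s], where J is the Jacobian and H collects the Hessians of the individual residuals, and a regularization term controls the step. The Python sketch below illustrates one way such an adaptive-regularization loop could be organised; it is only a sketch, assuming cubic regularization (p = 3), an off-the-shelf BFGS solve of the model subproblem, and a simple ratio test for accepting steps. The function name tensor_newton_ar, the update factors and the Rosenbrock-style test problem are illustrative choices, not the authors' implementation.

import numpy as np
from scipy.optimize import minimize

def tensor_newton_ar(r, J, H, x0, sigma=1.0, p=3, eta=0.1,
                     gamma=2.0, tol=1e-8, max_iter=100):
    """Illustrative regularized tensor-Newton loop (a sketch, not the authors' code).

    Minimizes f(x) = 0.5*||r(x)||^2 by repeatedly minimizing the model
        m(s) = 0.5*||r(x) + J(x) s + 0.5*H(x)[s,s]||^2 + (sigma/p)*||s||^p,
    accepting the step when the actual reduction is a sufficient fraction
    of the reduction predicted by the model, and increasing sigma otherwise.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        rx, Jx, Hx = r(x), J(x), H(x)          # residual, Jacobian, residual Hessians
        g = Jx.T @ rx                          # gradient of 0.5*||r(x)||^2
        if np.linalg.norm(g) <= tol:
            break

        def model(s):
            # Regularized second-order Taylor model of the residual.
            t = rx + Jx @ s + 0.5 * np.einsum('ijk,j,k->i', Hx, s, s)
            return 0.5 * t @ t + (sigma / p) * np.linalg.norm(s) ** p

        # Approximately minimize the regularized tensor-Newton model.
        s = minimize(model, np.zeros_like(x), method='BFGS').x

        f_old = 0.5 * rx @ rx
        f_new = 0.5 * np.sum(r(x + s) ** 2)
        pred = f_old - (model(s) - (sigma / p) * np.linalg.norm(s) ** p)
        if pred > 0 and (f_old - f_new) >= eta * pred:
            x = x + s                          # successful step: accept it
            sigma = max(sigma / gamma, 1e-8)   # and relax the regularization
        else:
            sigma *= gamma                     # unsuccessful: regularize more
    return x

# Toy nonlinear least-squares problem (Rosenbrock in residual form),
# used here only to exercise the sketch above.
r = lambda x: np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])
J = lambda x: np.array([[-20.0 * x[0], 10.0], [-1.0, 0.0]])
H = lambda x: np.array([[[-20.0, 0.0], [0.0, 0.0]],
                        [[0.0, 0.0], [0.0, 0.0]]])
print(tensor_newton_ar(r, J, H, x0=[-1.2, 1.0]))

In this sketch the regularization weight sigma plays the role that a trust-region radius plays in other globalization schemes: it is decreased after successful steps and increased after rejected ones, which is the mechanism through which the paper's convergence and evaluation-complexity guarantees are obtained.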