Local analysis of a spectral correction for the Gauss-Newton model applied to quadratic residual problems (Q329315)

Property / describes a project that uses: levmar
Property / MaRDI profile type: MaRDI publication profile
Property / full work available at URL: https://doi.org/10.1007/s11075-016-0101-3
Property / OpenAlex ID: W2298107905
Language: English
Label: Local analysis of a spectral correction for the Gauss-Newton model applied to quadratic residual problems
Description: scientific article

    Statements

    Local analysis of a spectral correction for the Gauss-Newton model applied to quadratic residual problems (English)
    21 October 2016
    This paper is a helpful contribution to nonlinear programming at its interface with data mining, statistics and the theory of inverse problems, a ``classical'' yet still challenging field of modern operational research (OR) as well as of economics, medicine, engineering and other disciplines. In these areas, inventive techniques are valued for coping with issues of complexity and convergence and for providing qualitative theory; in today's OR, the rapidly emerging area of data mining often appears under the labels ``big data'', ``data analytics'', ``business analytics'' or simply ``analytics''. The paper is rigorous, and future research and many real-world applications can build on it.

    The authors present a simple but effective spectral correction of the well-known Gauss-Newton model from nonlinear regression, that is, from nonlinear least-squares optimization. The correction adds a sign-free multiple of the identity matrix to the Hessian of the Gauss-Newton model, the multiple being based on spectral approximations of the Hessians of the residual functions. For the resulting method, a detailed local convergence analysis is carried out for the class of quadratic residual problems. Under mild assumptions, the proposed technique is shown to converge on problems for which convergence of the Gauss-Newton method cannot be guaranteed, and for a class of non-zero residual problems its linear rate of convergence is shown to be better than that of the Gauss-Newton method. Numerical examples on quadratic and non-quadratic residual problems illustrate these theoretical results.

    The article is well structured, mathematically deep, well exemplified and illustrated (also in connection with rank deficiency), and well written. Its five sections are: 1. Introduction, 2. The spectral approximation, 3. Local analysis: quadratic residual case, 4. Illustrative numerical examples, and 5. Conclusions and future perspectives. Theoretical and algorithmic refinements, generalizations, stronger results and codes initiated by this work can be expected, for instance in nonlinear optimization, semi-infinite optimization, robust optimization, optimal control, image processing, pattern recognition, shape detection, tomography and information theory, fostering progress in science and engineering, finance, business administration, economics, the earth sciences, neuroscience and medicine.
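    A minimal numerical sketch of the idea described above may be helpful: a Gauss-Newton iteration for minimizing 0.5*||R(x)||^2 in which a sign-free spectral multiple sigma*I of the identity is added to the Gauss-Newton matrix J^T J, so that each step solves (J^T J + sigma*I) d = -J^T R. The function name spectral_gauss_newton, the secant-type choice of sigma and the sample problem below are illustrative assumptions of this sketch; the exact update rule and safeguards used in the paper may differ.

import numpy as np

def spectral_gauss_newton(residual, jacobian, x0, tol=1e-8, max_iter=100):
    """Gauss-Newton iteration with a spectral correction sigma*I.

    Illustrative sketch only: sigma is a secant (Barzilai-Borwein-type)
    estimate of the omitted second-order term sum_i r_i(x) * Hess r_i(x)
    along the last step; it may be negative, i.e. it is sign-free.
    """
    x = np.asarray(x0, dtype=float)
    r, J = residual(x), jacobian(x)
    sigma = 0.0                                  # plain Gauss-Newton on the first step
    for _ in range(max_iter):
        g = J.T @ r                              # gradient of 0.5*||R(x)||^2
        if np.linalg.norm(g) < tol:
            break
        # Corrected Gauss-Newton model: solve (J^T J + sigma*I) d = -g
        A = J.T @ J + sigma * np.eye(x.size)
        d = np.linalg.solve(A, -g)
        x_new = x + d
        r_new, J_new = residual(x_new), jacobian(x_new)
        # Secant estimate of the second-order term along d:
        # S(x_new) d is approximately (J_new - J)^T r_new, projected onto d.
        sigma = float(d @ ((J_new - J).T @ r_new)) / float(d @ d)
        x, r, J = x_new, r_new, J_new
    return x

# Example: a small quadratic residual problem with solution (1, 1),
# R(x) = (x1^2 + x2 - 2, x1 + x2^2 - 2); the starting point is arbitrary.
R = lambda x: np.array([x[0]**2 + x[1] - 2.0, x[0] + x[1]**2 - 2.0])
Jac = lambda x: np.array([[2.0 * x[0], 1.0], [1.0, 2.0 * x[1]]])
print(spectral_gauss_newton(R, Jac, x0=[2.0, 0.5]))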
    Keywords: nonlinear least squares; quadratic residues; spectral parameter; Gauss-Newton method; local convergence