A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods


DOI: 10.2307/2005926 · zbMath: 0282.65042 · OpenAlex: W4234564249 · MaRDI QID: Q4768565

Jorge J. Moré, John E. Dennis, Jr.

Publication date: 1974

Full work available at URL: https://doi.org/10.2307/2005926
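For context, the result this record refers to is what is now commonly called the Dennis–Moré condition. The display below is an informal paraphrase rather than the paper's exact statement: it assumes $F'$ is continuous at the limit point $x^*$, $F'(x^*)$ is nonsingular, each $B_k$ is nonsingular, and the iterates converge to $x^*$.

\[
x_{k+1} = x_k - B_k^{-1} F(x_k):\qquad
\{x_k\}\ \text{converges Q-superlinearly and } F(x^*) = 0
\;\Longleftrightarrow\;
\lim_{k\to\infty} \frac{\bigl\|\bigl(B_k - F'(x^*)\bigr)(x_{k+1}-x_k)\bigr\|}{\|x_{k+1}-x_k\|} = 0 .
\]

In the quasi-Newton application, $B_k$ is a secant approximation to the Jacobian (or Hessian, in optimization), which is why many of the related items below concern Broyden, BFGS, and other secant-update methods.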



Related Items

Generalized derivatives and nonsmooth optimization, a finite dimensional tour (with comments and rejoinder), Convergence properties of the Broyden-like method for mixed linear-nonlinear systems of equations, A residual algorithm for finding a fixed point of a nonexpansive mapping, Difference equations and local convergence of inexact Newton methods, Global convergence and stabilization of unconstrained minimization methods without derivatives, Two-step and three-step Q-superlinear convergence of SQP methods, Interpolation by conic model for unconstrained optimization, On a monotone Newton-like method, On the local convergence of adjoint Broyden methods, A corrected Levenberg-Marquardt algorithm with a nonmonotone line search for the system of nonlinear equations, A globally and R-linearly convergent hybrid HS and PRP method and its inexact version with applications, Convergence theory for the structured BFGS secant method with an application to nonlinear least squares, A three-term derivative-free projection method for nonlinear monotone system of equations, Superlinear convergence of smoothing quasi-Newton methods for nonsmooth equations, A pointwise quasi-Newton method for unconstrained optimal control problems, Modifying the BFGS method, On superlinear convergence of quasi-Newton methods for nonsmooth equations, Derivative-free method for bound constrained nonlinear monotone equations and its application in solving steady state reaction-diffusion problems, Root finding by high order iterative methods based on quadratures, Newton and quasi-Newton methods for normal maps with polyhedral sets, Parallel quasi-Newton methods for unconstrained optimization, Difference Newton-like methods under weak continuity conditions, Some convergence properties of descent methods, Finding plasma equilibria with magnetic islands, Quasi-Newton methods with derivatives, Jacobi-free and complex-free method for finding simultaneously all zeros of polynomials having only real zeros, A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization, A trust-region-based BFGS method with line search technique for symmetric nonlinear equations, Some efficient algorithms for unconstrained discrete-time optimal control problems, Preconditioned Newton methods using incremental unknowns methods for the resolution of a steady-state Navier-Stokes-like problem, A double parameter scaled BFGS method for unconstrained optimization, Two-level Newton's method for nonlinear elliptic PDEs, Über die globale Konvergenz von Variable-Metrik-Verfahren mit nicht- exakter Schrittweitenbestimmung, Superlinear/quadratic smoothing Broyden-like method for the generalized nonlinear complementarity problem, A Riemannian view on shape optimization, A spectral algorithm for large-scale systems of nonlinear monotone equations, Global approximate Newton methods, Higher-order metric subregularity and its applications, Quasi-Newton methods in infinite-dimensional spaces and application to matrix equations, Enlarging the region of convergence of Newton's method for constrained optimization, Minimizing a differentiable function over a differential manifold, An algorithm for discrete linear \(L_ p\) approximation, Recourse-based stochastic nonlinear programming: properties and Benders-SQP algorithms, A variable metric algorithm for unconstrained minimization without evaluation of derivatives, Comments on: ``A family of derivative-free conjugate gradient 
methods for large-scale nonlinear systems of equations, The genesis and early developments of Aitken's process, Shanks' transformation, the \(\varepsilon\)-algorithm, and related fixed point methods, A BFGS trust-region method for nonlinear equations, A globally and superlinearly convergent quasi-Newton method for general box constrained variational inequalities without smoothing approximation, An efficient three-term conjugate gradient method for nonlinear monotone equations with convex constraints, Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity, A new smoothing Broyden-like method for solving nonlinear complementarity problem with a \(P_{0}\)-function, Sparse quasi-Newton updates with positive definite matrix completion, The global convergence of partitioned BFGS on problems with convex decompositions and Lipschitzian gradients, Factorized quasi-Newton methods for nonlinear least squares problems, A nonsmooth version of the univariate optimization algorithm for locating the nearest extremum (locating extremum in nonsmooth univariate optimization), Incomplete Jacobian Newton method for nonlinear equations, An adaptive scaled BFGS method for unconstrained optimization, Analysis of a self-scaling quasi-Newton method, A quasi-Gauss-Newton method for solving nonlinear algebraic equations, A quasi-Newton method for solving nonlinear algebraic equations, A nonmonotone PSB algorithm for solving unconstrained optimization, Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results, Local convergence of quasi-Newton methods for B-differentiable equations, A new structured quasi-Newton algorithm using partial information on Hessian, On the superlinear convergence of the successive approximations method, Globally and superlinearly convergent QP-free algorithm for nonlinear constrained optimization, An accurate active set Newton algorithm for large scale bound constrained optimization., Inexact Josephy-Newton framework for generalized equations and its applications to local analysis of Newtonian methods for constrained optimization, Convergence analysis of a modified BFGS method on convex minimizations, On the relation between quadratic termination and convergence properties of minimization algorithms. Part I. 
Theory, Subspace selection algorithms to be used with the nonlinear projection methods in solving systems of nonlinear equations, A geometric method in nonlinear programming, Properties of updating methods for the multipliers in augmented Lagrangians, A family of variable metric proximal methods, Global convergence of quasi-Newton methods based on adjoint Broyden updates, Convergence of Broyden-like matrix, On averaging and representation properties of the BFGS and related secant updates, Combining trust-region techniques and Rosenbrock methods to compute stationary points, BFGS trust-region method for symmetric nonlinear equations, A new backtracking inexact BFGS method for symmetric nonlinear equations, Local convergence analysis for partitioned quasi-Newton updates, A Newton-type univariate optimization algorithm for locating the nearest extremum, Newton's method and quasi-Newton-SQP method for general \(\text{LC}^1\) constrained optimization, A Kantorovich theorem for the structured PSB update in Hilbert space., On preconditioned Uzawa methods and SOR methods for saddle-point problems, Convergence of Newton-like-iterative methods, Some investigations in a new algorithm for nonlinear optimization based on conic models of the objective function, The convergence of matrices generated by rank-2 methods from the restricted \(\beta\)-class of Broyden, Variable metric methods for unconstrained optimization and nonlinear least squares, Practical quasi-Newton methods for solving nonlinear systems, A quasi-Newton method with modification of one column per iteration, On the convergence of a process basing on the modified secant method, Rates of convergence for adaptive Newton methods, Quasi-Newton methods for solving underdetermined nonlinear simultaneous equations, Local analysis of Newton-type methods for variational inequalities and nonlinear programming, The projection method for solving nonlinear systems of equations under the most violated constraint control, Solution of nonlinear systems of equations by an optimal projection method, An inexact Newton method for nonlinear two-point boundary-value problems, A smoothing Broyden-like method for the mixed complementarity problems, Quasi-Newton methods with factorization scaling for solving sparse nonlinear systems of equations, Multivariate spectral DY-type projection method for convex constrained nonlinear monotone equations, Distributed adaptive Newton methods with global superlinear convergence, Forward-backward quasi-Newton methods for nonsmooth optimization problems, Superlinear convergence of a class of \(\theta\)-bounded rank-one update methods, An efficient DY-type spectral conjugate gradient method for system of nonlinear monotone equations with application in signal recovery, Majorizing Sequences and Error Bounds for Iterative Methods, The local convergence of the Byrd-Schnabel algorithm for constrained optimization, An efficient gradient-free projection algorithm for constrained nonlinear equations and image restoration, Local and superlinear convergence of quasi-Newton methods based on modified secant conditions, On the Convergence of a Quasi-Newton Method for Sparse Nonlinear Systems, A parallel quasi-Newton algorithm for unconstrained optimization, Quadratically and superlinearly convergent algorithms for the solution of inequality constrained minimization problems, A filter method for solving nonlinear complementarity problems based on derivative-free line search, The column-updating method for solving nonlinear equations in 
Hilbert space, Limited-memory BFGS with displacement aggregation, Rates of superlinear convergence for classical quasi-Newton methods, Inexact Newton methods on a vector supercomputer, An extension of the theory of secant preconditioners, Inexact Newton methods for solving nonsmooth equations, The inexact, inexact perturbed, and quasi-Newton methods are equivalent models, A quasi-Newton method with Wolfe line searches for multiobjective optimization, Hessian informed mirror descent, The global convergence of the BFGS method with a modified WWP line search for nonconvex functions, Unnamed Item, A modified nonmonotone BFGS algorithm for unconstrained optimization, A modified Liu-Storey-conjugate descent hybrid projection method for convex constrained nonlinear equations and image restoration, The global convergence of a modified BFGS method for nonconvex functions, A derivative-free conjugate residual method using secant condition for general large-scale nonlinear equations, Geometric notes on optimization with equality constraints, The global convergence of the BFGS method under a modified Yuan-Wei-Lu line search technique, Forward-Backward Envelope for the Sum of Two Nonconvex Functions: Further Properties and Nonmonotone Linesearch Algorithms, A Nonmonotone Filter SQP Method: Local Convergence and Numerical Results, Diagonal approximation of the Hessian by finite differences for unconstrained optimization, A diagonal quasi-Newton updating method for unconstrained optimization, The projection technique for two open problems of unconstrained optimization problems, Direct Secant Updates of Matrix Factorizations, Inertial iterative method for solving variational inequality problems of pseudo-monotone operators and fixed point problems of nonexpansive mappings in Hilbert spaces, Approximate norm descent methods for constrained nonlinear systems, Newton-type methods for non-convex optimization under inexact Hessian information, A family of modified spectral projection methods for nonlinear monotone equations with convex constraint, Unnamed Item, Shamanskii-like Levenberg-Marquardt method with a new line search for systems of nonlinear equations, Local convergence of quasi-Newton methods under metric regularity, Mise à jour de la métrique dans les méthodes de quasi-Newton réduites en optimisation avec contraintes d'égalité, Diagonal quasi-Newton methods via least change updating principle with weighted Frobenius norm, A Structured Quasi-Newton Algorithm for Optimizing with Incomplete Hessian Information, PRP-like algorithm for monotone operator equations, A superlinearly convergent algorithm for minimization without evaluating derivatives, Block BFGS Methods, Optimization Methods for Large-Scale Machine Learning, A derivative-free three-term projection algorithm involving spectral quotient for solving nonlinear monotone equations, Quasi-Newton methods: superlinear convergence without line searches for self-concordant functions, Local and superlinear convergence for truncated iterated projections methods, A new modified BFGS method for unconstrained optimization problems, A regularized limited memory BFGS method for nonconvex unconstrained minimization, The Sequential Quadratic Programming Method, On the convergence rate of imperfect minimization algorithms in Broyden'sβ-class, On local convergence of sequential quadratically-constrained quadratic-programming type methods, with an extension to variational problems, Multilevel least-change Newton-like methods for equality constrained optimization 
problems, A projection method for convex constrained monotone nonlinear equations with applications, Convergence analysis of an improved BFGS method and its application in the Muskingum model, A survey on the high convergence orders and computational convergence orders of sequences, New quasi-Newton methods for unconstrained optimization problems, Unnamed Item, A globally convergent BFGS method for nonlinear monotone equations without any merit functions, Deriving collinear scaling algorithms as extensions of quasi-Newton methods and the local convergence of DFP- and BFGS-related collinear scaling algorithms, Spectral gradient projection method for solving nonlinear monotone equations, Globalized inexact proximal Newton-type methods for nonconvex composite functions, Hölder strong metric subregularity and its applications to convergence analysis of inexact Newton methods, New results on superlinear convergence of classical quasi-Newton methods, New line search methods for unconstrained optimization, A global convergent quasi-Newton method for systems of monotone equations, A modified scaled spectral-conjugate gradient-based algorithm for solving monotone operator equations, Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search, A smoothing quasi-Newton method for solving general second-order cone complementarity problems, A modified Broyden-like quasi-Newton method for nonlinear equations, FR-type algorithm for finding approximate solutions to nonlinear monotone operator equations, Global inexact quasi-Newton method for nonlinear system of equations with constraints, Superlinear convergence of Broyden's boundedθ-class of methods, On solving double direction methods for convex constrained monotone nonlinear equations with image restoration, A limited memory \(q\)-BFGS algorithm for unconstrained optimization problems, Continuation Newton methods with the residual trust-region time-stepping scheme for nonlinear equations, A method with inertial extrapolation step for convex constrained monotone equations, Scaled nonlinear conjugate gradient methods for nonlinear least squares problems, A derivative-free iterative method for nonlinear monotone equations with convex constraints, A globally convergent BFGS method for symmetric nonlinear equations, A class of factorization update algorithm for solving systems of sparse nonlinear equations, Projection method with inertial step for nonlinear equations: application to signal recovery, On two conjectures about Dennis-Moré conditions, Generalized self-concordant functions: a recipe for Newton-type methods, A primal-dual interior-point algorithm for nonlinear least squares constrained problems, A modified BFGS type quasi-Newton method with line search for symmetric nonlinear equations problems, The modified PRP conjugate gradient algorithm under a non-descent line search and its application in the Muskingum model and image restoration problems, The Newton-arithmetic mean method for the solution of systems of nonlinear equations., An improved quasi-newton method for unconstrained optimization, Adaptive three-term family of conjugate residual methods for system of monotone nonlinear equations, On zeros of polynomial and vector solutions of associated polynomial system from Viëta theorem, A globally convergent incremental Newton method, Local properties of algorithms for minimizing nonsmooth composite functions, An efficient three-term conjugate gradient-based algorithm involving spectral quotient for solving convex 
constrained monotone nonlinear equations with applications, On the implementation of a quasi-Newton interior-point method for PDE-constrained optimization using finite element discretizations, A derivative-free three-term Hestenes–Stiefel type method for constrained nonlinear equations and image restoration, Improved convergence analysis of a smoothing Newton method for the circular cone programming, Unnamed Item, Non-asymptotic superlinear convergence of standard quasi-Newton methods, A modified conjugate gradient parameter via hybridization approach for solving large-scale systems of nonlinear equations, An efficient modified residual-based algorithm for large scale symmetric nonlinear equations by approximating successive iterated gradients, Approximate Newton Policy Gradient Algorithms, Inexact free derivative quasi-Newton method for large-scale nonlinear system of equations, A Riemannian subspace BFGS trust region method, Another hybrid approach for solving monotone operator equations and application to signal processing, Unnamed Item, A \(J\)-symmetric quasi-Newton method for minimax problems, Solving nonlinear equations with a direct Broyden method and its acceleration, A derivative‐free projection method for nonlinear equations with non‐Lipschitz operator: Application to LASSO problem, Comment on: ``A derivative-free iterative method for nonlinear monotone equations with convex constraints, Hessian averaging in stochastic Newton methods achieves superlinear convergence, Competitive secant (BFGS) methods based on modified secant relations for unconstrained optimization, A hybrid HS-LS conjugate gradient algorithm for unconstrained optimization with applications in motion control and image recovery, Greedy PSB methods with explicit superlinear convergence, Global convergence via modified self-adaptive approach for solving constrained monotone nonlinear equations with application to signal recovery problems, Approximating Higher-Order Derivative Tensors Using Secant Updates, On the Hybridization of the Double Step Length Method for Solving System of Nonlinear Equations, The strict superlinear order can be faster than the infinite order, Modified three-term derivative-free projection method for solving nonlinear monotone equations with application, An adaptive projection BFGS method for nonconvex unconstrained optimization problems, Local convergence analysis of an inexact trust-region method for nonsmooth optimization, Unnamed Item, A New Diagonal Quasi-Newton Updating Method With Scaled Forward Finite Differences Directional Derivative for Unconstrained Optimization, A new accelerated diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization, Stable factorized quasi-Newton methods for nonlinear least-squares problems, A modified BFGS method and its global convergence in nonconvex minimization, IQN: An Incremental Quasi-Newton Method with Local Superlinear Convergence Rate, Inexact perturbed Newton methods and applications to a class of Krylov solvers, Nonsmooth equation based BFGS method for solving KKT systems in mathematical programming, Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations, Unnamed Item, Three-steps modified Levenberg-Marquardt method with a new line search for systems of nonlinear equations, An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration, Unnamed Item, A Bregman Forward-Backward Linesearch Algorithm for 
Nonconvex Composite Optimization: Superlinear Convergence to Nonisolated Local Minima, Greedy Quasi-Newton Methods with Explicit Superlinear Convergence, Inexact Newton Methods and Dennis--Moré Theorems for Nonsmooth Generalized Equations, How Many Steps Still Left to $x$*?, Convergence of quasi-Newton methods for solving constrained generalized equations, One-Step Estimation with Scaled Proximal Methods



Cites Work