Broyden's quasi-Newton methods for a nonlinear system of equations and unconstrained optimization: a review and open problems
Publication: 2926061
DOI: 10.1080/10556788.2013.856909 · zbMath: 1308.90202 · OpenAlex: W1988431651 · Wikidata: Q115005285 · Scholia: Q115005285 · MaRDI QID: Q2926061
Mehiddin Al-Baali, Emilio Spedicato, Francesca Maggioni
Publication date: 29 October 2014
Published in: Optimization Methods and Software
Full work available at URL: https://doi.org/10.1080/10556788.2013.856909
Keywords: unconstrained optimization; quasi-Newton methods; finite termination; ABS methods; line search technique; nonlinear algebraic equations; optimal conditioning; modified methods
Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Optimality conditions and duality in mathematical programming (90C46); Methods of quasi-Newton type (90C53)
Related Items
- Convergence properties of the Broyden-like method for mixed linear-nonlinear systems of equations
- A conjugate gradient algorithm for large-scale unconstrained optimization problems and nonlinear equations
- A descent hybrid conjugate gradient method based on the memoryless BFGS update
- A modified Hager-Zhang conjugate gradient method with optimal choices for solving monotone nonlinear equations
- A positive spectral gradient-like method for large-scale nonlinear monotone equations
- Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model
- A hybrid quasi-Newton method with application in sparse recovery
- A modified quasi-Newton method for nonlinear equations
- A descent Dai-Liao conjugate gradient method for nonlinear equations
- An alternative formulation of the differential quadrature method with a neural network perspective
- Accelerated Dai-Liao projection method for solving systems of monotone nonlinear equations with application to image deblurring
- A restart scheme for the memoryless BFGS method
- Some modified Hestenes-Stiefel conjugate gradient algorithms with application in image restoration
- A modified Newton-like method for nonlinear equations
- An improved three-term derivative-free method for solving nonlinear equations
- Continuous Variable Neighborhood Search (C-VNS) for Solving Systems of Nonlinear Equations
- Broyden's Method for Nonlinear Eigenproblems
- Spectrum-based Stability Analysis and Stabilization of Time-periodic Time-delay Systems
- Exploiting damped techniques for nonlinear conjugate gradient methods
- Implementing and modifying Broyden class updates for large scale optimization
- Spectral modified Polak-Ribière-Polyak projection conjugate gradient method for solving monotone systems of nonlinear equations
- An efficient modified AZPRP conjugate gradient method for large-scale unconstrained optimization problem
- On the order of convergence of Broyden's method. Faster convergence on mixed linear-nonlinear systems of equations and a conjecture on the q-order
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
- A NEW DERIVATIVE-FREE CONJUGATE GRADIENT METHOD FOR LARGE-SCALE NONLINEAR SYSTEMS OF EQUATIONS
Cites Work
- A combined class of self-scaling and modified quasi-Newton methods
- Analysis of a self-scaling quasi-Newton method
- On the limited memory BFGS method for large scale optimization
- Matrix multiplication via arithmetic progressions
- Partitioned variable metric updates for large structured optimization problems
- A bound to the condition number of canonical rank-two corrections and applications to the variable metric method
- On some classes of variationally derived quasi-Newton methods for systems of nonlinear algebraic equations
- Variational quasi-Newton methods for unconstrained optimization
- Sizing the BFGS and DFP updates: Numerical study
- Representations of quasi-Newton matrices and their use in limited memory methods
- Impact of partial separability on large-scale optimization
- Global and superlinear convergence of a restricted class of self-scaling methods with inexact line searches, for convex functions
- Matrix algebras in quasi-Newton methods for unconstrained minimization
- Variable metric methods for unconstrained optimization and nonlinear least squares
- Practical quasi-Newton methods for solving nonlinear systems
- Local and superlinear convergence of quasi-Newton methods based on modified secant conditions
- On variable-metric algorithms
- Stability of Huang's update for the conjugate gradient method
- On the performance of switching BFGS/SR1 algorithms for unconstrained optimization
- Approximate invariant subspaces and quasi-Newton optimization methods
- Numerical methods for large-scale nonlinear optimization
- A class of rank-one positive definite quasi-Newton updates for unconstrained minimization
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- Quasi-Newton Algorithms with Updates from the Preconvex Part of Broyden's Family
- Some Convergence Properties of Broyden’s Method
- A note about sparsity exploiting quasi-Newton updates
- On the Behavior of Broyden’s Class of Quasi-Newton Methods
- Variable Metric Method for Minimization
- Self-Scaling Variable Metric (SSVM) Algorithms
- Algorithm 500: Minimization of Unconstrained Multivariate Functions [E4]
- Optimal conditioning of self-scaling variable metric algorithms
- Matrix conditioning and nonlinear optimization
- Algorithms for nonlinear constraints that use Lagrangian functions
- Numerical Optimization
- Trust Region Methods
- On Sizing and Shifting the BFGS Update within the Sized-Broyden Family of Secant Updates
- A Theoretical and Experimental Study of the Symmetric Rank-One Update
- On measure functions for the self-scaling updating formulae for quasi-Newton methods
- On the Local and Superlinear Convergence of Quasi-Newton Methods
- Why Broyden’s Nonsymmetric Method Terminates on Linear Equations
- Historical Development of the Newton–Raphson Method
- A Rapidly Convergent Descent Method for Minimization
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- A survey of quasi-Newton equations and quasi-Newton methods for optimization