On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
From MaRDI portal
Publication: 887119
DOI: 10.1007/s10957-015-0724-x
zbMath: 1327.90394
OpenAlex: W2042877555
MaRDI QID: Q887119
Publication date: 28 October 2015
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/s10957-015-0724-x
Keywords: unconstrained optimization; eigenvalue; large-scale optimization; condition number; memoryless quasi-Newton update
MSC classification: Numerical computation of eigenvalues and eigenvectors of matrices (65F15); Methods of quasi-Newton type (90C53); Numerical methods based on nonlinear programming (49M37)
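To make the subject of the record concrete, the following is a minimal sketch of a self-scaling memoryless BFGS step, the family of updates whose scaling parameters the paper analyzes. The Oren-Spedicato choice of the scaling parameter shown here is an illustrative assumption, not necessarily the optimal parameter derived in the paper; the function names are hypothetical.

```python
import numpy as np

def ssml_bfgs_matrix(s, y, tau=None):
    """Self-scaling memoryless BFGS inverse-Hessian approximation.

    The approximation is rebuilt each iteration from a scaled identity
    tau*I (no stored history, hence "memoryless") and one BFGS update:
        H+ = (I - rho*s*y') (tau*I) (I - rho*y*s') + rho*s*s',
    with rho = 1/(s'y).

    s   : step difference x_{k+1} - x_k
    y   : gradient difference g_{k+1} - g_k (requires s'y > 0)
    tau : scaling parameter; defaults to the Oren-Spedicato choice
          tau = s'y / y'y (an assumed, classical choice).
    """
    sty = s @ y
    if tau is None:
        tau = sty / (y @ y)  # Oren-Spedicato scaling (assumption)
    rho = 1.0 / sty
    V = np.eye(s.size) - rho * np.outer(s, y)
    # (I - rho*y*s') is V transposed, so the update is tau*V*V' + rho*s*s'.
    return tau * (V @ V.T) + rho * np.outer(s, s)

def ssml_bfgs_direction(g, s, y, tau=None):
    """Search direction d = -H+ g for the next line search."""
    return -ssml_bfgs_matrix(s, y, tau) @ g
```

Whenever the curvature condition s'y > 0 holds, the resulting matrix satisfies the secant equation H+ y = s exactly and is positive definite, so the direction is a descent direction; the choice of tau governs the eigenvalue spread (condition number) of H+, which is the quantity the paper's optimality analysis targets.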
Related Items (28)
Related Items (28)
- A hybrid scaling parameter for the scaled memoryless BFGS method based on the ℓ∞ matrix norm
- An accelerated three-term conjugate gradient method with sufficient descent condition and conjugacy condition
- A modified scaling parameter for the memoryless BFGS updating formula
- Two accelerated nonmonotone adaptive trust region line search methods
- A diagonal quasi-Newton updating method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
- Nonmonotone diagonally scaled limited-memory BFGS methods with application to compressive sensing based on a penalty model
- A hybrid quasi-Newton method with application in sparse recovery
- A double-parameter scaling Broyden-Fletcher-Goldfarb-Shanno method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
- New conjugate gradient algorithms based on self-scaling memoryless Broyden-Fletcher-Goldfarb-Shanno method
- Nonmonotone quasi-Newton-based conjugate gradient methods with application to signal processing
- A modified scaled memoryless symmetric rank-one method
- Eigenvalue analyses on the memoryless Davidon-Fletcher-Powell method based on a spectral secant equation
- A nonmonotone ADMM-based diagonal quasi-Newton update with application to the compressive sensing problem
- A restart scheme for the memoryless BFGS method
- An approximate Newton-type proximal method using symmetric rank-one updating formula for minimizing the nonsmooth composite functions
- A diagonally scaled Newton-type proximal method for minimization of the models with nonsmooth composite cost functions
- A spectral conjugate gradient method for solving large-scale unconstrained optimization
- An adaptive nonmonotone trust region algorithm
- A new accelerated diagonal quasi-Newton updating method with scaled forward finite differences directional derivative for unconstrained optimization
- Analysis of the maximum magnification by the scaled memoryless DFP updating formula with application to compressive sensing
- Two-parameter scaled memoryless BFGS methods with a nonmonotone choice for the initial step length
- A modified nonmonotone trust region line search method
- A linear hybridization of the Hestenes-Stiefel method and the memoryless BFGS technique
- An augmented memoryless BFGS method based on a modified secant equation with application to compressed sensing
- A new CG algorithm based on a scaled memoryless BFGS update with adaptive search strategy, and its application to large-scale unconstrained optimization problems
- The new spectral conjugate gradient method for large-scale unconstrained optimisation
- An adaptive nonmonotone trust region method based on a modified scalar approximation of the Hessian in the successive quadratic subproblems
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
Uses Software
Cites Work
- Two modified scaled nonlinear conjugate gradient methods
- A note on the global convergence theorem of the scaled conjugate gradient algorithms proposed by Andrei
- A modified scaled conjugate gradient method with global convergence for nonconvex functions
- Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization
- A modified BFGS algorithm based on a hybrid secant equation
- New conjugacy condition and related new conjugate gradient methods for unconstrained optimization
- A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Two new conjugate gradient methods based on modified secant equations
- New quasi-Newton equation and related methods for unconstrained optimization
- A quadratic hybridization of Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
- Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
- Scaled conjugate gradient algorithms for unconstrained optimization
- New quasi-Newton methods for unconstrained optimization problems
- Optimization theory and methods. Nonlinear programming
- The global convergence of self-scaling BFGS algorithm with non-monotone line search for unconstrained nonconvex optimization problems
- A Modified BFGS Algorithm for Unconstrained Optimization
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- A scaled nonlinear conjugate gradient algorithm for unconstrained optimization
- Two-Point Step Size Gradient Methods
- Self-Scaling Variable Metric (SSVM) Algorithms
- Optimal conditioning of self-scaling variable Metric algorithms
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization
- Convergence Conditions for Ascent Methods
- A modified BFGS method and its global convergence in nonconvex minimization
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Benchmarking optimization software with performance profiles.