A class of diagonal quasi-Newton methods for large-scale convex minimization
DOI: 10.1007/S40840-015-0117-1 · zbMATH Open: 1357.65075 · OpenAlex: W2059444119 · MaRDI QID: Q503542
Publication date: 13 January 2017
Published in: Bulletin of the Malaysian Mathematical Sciences Society. Second Series
Full work available at URL: http://psasir.upm.edu.my/id/eprint/43466/1/abstract01.pdf
Recommendations
- Diagonal quasi-Newton methods via least change updating principle with weighted Frobenius norm
- Scaling on diagonal quasi-Newton update for large-scale unconstrained optimization
- Diagonal quasi-Newton method via variational principle under generalized Frobenius norm
- scientific article; zbMATH DE number 6270804
- A diagonal-sparse quasi-Newton method for unconstrained optimization problem
Keywords: numerical results; scaling; quasi-Newton methods; diagonal updating; global \(R\)-linear convergence; large-scale convex minimization
MSC: Numerical mathematical programming methods (65K05) · Convex programming (90C25) · Large-scale problems in mathematical programming (90C06) · Methods of quasi-Newton type (90C53)
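The keywords above point at diagonal quasi-Newton updating, and the cited works include "The Quasi-Cauchy Relation and Diagonal Updating". As a minimal illustration (not the paper's exact algorithm), the sketch below maintains a diagonal matrix \(D_k\) satisfying the quasi-Cauchy (weak secant) relation \(s^\top D_{k+1} s = s^\top y\) via a least-change update, combined with Armijo backtracking; the helper names `quasi_cauchy_update` and `minimize` are hypothetical, chosen for this example.

```python
def quasi_cauchy_update(d, s, y):
    """Least-change update of the diagonal d so that s^T D s = s^T y.

    Solves min ||D_new - D||_F subject to the weak secant equation,
    which gives d_i <- d_i + mu * s_i^2 with mu = (s^T y - s^T D s) / sum(s_j^4).
    """
    sDs = sum(di * si * si for di, si in zip(d, s))
    sy = sum(si * yi for si, yi in zip(s, y))
    s4 = sum(si ** 4 for si in s)
    if s4 == 0.0:          # step too small to carry curvature information
        return d
    mu = (sy - sDs) / s4
    return [di + mu * si * si for di, si in zip(d, s)]

def minimize(f, grad, x0, iters=500):
    """Diagonal quasi-Newton iteration x <- x - t * D^{-1} grad(x)."""
    d = [1.0] * len(x0)    # D_0 = identity
    x = list(x0)
    g = grad(x)
    for _ in range(iters):
        # Search direction p = -D^{-1} g, flooring d_i to keep D positive.
        p = [-gi / max(di, 1e-8) for gi, di in zip(g, d)]
        gp = sum(gi * pi for gi, pi in zip(g, p))   # directional derivative (< 0)
        t, fx = 1.0, f(x)
        for _ in range(60):                          # Armijo backtracking
            x_new = [xi + t * pi for xi, pi in zip(x, p)]
            if f(x_new) <= fx + 1e-4 * t * gp:
                break
            t *= 0.5
        g_new = grad(x_new)
        s = [a - b for a, b in zip(x_new, x)]
        y = [a - b for a, b in zip(g_new, g)]
        d = quasi_cauchy_update(d, s, y)
        x, g = x_new, g_new
    return x

# Example: a convex quadratic f(x) = sum(c_i * x_i^2), gradient 2 * c_i * x_i.
c = [1.0, 2.0, 5.0]
f = lambda x: sum(ci * xi * xi for ci, xi in zip(c, x))
grad = lambda x: [2.0 * ci * xi for ci, xi in zip(c, x)]
x_star = minimize(f, grad, [1.0, 1.0, 1.0])
```

Because only a diagonal (here, a plain list) is stored and updated, the cost per iteration is \(O(n)\) in both time and memory, which is what makes this family attractive for large-scale problems.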
Cites Work
- Testing Unconstrained Optimization Software
- Title not available
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- Function minimization by conjugate gradients
- Two-Point Step Size Gradient Methods
- Line search algorithms with guaranteed sufficient decrease
- An unconstrained optimization test functions collection
- Title not available
- A Nonmonotone Line Search Technique for Newton’s Method
- Title not available
- Sizing and Least-Change Secant Methods
- Global Convergence of a Class of Quasi-Newton Methods on Convex Problems
- Title not available
- Self-Scaling Variable Metric (SSVM) Algorithms
- Global and superlinear convergence of a restricted class of self-scaling methods with inexact line searches, for convex functions
- The Quasi-Cauchy Relation and Diagonal Updating
- An improved multi-step gradient-type method for large scale optimization
- Modified two-point stepsize gradient methods for unconstrained optimization
- A new gradient method via quasi-Cauchy relation which guarantees descent
- Numerical experience with a class of self-scaling quasi-Newton algorithms
- A new two-step gradient-type method for large-scale unconstrained optimization
Cited In (3)
- Global convergence of a family of modified BFGS methods under a modified weak-Wolfe-Powell line search for nonconvex functions
- Diagonal bundle method with convex and concave updates for large-scale nonconvex and nonsmooth optimization
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing