A modified scaled memoryless BFGS preconditioned conjugate gradient algorithm for nonsmooth convex optimization
From MaRDI portal
Publication: 1716990
DOI: 10.3934/jimo.2017075 · zbMath: 1412.90113 · OpenAlex: W2758749225 · MaRDI QID: Q1716990
Publication date: 5 February 2019
Published in: Journal of Industrial and Management Optimization
Full work available at URL: https://doi.org/10.3934/jimo.2017075
Keywords: convergence analysis; Moreau-Yosida regularization; nonsmooth convex optimization; nonmonotone technique; scaled conjugate gradient method; memoryless BFGS method
MSC: Numerical mathematical programming methods (65K05); Convex programming (90C25); Nonlinear programming (90C30)
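The keywords above name the algorithm's main ingredients: the Moreau-Yosida regularization, which turns a nonsmooth convex objective into a smooth one, and a scaled memoryless BFGS matrix used to precondition a conjugate-gradient-type search direction. The following is a minimal, hedged sketch of how these pieces fit together, not the paper's exact method: the test function, step rules, safeguards, and all function names are illustrative assumptions, with f(x) = ||x||_1 chosen because its proximal map (soft-thresholding) is available in closed form.

```python
import numpy as np

# Hedged sketch (illustrative, not the paper's exact algorithm): a scaled
# memoryless BFGS preconditioned CG-type iteration on the Moreau-Yosida envelope
#   F_lam(x) = min_z { f(z) + ||z - x||^2 / (2*lam) }
# of a nonsmooth convex f. Here f(x) = ||x||_1, whose prox is soft-thresholding.

def prox_l1(x, lam):
    """Proximal map of lam * ||.||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def envelope(x, lam):
    """Moreau-Yosida envelope value F_lam(x) for f = ||.||_1."""
    p = prox_l1(x, lam)
    return np.abs(p).sum() + (x - p) @ (x - p) / (2.0 * lam)

def grad_envelope(x, lam):
    """Gradient of the (smooth) envelope: (x - prox_l1(x, lam)) / lam."""
    return (x - prox_l1(x, lam)) / lam

def memoryless_bfgs_dir(g, s, y, theta):
    """d = -H g, with H the theta-scaled memoryless BFGS matrix from (s, y)."""
    ys = y @ s
    if ys <= 1e-12:                 # curvature safeguard: fall back to -theta*g
        return -theta * g
    return (-theta * g
            + theta * ((y @ g) * s + (s @ g) * y) / ys
            - (1.0 + theta * (y @ y) / ys) * ((s @ g) / ys) * s)

def solve(x0, lam=0.5, tol=1e-8, max_iter=200):
    x = np.asarray(x0, dtype=float)
    g = grad_envelope(x, lam)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t, f0, slope = 1.0, envelope(x, lam), g @ d
        for _ in range(50):         # Armijo backtracking on the smooth envelope
            if envelope(x + t * d, lam) <= f0 + 1e-4 * t * slope:
                break
            t *= 0.5
        x_new = x + t * d
        g_new = grad_envelope(x_new, lam)
        s, y = x_new - x, g_new - g
        sy = s @ y
        theta = (s @ s) / sy if sy > 1e-12 else 1.0   # BB-type scaling parameter
        d = memoryless_bfgs_dir(g_new, s, y, theta)
        x, g = x_new, g_new
    return x

x_star = solve(np.array([3.0, -2.0, 1.5]))   # minimizer of ||x||_1 is the origin
```

Since the memoryless BFGS matrix H is positive definite whenever the curvature condition y's > 0 holds, each d is a descent direction for the envelope, so the Armijo backtracking loop terminates; the safeguard resets to scaled steepest descent otherwise.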
Related Items (1)
Uses Software
Cites Work
- Unnamed Item
- Unnamed Item
- A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations
- An ODE-like nonmonotone method for nonsmooth convex optimization
- An approximate quasi-Newton bundle-type method for nonsmooth optimization
- A trust region algorithm with adaptive cubic regularization methods for nonsmooth convex minimization
- Globally convergent limited memory bundle method for large-scale nonsmooth optimization
- Proximal quasi-Newton methods for nondifferentiable convex optimization
- Globally convergent BFGS method for nonsmooth convex optimization
- On the superlinear convergence of the variable metric proximal point algorithm using Broyden and BFGS matrix secant updating
- A trust region method for nonsmooth convex optimization
- A family of variable metric proximal methods
- A quasi-second-order proximal bundle algorithm
- Gradient trust region algorithm with limited memory BFGS update for nonsmooth convex minimization
- Conjugate gradient type methods for the nondifferentiable convex minimization
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization
- Scaled conjugate gradient algorithms for unconstrained optimization
- Optimization theory and methods. Nonlinear programming
- A class of descent four-term extension of the Dai-Liao conjugate gradient method based on the scaled memoryless BFGS update
- The Barzilai and Borwein gradient method with nonmonotone line search for nonsmooth convex optimization problems
- A descent algorithm for nonsmooth convex optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Two-Point Step Size Gradient Methods
- On the Convergence of a New Conjugate Gradient Algorithm
- Practical Aspects of the Moreau--Yosida Regularization: Theoretical Preliminaries
- A derivative-free line search and global convergence of Broyden-like method for nonlinear equations
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- Convergence Analysis of Some Algorithms for Solving Nonsmooth Equations
- A Globally and Superlinearly Convergent Algorithm for Nonsmooth Convex Minimization
- Numerical methods for nondifferentiable convex optimization
- New limited memory bundle method for large-scale nonsmooth optimization
- A spectral conjugate gradient method for unconstrained optimization
- Benchmarking optimization software with performance profiles