Conjugate gradient type methods for the nondifferentiable convex minimization
From MaRDI portal
Publication:1941196
DOI: 10.1007/s11590-011-0437-5
zbMath: 1287.90049
OpenAlex: W2009182781
MaRDI QID: Q1941196
Publication date: 12 March 2013
Published in: Optimization Letters
Full work available at URL: https://doi.org/10.1007/s11590-011-0437-5
Related Items
- Convergence of the proximal bundle algorithm for nonsmooth nonconvex optimization problems
- An ODE-like nonmonotone method for nonsmooth convex optimization
- Multivariate spectral gradient algorithm for nonsmooth convex optimization problems
- A nonlinear conjugate gradient method using inexact first-order information
- An efficient conjugate gradient method with strong convergence properties for non-smooth optimization
- A modified scaled memoryless BFGS preconditioned conjugate gradient algorithm for nonsmooth convex optimization
- A conceptual conjugate epi-projection algorithm of convex optimization: superlinear, quadratic and finite convergence
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- A memory gradient method for non-smooth convex optimization
Uses Software
Cites Work
- Smooth minimization of non-smooth functions
- A trust region algorithm with adaptive cubic regularization methods for nonsmooth convex minimization
- Global convergence of some modified PRP nonlinear conjugate gradient methods
- Globally convergent limited memory bundle method for large-scale nonsmooth optimization
- Discrete gradient method: Derivative-free method for nonsmooth optimization
- A new trust region algorithm for nonsmooth convex minimization
- Nonlinear analysis and variational problems. In Honor of George Isac
- Nonsmooth approach to optimization problems with equilibrium constraints. Theory, applications and numerical results
- A bundle-Newton method for nonsmooth unconstrained minimization
- Proximal quasi-Newton methods for nondifferentiable convex optimization
- Convergence of some algorithms for convex minimization
- A preconditioning proximal Newton method for nondifferentiable convex optimization
- An implementation of Shor's r-algorithm
- Globally convergent BFGS method for nonsmooth convex optimization
- On the superlinear convergence of the variable metric proximal point algorithm using Broyden and BFGS matrix secant updating
- A trust region method for nonsmooth convex optimization
- A family of variable metric proximal methods
- A quasi-second-order proximal bundle algorithm
- An approximate subgradient algorithm for unconstrained nonsmooth, nonconvex optimization
- A Quasi-Newton Approach to Nonsmooth Convex Optimization Problems in Machine Learning
- A descent algorithm for nonsmooth convex optimization
- A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence
- Encyclopedia of Optimization
- The Speed of Shor's R-algorithm
- A quasisecant method for minimizing nonsmooth functions
- Practical Aspects of the Moreau-Yosida Regularization: Theoretical Preliminaries
- Quasi-Newton Bundle-Type Methods for Nondifferentiable Convex Optimization
- A Globally and Superlinearly Convergent Algorithm for Nonsmooth Convex Minimization
- Numerical methods for nondifferentiable convex optimization
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Excessive Gap Technique in Nonsmooth Convex Minimization
- A Two-Term PRP-Based Descent Method