Several accelerated subspace minimization conjugate gradient methods based on regularization model and convergence rate analysis for nonconvex problems
From MaRDI portal
Publication: 2098802
DOI: 10.1007/s11075-022-01319-6
OpenAlex: W4280609174
MaRDI QID: Q2098802
Publication date: 22 November 2022
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-022-01319-6
Keywords: convergence; conjugate gradient method; Kurdyka-Łojasiewicz property; subspace minimization; regularization model
Mathematics Subject Classification: Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30); Methods of reduced gradient type (90C52)
Cites Work
- A Barzilai-Borwein conjugate gradient method
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Updating the regularization parameter in the adaptive cubic regularization algorithm
- On the limited memory BFGS method for large scale optimization
- Modified two-point stepsize gradient methods for unconstrained optimization
- A subspace conjugate gradient algorithm for large-scale unconstrained optimization
- An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
- New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
- A class of accelerated subspace minimization conjugate gradient methods
- A subspace minimization conjugate gradient method based on conic model for unconstrained optimization
- On global minimizers of quadratic functions with cubic regularization
- A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
- A Newton-like method with mixed factorizations and cubic regularization for unconstrained minimization
- On the use of iterative methods in cubic regularization for unconstrained optimization
- Convergence of a Regularized Euclidean Residual Algorithm for Nonlinear Least-Squares
- A Modified BFGS Algorithm for Unconstrained Optimization
- Algorithm 851
- Two-Point Step Size Gradient Methods
- Numerical Optimization
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Subspace Study on Conjugate Gradient Algorithms
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization
- Strong local convergence properties of adaptive regularized methods for nonlinear least squares
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The Limited Memory Conjugate Gradient Method
- CUTEr and SifDec
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- Benchmarking optimization software with performance profiles.