Several accelerated subspace minimization conjugate gradient methods based on regularization model and convergence rate analysis for nonconvex problems
DOI: 10.1007/S11075-022-01319-6 · OpenAlex: W4280609174 · MaRDI QID: Q2098802 · FDO: Q2098802
Author: Yanyan Li
Publication date: 22 November 2022
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-022-01319-6
Recommendations
- New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
- A class of accelerated subspace minimization conjugate gradient methods
- A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
- A subspace minimization conjugate gradient method based on conic model for unconstrained optimization
- A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization
Keywords: convergence; conjugate gradient method; subspace minimization; regularization model; Kurdyka-Łojasiewicz property
MSC: Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30); Methods of reduced gradient type (90C52)
Cites Work
- Algorithm 851
- CUTEr and SifDec
- Numerical Optimization
- Benchmarking optimization software with performance profiles
- On the limited memory BFGS method for large scale optimization
- Function minimization by conjugate gradients
- Two-Point Step Size Gradient Methods
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- Title not available
- Convergence of a regularized Euclidean residual algorithm for nonlinear least-squares
- Strong local convergence properties of adaptive regularized methods for nonlinear least squares
- A survey of nonlinear conjugate gradient methods
- A Subspace Study on Conjugate Gradient Algorithms
- A Modified BFGS Algorithm for Unconstrained Optimization
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization
- The Limited Memory Conjugate Gradient Method
- A Barzilai-Borwein conjugate gradient method
- Modified two-point stepsize gradient methods for unconstrained optimization
- Title not available
- An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization
- Updating the regularization parameter in the adaptive cubic regularization algorithm
- On the use of iterative methods in cubic regularization for unconstrained optimization
- A Newton-like method with mixed factorizations and cubic regularization for unconstrained minimization
- Relatively smooth convex optimization by first-order methods, and applications
- A subspace conjugate gradient algorithm for large-scale unconstrained optimization
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
- New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
- A subspace minimization conjugate gradient method based on conic model for unconstrained optimization
- On global minimizers of quadratic functions with cubic regularization
- A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
- A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization
- A class of accelerated subspace minimization conjugate gradient methods
Cited In (3)
Uses Software