New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization
From MaRDI portal
Publication:2041515
DOI: 10.1007/s11075-020-01017-1 · zbMath: 1472.90133 · arXiv: 2004.01455 · OpenAlex: W3094968720 · MaRDI QID: Q2041515
Publication date: 23 July 2021
Published in: Numerical Algorithms
Full work available at URL: https://arxiv.org/abs/2004.01455
Keywords: unconstrained optimization; conjugate gradient method; nonmonotone line search; \(p\)-regularization model; subspace technique
MSC: Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30)
Related Items (6)
- A filter sequential adaptive cubic regularization algorithm for nonlinear constrained optimization
- A regularized limited memory subspace minimization conjugate gradient method for unconstrained optimization
- A new subspace minimization conjugate gradient method for unconstrained minimization
- A sequential adaptive regularisation using cubics algorithm for solving nonlinear equality constrained optimization
- A class of accelerated subspace minimization conjugate gradient methods
- Several accelerated subspace minimization conjugate gradient methods based on regularization model and convergence rate analysis for nonconvex problems
Uses Software
Cites Work
- A Barzilai-Borwein conjugate gradient method
- An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization
- Interior-point methods for nonconvex nonlinear programming: cubic regularization
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches
- Updating the regularization parameter in the adaptive cubic regularization algorithm
- Subspace methods for large scale nonlinear equations and nonlinear least squares
- Gradient methods with adaptive step-sizes
- On the limited memory BFGS method for large scale optimization
- Conjugate gradient algorithms in nonconvex optimization
- On solving trust-region and other regularised subproblems in optimization
- Accelerating the cubic regularization of Newton's method on convex problems
- A subspace conjugate gradient algorithm for large-scale unconstrained optimization
- An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization
- An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization
- A subspace minimization conjugate gradient method based on conic model for unconstrained optimization
- On global minimizers of quadratic functions with cubic regularization
- Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization
- A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization
- Linear convergence of first order methods for non-strongly convex optimization
- Cubic regularization of Newton method and its global performance
- On the use of iterative methods in cubic regularization for unconstrained optimization
- A cubic regularization algorithm for unconstrained optimization using line search and nonmonotone techniques
- Convergence of a Regularized Euclidean Residual Algorithm for Nonlinear Least-Squares
- A Modified BFGS Algorithm for Unconstrained Optimization
- Two-Point Step Size Gradient Methods
- Theory and application of \(p\)-regularized subproblems for \(p>2\)
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Subspace Study on Conjugate Gradient Algorithms
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- Strong local convergence properties of adaptive regularized methods for nonlinear least squares
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Function minimization by conjugate gradients
- The Limited Memory Conjugate Gradient Method
- Affine conjugate adaptive Newton methods for nonlinear elastomechanics
- CUTEr and SifDec
- The conjugate gradient method in extremal problems
- Methods of conjugate gradients for solving linear systems
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Benchmarking optimization software with performance profiles.
- A new efficient conjugate gradient method for unconstrained optimization