A three-term conjugate gradient method with a random parameter for large-scale unconstrained optimization and its application in regression model
From MaRDI portal
Publication: 6137748
DOI: 10.11650/TJM/230503
OpenAlex: W4379467405
MaRDI QID: Q6137748
Yueting Yang, Xue Zhang, Mingyuan Cao, Guoling Zhou
Publication date: 16 January 2024
Published in: Taiwanese Journal of Mathematics
Full work available at URL: https://doi.org/10.11650/tjm/230503
Keywords: regression model; global convergence; three-term conjugate gradient; large-scale unconstrained optimization; random parameter
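For orientation, a three-term conjugate gradient method extends the classical two-term update d_{k+1} = -g_{k+1} + beta_k d_k with a third correction term so that the search direction satisfies a descent condition. The sketch below is a generic three-term variant with a PRP-type beta and a third term chosen to enforce d_k^T g_k = -||g_k||^2; it is an illustrative assumption, not the random-parameter method of the cited paper, and the function and parameter names are hypothetical.

```python
import numpy as np

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic three-term CG sketch (illustrative, not the paper's method).

    Direction update: d_{k+1} = -g_{k+1} + beta_k * d_k - theta_k * y_k,
    with beta_k = g_{k+1}^T y_k / ||g_k||^2 (PRP-type) and
    theta_k = g_{k+1}^T d_k / ||g_k||^2, which makes the last two terms
    cancel in d_{k+1}^T g_{k+1}, giving d_{k+1}^T g_{k+1} = -||g_{k+1}||^2.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        gnorm2 = g @ g
        if np.sqrt(gnorm2) < tol:
            break
        # Simple backtracking Armijo line search (stand-in for Wolfe).
        alpha, fx, slope = 1.0, f(x), g @ d
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                      # gradient difference
        beta = (g_new @ y) / gnorm2        # PRP-type parameter
        theta = (g_new @ d) / gnorm2       # third-term coefficient
        d = -g_new + beta * d - theta * y  # three-term direction
        x, g = x_new, g_new
    return x

# Usage on a small strictly convex quadratic (assumed test problem):
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
sol = three_term_cg(f, grad, np.zeros(2))
```

The cancellation built into theta_k is what gives the sufficient-descent property independently of the line search; the paper's contribution, per its title, is instead a randomized choice of parameter, which this sketch does not reproduce.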
Cites Work
- CUTEr and SifDec
- Benchmarking optimization software with performance profiles
- A simple three-term conjugate gradient algorithm for unconstrained optimization
- Technical Note—A Modified Conjugate Gradient Algorithm
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Global convergence properties of nonlinear conjugate gradient methods with modified secant condition
- A nonlinear conjugate gradient method based on the MBFGS secant condition
- A Subspace Study on Conjugate Gradient Algorithms
- Two new conjugate gradient methods based on modified secant equations
- A New Dai-Liao Conjugate Gradient Method with Optimal Parameter Choice
- A descent family of Dai–Liao conjugate gradient methods
- The new spectral conjugate gradient method for large-scale unconstrained optimisation
- Nonlinear conjugate gradient methods for unconstrained optimization
- A new accelerated conjugate gradient method for large-scale unconstrained optimization
- A descent hybrid conjugate gradient method based on the memoryless BFGS update
- A Dai-Liao conjugate gradient algorithm with clustering of eigenvalues
- An inexact accelerated stochastic ADMM for separable convex optimization
- An adaptive three-term conjugate gradient method based on self-scaling memoryless BFGS matrix
- A three-term conjugate gradient algorithm using subspace for large-scale unconstrained optimization
- Sufficient descent Riemannian conjugate gradient methods
- Some modified Hestenes-Stiefel conjugate gradient algorithms with application in image restoration
- Smoothing strategy along with conjugate gradient algorithm for signal reconstruction
- Signal reconstruction by conjugate gradient algorithm based on smoothing \(l_1\)-norm