A new class of conjugate gradient methods for unconstrained smooth optimization and absolute value equations
DOI: 10.1007/s10092-018-0298-8 · zbMATH Open: 1407.90311 · OpenAlex: W2902250637 · Wikidata: Q128826320 · MaRDI QID: Q667882
Authors: Farzad Rahpeymaii, Keyvan Amini, Tofigh Allahviranloo, M. Rostamy-Malkhalifeh
Publication date: 1 March 2019
Published in: Calcolo
Full work available at URL: https://doi.org/10.1007/s10092-018-0298-8
Recommendations
- Some three-term conjugate gradient methods for solving unconstrained optimization problems
- New three-term conjugate gradient method with guaranteed global convergence
- Some three-term conjugate gradient methods with the new direction structure
- Modified three-term conjugate gradient method and its applications
- A three-term conjugate gradient method with sufficient descent property for unconstrained optimization
Keywords: conjugate gradient method; Wolfe conditions; smooth optimization; absolute value equations; conjugate subgradient method
MSC classification: Nonlinear programming (90C30); Nonlinear ordinary differential equations and systems (34A34); Least squares and related methods for stochastic control systems (93E24)
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Title not available
- Benchmarking optimization software with performance profiles
- Function minimization by conjugate gradients
- Line search algorithms with guaranteed sufficient decrease
- Nonlinear conjugate gradient methods with sufficient descent condition for large-scale unconstrained optimization
- On three-term conjugate gradient algorithms for unconstrained optimization
- A simple three-term conjugate gradient algorithm for unconstrained optimization
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for unconstrained optimization
- A descent modified Polak-Ribière-Polyak conjugate gradient method and its global convergence
- Restart procedures for the conjugate gradient method
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Some descent three-term conjugate gradient methods and their global convergence
- Methods of conjugate gradients for solving linear systems
- Title not available
- Sparse Reconstruction by Separable Approximation
- A modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs
- Title not available
- An improved adaptive trust-region method for unconstrained optimization
- Absolute value equations
- Convergence of quasi-Newton matrices generated by the symmetric rank one update
- A nonmonotone trust region method based on simple conic models for unconstrained optimization
- A survey of nonlinear conjugate gradient methods
- Convergence Properties of Nonlinear Conjugate Gradient Methods
- New version of the three-term conjugate gradient method based on spectral scaling conjugacy condition that generates descent search direction
- A generalized Newton method for absolute value equations
- Global and finite convergence of a generalized Newton method for absolute value equations
- A theorem of the alternatives for the equation Ax + B|x| = b
- A globally and quadratically convergent method for absolute value equations
- Fixed-Point Continuation Applied to Compressed Sensing: Implementation and Numerical Experiments
- Levenberg-Marquardt method for solving systems of absolute value equations
- On an iterative method for solving absolute value equations
- The “global” convergence of Broyden-like methods with suitable line search
- A three-term conjugate gradient algorithm for large-scale unconstrained optimization problems
- Unconstrained Optimization of Real Functions in Complex Variables
- A new restarting adaptive trust-region method for unconstrained optimization
- Generalized conjugate gradient methods for \(\ell_1\) regularized convex quadratic programming with finite convergence
- Some techniques for solving absolute value equations
- A new iterative method for solving linear systems
- Minimum norm solution of the absolute value equations via simulated annealing algorithm
Cited In (6)
- A descent conjugate gradient algorithm for optimization problems and its applications in image restoration and compression sensing
- An inverse-free dynamical system for solving the absolute value equations
- A new concave minimization algorithm for the absolute value equation solution
- On finite termination of the generalized Newton method for solving absolute value equations
- A new three-term spectral subgradient method for solving absolute value equation
- An inertial inverse-free dynamical system for solving absolute value equations