Nonmonotone quasi-Newton-based conjugate gradient methods with application to signal processing
DOI: 10.1007/s11075-022-01477-7
OpenAlex: W4311524465
MaRDI QID: Q6109887
Fatemeh Dargahi, Zohre Aminifard, Saman Babaie-Kafaki
Publication date: 31 July 2023
Published in: Numerical Algorithms
Full work available at URL: https://doi.org/10.1007/s11075-022-01477-7
Keywords: nonlinear optimization; conjugate gradient method; nonmonotone line search; sparse recovery; nonnegative matrix factorization; forgetting factor
MSC classification: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Signal theory (characterization, reconstruction, filtering, etc.) (94A12); Numerical analysis (65-XX)
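The indexed paper combines a quasi-Newton-based conjugate gradient direction with a nonmonotone line search; its exact update formulas are not reproduced on this page. Purely as a generic illustration of those two keywords, the Python sketch below pairs a standard PRP+ conjugate gradient direction with a Grippo-Lampariello-Lucidi style nonmonotone Armijo backtracking rule. The function name nonmonotone_cg, the PRP+ choice, and all parameter defaults are illustrative assumptions, not the authors' method.

import numpy as np

def nonmonotone_cg(f, grad, x0, max_iter=500, memory=10,
                   sigma=1e-4, rho=0.5, tol=1e-6):
    """Generic sketch (not the paper's algorithm): PRP+ conjugate gradient
    direction with a nonmonotone Armijo line search that compares against
    the worst of the last 'memory' function values."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                       # first direction: steepest descent
    f_hist = [f(x)]              # recent f-values for the nonmonotone test
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Nonmonotone Armijo backtracking: accept a step that decreases f
        # relative to max_{0<=j<memory} f(x_{k-j}) rather than f(x_k) alone.
        f_ref = max(f_hist[-memory:])
        t = 1.0
        while f(x + t * d) > f_ref + sigma * t * g.dot(d):
            t *= rho
        x_new = x + t * d
        g_new = grad(x_new)
        # PRP conjugacy parameter, clipped at zero (PRP+).
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:    # safeguard: fall back to steepest descent
            d = -g_new
        x, g = x_new, g_new
        f_hist.append(f(x))
    return x

# Example: minimize the Rosenbrock function from a standard starting point.
if __name__ == "__main__":
    rosen = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
    rosen_grad = lambda z: np.array([
        -2 * (1 - z[0]) - 400 * z[0] * (z[1] - z[0]**2),
        200 * (z[1] - z[0]**2),
    ])
    print(nonmonotone_cg(rosen, rosen_grad, np.array([-1.2, 1.0])))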
Cites Work
- Unnamed Item
- Unnamed Item
- Smooth minimization of non-smooth functions
- A self-adaptive three-term conjugate gradient method for monotone nonlinear equations with convex constraints
- Modified subspace Barzilai-Borwein gradient method for non-negative matrix factorization
- Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization
- Conjugate gradient methods based on secant conditions that generate descent search directions for unconstrained optimization
- A new generalized shrinkage conjugate gradient method for sparse recovery
- A nonmonotone conjugate gradient algorithm for unconstrained optimization
- Exponential convergence of recursive least squares with exponential forgetting factor
- On optimality of the parameters of self-scaling memoryless quasi-Newton updating formulae
- A truncated Newton method with non-monotone line search for unconstrained optimization
- A class of nonmonotone conjugate gradient methods for unconstrained optimization
- Robust regression: Asymptotics, conjectures and Monte Carlo
- Nonmonotone trust region method for solving optimization problems
- A new modified nonmonotone adaptive trust region method for unconstrained optimization
- A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems
- A class of modified FR conjugate gradient method and applications to non-negative matrix factorization
- A structured quasi-Newton algorithm with nonmonotone search strategy for structured NLS problems and its application in robotic motion control
- Diagonally scaled memoryless quasi-Newton methods with application to compressed sensing
- Global convergence via descent modified three-term conjugate gradient projection algorithm with applications to signal recovery
- A modified conjugate gradient method for monotone nonlinear equations with convex constraints
- On efficiency of nonmonotone Armijo-type line searches
- A conjugate gradient algorithm and its applications in image restoration
- A modified spectral conjugate gradient method with global convergence
- A modified descent Polak-Ribière-Polyak conjugate gradient method with global convergence property for nonconvex functions
- A nonmonotone trust-region line search method for large-scale unconstrained optimization
- An inexact line search approach using modified nonmonotone strategy for unconstrained optimization
- Optimization theory and methods. Nonlinear programming
- Large sparse signal recovery by conjugate gradient algorithm based on smoothing technique
- NESTA: A Fast and Accurate First-Order Method for Sparse Recovery
- Algorithm 851
- From Sparse Solutions of Systems of Equations to Sparse Modeling of Signals and Images
- Two-Point Step Size Gradient Methods
- Global Convergence Properties of Conjugate Gradient Methods for Optimization
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- A diagonal quasi-Newton updating method based on minimizing the measure function of Byrd and Nocedal for unconstrained optimization
- An adaptive nonmonotone trust region algorithm
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- An Assessment of Nonmonotone Linesearch Techniques for Unconstrained Optimization
- Two modified hybrid conjugate gradient methods based on a hybrid secant equation
- A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Learning the parts of objects by non-negative matrix factorization
- Gradient-based variable forgetting factor RLS algorithm in time-varying environments
- Some descent three-term conjugate gradient methods and their global convergence
- CUTEr and SifDec
- Two hybrid nonlinear conjugate gradient methods based on a modified secant equation
- A descent family of Dai–Liao conjugate gradient methods
- Methods of conjugate gradients for solving linear systems
- A class of nonmonotone spectral memory gradient method
- New conjugacy conditions and related nonlinear conjugate gradient methods
- Benchmarking optimization software with performance profiles
- On the nonmonotone line search