On Quasi-Newton Forward-Backward Splitting: Proximal Calculus and Convergence
DOI: 10.1137/18M1167152
zbMath: 1461.65128
arXiv: 1801.08691
OpenAlex: W4289019726
MaRDI QID: Q5235484
Stephen R. Becker, Jalal Fadili, Peter Ochs
Publication date: 11 October 2019
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1801.08691
Mathematics Subject Classification:
- Numerical mathematical programming methods (65K05)
- Convex programming (90C25)
- Sensitivity, stability, parametric optimization (90C31)
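For orientation only: the paper concerns quasi-Newton (variable-metric) variants of forward-backward splitting for minimizing f(x) + g(x), with f smooth and g having an inexpensive proximal operator. The sketch below shows the classical forward-backward (proximal gradient) iteration on a LASSO-type instance, not the scaled quasi-Newton scheme analyzed by Becker, Fadili, and Ochs; the step size, problem data, and helper names are illustrative assumptions.

```python
# Minimal sketch of forward-backward splitting (proximal gradient) for
#   min_x f(x) + g(x),  f(x) = 0.5*||Ax - b||^2 (smooth),  g(x) = lam*||x||_1.
# Illustrates the baseline iteration only, NOT the quasi-Newton/variable-metric
# method of the paper. All data and names below are illustrative assumptions.
import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of tau*||.||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def forward_backward(A, b, lam, n_iter=500):
    """Plain forward-backward splitting with constant step 1/L, L = ||A||_2^2."""
    x = np.zeros(A.shape[1])
    L = np.linalg.norm(A, 2) ** 2                  # Lipschitz constant of grad f
    t = 1.0 / L                                    # step size
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                   # forward (gradient) step on f
        x = soft_threshold(x - t * grad, t * lam)  # backward (proximal) step on g
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 100))
    x_true = np.zeros(100)
    x_true[:5] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(40)
    x_hat = forward_backward(A, b, lam=0.1)
    print("nonzeros recovered:", np.count_nonzero(np.abs(x_hat) > 1e-3))
```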
Related Items
- An inexact successive quadratic approximation method for a class of difference-of-convex optimization problems
- The developments of proximal point algorithms
- A hybrid quasi-Newton method with application in sparse recovery
- COAP 2021 best paper prize
- Newton acceleration on manifolds identified by proximal gradient methods
- Smooth over-parameterized solvers for non-smooth structured optimization
- Deep-plug-and-play proximal Gauss-Newton method with applications to nonlinear, ill-posed inverse problems
- Inexact proximal DC Newton-type method for nonconvex composite functions
- Understanding the convergence of the preconditioned PDHG method: a view of indefinite proximal ADMM
- Stochastic variable metric proximal gradient with variance reduction for non-convex composite optimization
- Minimizing oracle-structured composite functions
- An approximate Newton-type proximal method using symmetric rank-one updating formula for minimizing the nonsmooth composite functions
- Proximal gradient/semismooth Newton methods for projection onto a polyhedron via the duality-gap-active-set strategy
- Template-based image reconstruction facing different topologies
- A proximal quasi-Newton method based on memoryless modified symmetric rank-one formula
- Convergence of Inexact Forward--Backward Algorithms Using the Forward--Backward Envelope
- PNKH-B: A Projected Newton--Krylov Method for Large-Scale Bound-Constrained Optimization
- Inexact proximal memoryless quasi-Newton methods based on the Broyden family for minimizing composite functions
- Globalized inexact proximal Newton-type methods for nonconvex composite functions
- Adaptive FISTA for Nonconvex Optimization
- Scaled, Inexact, and Adaptive Generalized FISTA for Strongly Convex Optimization
Uses Software
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Nonsmooth optimization via quasi-Newton methods
- Variable metric forward-backward algorithm for minimizing the sum of a differentiable function and a convex function
- Efficient evaluation of scaled proximal operators
- Variable metric quasi-Fejér monotonicity
- A limited memory steepest descent method
- A new steplength selection for scaled gradient methods with application to image deblurring
- Tame functions are semismooth
- A duality principle for non-convex optimisation and the calculus of variations
- A sparse matrix arithmetic based on \({\mathfrak H}\)-matrices. I: Introduction to \({\mathfrak H}\)-matrices
- Newton's method for a class of nonsmooth functions
- Introductory lectures on convex optimization. A basic course.
- Geometric categories and o-minimal structures
- Nonsmoothness and a variable metric method
- Adaptive restart for accelerated gradient schemes
- Forward-backward quasi-Newton methods for nonsmooth optimization problems
- A nonsmooth version of Newton's method
- Non-smooth non-convex Bregman minimization: unification and new algorithms
- Variable Metric Inexact Line-Search-Based Methods for Nonsmooth Optimization
- The Rate of Convergence of Nesterov's Accelerated Forward-Backward Method is Actually Faster Than $1/k^2$
- A Variable Metric Extension of the Forward–Backward–Forward Algorithm for Monotone Operators
- A Generalized Forward-Backward Splitting
- A Quasi-Newton Approach to Nonsmooth Convex Optimization Problems in Machine Learning
- Proximal Splitting Methods in Signal Processing
- Proximal Newton-Type Methods for Minimizing Composite Functions
- Remark on “Algorithm 778: L-BFGS-B: Fortran subroutines for large-scale bound constrained optimization”
- A Fast Algorithm for Sparse Reconstruction Based on Shrinkage, Subspace Optimization, and Continuation
- Newton's Method for B-Differentiable Equations
- A New Active Set Algorithm for Box Constrained Optimization
- New convergence results for the scaled gradient projection method
- A scaled gradient projection method for constrained image deblurring
- Extension of Newton and quasi-Newton methods to systems of PC^1 equations
- Two-Point Step Size Gradient Methods
- Algorithm 778: L-BFGS-B
- Convergence Rates in Forward--Backward Splitting
- Semismooth Newton Methods for Operator Equations in Function Spaces
- Smoothing Methods and Semismooth Methods for Nondifferentiable Operator Equations
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- Sparse Reconstruction by Separable Approximation
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- A Limited Memory Algorithm for Bound Constrained Optimization
- Tackling Box-Constrained Optimization via a New Projected Quasi-Newton Approach
- Preconditioned Douglas--Rachford Splitting Methods for Convex-concave Saddle-point Problems
- On the convergence of a linesearch based proximal-gradient method for nonconvex optimization
- Signal Recovery by Proximal Forward-Backward Splitting
- Variable metric forward–backward splitting with applications to monotone inclusions in duality
- Convex Analysis
- IMRO: A Proximal Quasi-Newton Method for Solving $\ell_1$-Regularized Least Squares Problems
- Convex analysis and monotone operator theory in Hilbert spaces