On quasi-Newton forward-backward splitting: proximal calculus and convergence
From MaRDI portal
DOI: 10.1137/18M1167152 · zbMATH Open: 1461.65128 · arXiv: 1801.08691 · OpenAlex: W4289019726 · MaRDI QID: Q5235484
Authors: S. Becker, Jalal Fadili, Peter Ochs
Publication date: 11 October 2019
Published in: SIAM Journal on Optimization
Abstract: We introduce a framework for quasi-Newton forward-backward splitting algorithms (proximal quasi-Newton methods) with a metric induced by diagonal ± rank-\(r\) symmetric positive definite matrices. This special type of metric allows for a highly efficient evaluation of the proximal mapping. The key to this efficiency is a general proximal calculus in the new metric. Using duality, we derive formulas that relate the proximal mapping in a rank-\(r\) modified metric to the proximal mapping in the original metric. We also describe efficient implementations of the proximity calculation for a large class of functions; the implementations exploit the piecewise linear nature of the dual problem. We then apply these results to accelerate composite convex minimization problems, which leads to elegant quasi-Newton methods for which we prove convergence. The algorithm is tested on several numerical examples and compared against a comprehensive list of alternatives from the literature. Our quasi-Newton splitting algorithm with the prescribed metric compares favorably against the state of the art. The algorithm has extensive applications, including signal processing, sparse recovery, machine learning, and classification, to name a few.
Full work available at URL: https://arxiv.org/abs/1801.08691
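To make the setting of the abstract concrete, the following is a minimal sketch of plain forward-backward splitting (proximal gradient descent) with a scalar step size, applied to an \(\ell_1\)-regularized least-squares problem. This baseline is not the paper's method: the paper's contribution is to replace the scalar step with a variable metric induced by diagonal ± rank-\(r\) matrices, which requires the proximal calculus it develops. The function names `forward_backward` and `soft_threshold` are illustrative choices, not from the paper.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 in the standard Euclidean metric.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def forward_backward(A, b, lam, step, n_iter=500):
    """Forward-backward splitting for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    Each iteration takes a gradient (forward) step on the smooth term,
    then a proximal (backward) step on the nonsmooth term.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                          # forward step
        x = soft_threshold(x - step * grad, step * lam)   # backward step
    return x

# Small synthetic sparse-recovery instance.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20))
x_true = np.zeros(20)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
x_hat = forward_backward(A, b, lam=0.1, step=1.0 / L)
```

In the paper's quasi-Newton variant, the prox step is taken in a changing metric \(V_k\); evaluating that prox efficiently is exactly what the rank-\(r\) proximal calculus of the paper enables.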
Recommendations
- Forward-backward quasi-Newton methods for nonsmooth optimization problems
- Proximal Splitting Algorithms for Convex Optimization: A Tour of Recent Advances, with New Twists
- An inexact variable metric proximal point algorithm for generic quasi-Newton acceleration
- A quasi-second-order proximal bundle algorithm
- Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates
Mathematics Subject Classification:
- Numerical mathematical programming methods (65K05)
- Convex programming (90C25)
- Sensitivity, stability, parametric optimization (90C31)
Cites Work
- Algorithm 778: L-BFGS-B
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A fast algorithm for sparse reconstruction based on shrinkage, subspace optimization, and continuation
- IMRO: A proximal quasi-Newton method for solving \(\ell_1\)-regularized least squares problems
- A Limited Memory Algorithm for Bound Constrained Optimization
- Convex analysis and monotone operator theory in Hilbert spaces
- A sparse matrix arithmetic based on \({\mathfrak H}\)-matrices. I: Introduction to \({\mathfrak H}\)-matrices
- Introductory lectures on convex optimization. A basic course.
- Adaptive restart for accelerated gradient schemes
- A quasi-Newton approach to nonsmooth convex optimization problems in machine learning
- Remark on "Algorithm 778: L-BFGS-B: Fortran subroutines for large-scale bound constrained optimization"
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- Tackling box-constrained optimization via a new projected quasi-Newton approach
- Two-Point Step Size Gradient Methods
- Convex Analysis
- Proximal splitting methods in signal processing
- Sparse Reconstruction by Separable Approximation
- Signal Recovery by Proximal Forward-Backward Splitting
- Geometric categories and o-minimal structures
- A nonsmooth version of Newton's method
- A scaled gradient projection method for constrained image deblurring
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- Nonsmooth optimization via quasi-Newton methods
- A Variable Metric Extension of the Forward–Backward–Forward Algorithm for Monotone Operators
- Newton's Method for B-Differentiable Equations
- A New Active Set Algorithm for Box Constrained Optimization
- Convergence Rates in Forward–Backward Splitting
- Variable metric forward-backward splitting with applications to monotone inclusions in duality
- Variable metric quasi-Fejér monotonicity
- A limited memory steepest descent method
- A duality principle for non-convex optimisation and the calculus of variations
- Smoothing Methods and Semismooth Methods for Nondifferentiable Operator Equations
- Variable metric forward-backward algorithm for minimizing the sum of a differentiable function and a convex function
- Preconditioned Douglas–Rachford Splitting Methods for Convex-Concave Saddle-Point Problems
- Semismooth Newton Methods for Operator Equations in Function Spaces
- A generalized forward-backward splitting
- Newton's method for a class of nonsmooth functions
- Extension of Newton and quasi-Newton methods to systems of PC\(^1\) equations
- Proximal Newton-type methods for minimizing composite functions
- Efficient evaluation of scaled proximal operators
- Tame functions are semismooth
- The rate of convergence of Nesterov's accelerated forward-backward method is actually faster than \(1/k^2\)
- Nonsmoothness and a variable metric method
- Variable metric inexact line-search-based methods for nonsmooth optimization
- A new steplength selection for scaled gradient methods with application to image deblurring
- Forward-backward quasi-Newton methods for nonsmooth optimization problems
- On the convergence of a linesearch based proximal-gradient method for nonconvex optimization
- New convergence results for the scaled gradient projection method
- Non-smooth non-convex Bregman minimization: unification and new algorithms
Cited In (32)
- Convergence of inexact forward-backward algorithms using the forward-backward envelope
- Scaled, inexact, and adaptive generalized FISTA for strongly convex optimization
- A hybrid quasi-Newton method with application in sparse recovery
- Calculating the proximal subdifferential via the quasidifferential
- An inexact variable metric proximal point algorithm for generic quasi-Newton acceleration
- Smooth over-parameterized solvers for non-smooth structured optimization
- Cardinality minimization, constraints, and regularization: a survey
- Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates
- Inexact proximal DC Newton-type method for nonconvex composite functions
- An inexact successive quadratic approximation method for a class of difference-of-convex optimization problems
- Understanding the convergence of the preconditioned PDHG method: a view of indefinite proximal ADMM
- COAP 2021 best paper prize
- A new homotopy proximal variable-metric framework for composite convex minimization
- Newton acceleration on manifolds identified by proximal gradient methods
- Adaptive FISTA for Nonconvex Optimization
- Stochastic variable metric proximal gradient with variance reduction for non-convex composite optimization
- Minimizing oracle-structured composite functions
- Proximal Splitting Algorithms for Convex Optimization: A Tour of Recent Advances, with New Twists
- Inexact proximal memoryless quasi-Newton methods based on the Broyden family for minimizing composite functions
- Globalized inexact proximal Newton-type methods for nonconvex composite functions
- An approximate Newton-type proximal method using symmetric rank-one updating formula for minimizing the nonsmooth composite functions
- Proximal gradient/semismooth Newton methods for projection onto a polyhedron via the duality-gap-active-set strategy
- Practical inexact proximal quasi-Newton method with global complexity analysis
- Template-based image reconstruction facing different topologies
- Forward-backward quasi-Newton methods for nonsmooth optimization problems
- A VMiPG method for composite optimization with nonsmooth term having no closed-form proximal mapping
- PNKH-B: A Projected Newton--Krylov Method for Large-Scale Bound-Constrained Optimization
- Efficient evaluation of scaled proximal operators
- Deep-plug-and-play proximal Gauss-Newton method with applications to nonlinear, ill-posed inverse problems
- The developments of proximal point algorithms
- A proximal quasi-Newton method based on memoryless modified symmetric rank-one formula