Forward-backward quasi-Newton methods for nonsmooth optimization problems
From MaRDI portal
Publication:2364124
Abstract: The forward-backward splitting method (FBS) for minimizing a nonsmooth composite function can be interpreted as a (variable-metric) gradient method applied to a continuously differentiable function which we call the forward-backward envelope (FBE). This makes it possible to extend algorithms for smooth unconstrained optimization to nonsmooth (possibly constrained) problems. Since the FBE and its gradient can be computed simply by evaluating forward-backward steps, the resulting methods rely on the very same black-box oracle as FBS. We propose an algorithmic scheme that enjoys the same global convergence properties as FBS when the problem is convex, or when the objective function possesses the Kurdyka-Łojasiewicz property at its critical points. Moreover, when using quasi-Newton directions the proposed method achieves superlinear convergence provided that the usual second-order sufficiency conditions on the FBE hold at the limit point of the generated sequence. Such conditions translate into milder requirements on the original function involving generalized second-order differentiability. We show that BFGS fits our framework and that the limited-memory variant L-BFGS is well suited for large-scale problems, greatly outperforming FBS and its accelerated version in practice. The analysis of superlinear convergence is based on an extension of the Dennis and Moré theorem to the proposed algorithmic scheme.
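The abstract's central object is the forward-backward envelope. For F(x) = f(x) + g(x) with f smooth and g prox-friendly, the FBE can be written using only the forward-backward step T(x) = prox_{γg}(x - γ∇f(x)): φ_γ(x) = f(x) + ⟨∇f(x), T(x) - x⟩ + ‖T(x) - x‖²/(2γ) + g(T(x)), so evaluating it costs the same oracle calls as one FBS iteration. Below is a minimal sketch for the illustrative case of ℓ1-regularized least squares (f(x) = ½‖Ax − b‖², g = λ‖·‖₁); the function names are ours, not the paper's, and this is plain FBS on the envelope's fixed-point map, not the paper's quasi-Newton scheme:

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal mapping of t * ||.||_1 (componentwise shrinkage).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def fbe(x, A, b, lam, gamma):
    """Forward-backward envelope of F(x) = 0.5*||Ax - b||^2 + lam*||x||_1.

    Returns (phi_gamma(x), T(x)), where T is the forward-backward step.
    Only one gradient and one prox evaluation are needed -- the same
    black-box oracle as one FBS iteration.
    """
    r = A @ x - b
    f = 0.5 * (r @ r)
    grad = A.T @ r                                     # forward (gradient) step data
    T = soft_threshold(x - gamma * grad, gamma * lam)  # backward (prox) step
    d = T - x
    # phi_gamma(x) = f(x) + <grad, d> + ||d||^2 / (2*gamma) + g(T)
    phi = f + grad @ d + (d @ d) / (2.0 * gamma) + lam * np.abs(T).sum()
    return phi, T
```

Two properties are easy to check numerically: φ_γ(x) ≤ F(x) for any x (take u = x in the minimization defining the Moreau envelope), and at a fixed point of T the envelope value coincides with F, consistent with the envelope sharing minimizers with the original problem for small γ.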
Recommendations
- Forward-backward envelope for the sum of two nonconvex functions: further properties and nonmonotone linesearch algorithms
- On quasi-Newton forward-backward splitting: proximal calculus and convergence
- A quasi-Newton approach to nonsmooth convex optimization problems in machine learning
- A bundle-type quasi-Newton method for nonconvex nonsmooth optimization
- An approximate quasi-Newton bundle-type method for nonsmooth optimization
Cites work
- scientific article; zbMATH DE number 439380
- scientific article; zbMATH DE number 1818892
- scientific article; zbMATH DE number 3529352
- scientific article; zbMATH DE number 1243473
- scientific article; zbMATH DE number 1329071
- scientific article; zbMATH DE number 852532
- scientific article; zbMATH DE number 5060482
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A Globally and Superlinearly Convergent Algorithm for Nonsmooth Convex Minimization
- A Nonlinear Conjugate Gradient Method with a Strong Global Convergence Property
- A Tool for the Analysis of Quasi-Newton Methods with Application to Unconstrained Minimization
- A family of variable metric proximal methods
- A fast algorithm for sparse reconstruction based on shrinkage, subspace optimization, and continuation
- A global piecewise smooth Newton method for fast large-scale model predictive control
- A perfect example for the BFGS method
- A trust region method for nonsmooth convex optimization
- Amenable functions in optimization
- An accelerated proximal gradient algorithm for nuclear norm regularized linear least squares problems
- An inertial forward-backward algorithm for the minimization of the sum of two nonconvex functions
- Convergence Properties of the BFGS Algorithm
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Convex analysis and monotone operator theory in Hilbert spaces
- Equivalent differentiable optimization problems and descent methods for asymmetric variational inequality problems
- Exact matrix completion via convex optimization
- Exact penalty functions for constrained minimization problems via regularized gap function for variational inequalities
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- First- and Second-Order Epi-Differentiability in Nonlinear Programming
- Further properties of the forward-backward envelope with applications to difference-of-convex programming
- Generalized Hessian Properties of Regularized Nonsmooth Functions
- Gradient methods for minimizing composite functions
- Local convergence of quasi-Newton methods for B-differentiable equations
- Matrix mathematics. Theory, facts, and formulas
- Monotone Operators and the Proximal Point Algorithm
- Newton's Method for B-Differentiable Equations
- On gradients of functions definable in o-minimal structures
- On semi- and subanalytic geometry
- On the convergence of the proximal algorithm for nonsmooth functions involving analytic features
- On the divergence of line search methods
- On the global convergence of the BFGS method for nonconvex unconstrained optimization problems
- On the limited memory BFGS method for large scale optimization
- On the superlinear convergence of the variable metric proximal point algorithm using Broyden and BFGS matrix secant updating
- Practical Aspects of the Moreau-Yosida Regularization: Theoretical Preliminaries
- Practical inexact proximal quasi-Newton method with global complexity analysis
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- Proximal quasi-Newton methods for nondifferentiable convex optimization
- Proximal splitting methods in signal processing
- Proximité et dualité dans un espace hilbertien
- Quasi-Newton Bundle-Type Methods for Nondifferentiable Convex Optimization
- Quasi-Newton Methods, Motivation and Theory
- Second-Order Optimality Conditions in Nonlinear Programming Obtained by Way of Epi-Derivatives
- Signal Recovery by Proximal Forward-Backward Splitting
- Sparse Reconstruction by Separable Approximation
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- The BFGS method with exact line searches fails for non-convex objective functions
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- Unconstrained optimization reformulations of variational inequality problems
- Updating Quasi-Newton Matrices with Limited Storage
- Using Complex Variables to Estimate Derivatives of Real Functions
- Variational Analysis
- iPiano: inertial proximal algorithm for nonconvex optimization
Cited in (46)
- Local convergence of the heavy-ball method and iPiano for non-convex optimization
- A Trust-region Method for Nonsmooth Nonconvex Optimization
- Accelerating inexact successive quadratic approximation for regularized optimization through manifold identification
- Role of subgradients in variational analysis of polyhedral functions
- An inexact successive quadratic approximation method for a class of difference-of-convex optimization problems
- An accelerated coordinate gradient descent algorithm for non-separable composite optimization
- Envelope functions: unifications and further properties
- Second order semi-smooth proximal Newton methods in Hilbert spaces
- An abstract convergence framework with application to inertial inexact forward-backward methods
- Manifold sampling for optimization of nonconvex functions that are piecewise linear compositions of smooth components
- Smoothing unadjusted Langevin algorithms for nonsmooth composite potential functions
- The forward-backward splitting method and its convergence rate for the minimization of the sum of two functions in Banach spaces
- Proximal gradient flow and Douglas-Rachford splitting dynamics: global exponential stability via integral quadratic constraints
- Coordinate descent methods beyond smoothness and separability
- A new homotopy proximal variable-metric framework for composite convex minimization
- Forward-backward envelope for the sum of two nonconvex functions: further properties and nonmonotone linesearch algorithms
- A Bregman forward-backward linesearch algorithm for nonconvex composite optimization: superlinear convergence to nonisolated local minima
- Inexact proximal memoryless quasi-Newton methods based on the Broyden family for minimizing composite functions
- Globalized inexact proximal Newton-type methods for nonconvex composite functions
- An inexact variable metric proximal point algorithm for generic quasi-Newton acceleration
- Convergence of inexact forward-backward algorithms using the forward-backward envelope
- An envelope for Davis-Yin splitting and strict saddle-point avoidance
- A Regularized Newton Method for ℓq-Norm Composite Optimization Problems
- Massively parallelizable proximal algorithms for large-scale stochastic optimal control problems
- Forward-Backward Extragradient Methods for Quasimonotone Variational Inequalities
- A proximal quasi-Newton method based on memoryless modified symmetric rank-one formula
- Adaptive FISTA for Nonconvex Optimization
- A Chain Rule for Strict Twice Epi-Differentiability and Its Applications
- A forward-backward-forward algorithm for solving quasimonotone variational inequalities
- A stochastic semismooth Newton method for nonsmooth nonconvex optimization
- Kurdyka-Łojasiewicz exponent via inf-projection
- Globally convergent coderivative-based generalized Newton methods in nonsmooth optimization
- Douglas--Rachford Splitting and ADMM for Nonconvex Optimization: Tight Convergence Results
- Further properties of the forward-backward envelope with applications to difference-of-convex programming
- Inexact reduced gradient methods in nonconvex optimization
- Newton-type methods with the proximal gradient step for sparse estimation
- A VMiPG method for composite optimization with nonsmooth term having no closed-form proximal mapping
- Douglas-Rachford splitting and ADMM for nonconvex optimization: accelerated and Newton-type linesearch algorithms
- A family of optimal weighted conjugate-gradient-type methods for strictly convex quadratic minimization
- The forward-backward envelope for sampling with the overdamped Langevin algorithm
- The developments of proximal point algorithms
- Nonmonotone globalization for Anderson acceleration via adaptive regularization
- Template-based image reconstruction facing different topologies
- On quasi-Newton forward-backward splitting: proximal calculus and convergence
- A stochastic extra-step quasi-Newton method for nonsmooth nonconvex optimization
- Incremental quasi-Newton algorithms for solving a nonconvex, nonsmooth, finite-sum optimization problem