A Bregman Forward-Backward Linesearch Algorithm for Nonconvex Composite Optimization: Superlinear Convergence to Nonisolated Local Minima
Publication: 5853567
DOI: 10.1137/19M1264783
zbMath: 1461.90105
arXiv: 1905.11904
OpenAlex: W3021534257
MaRDI QID: Q5853567
Masoud Ahookhosh, Andreas Themelis, Panagiotis Patrinos
Publication date: 10 March 2021
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1905.11904
Keywords: superlinear convergence; nonsmooth nonconvex optimization; relative smoothness; KL inequality; nonlinear error bound; Bregman-Moreau and Bregman forward-backward envelopes; nonisolated local minima
Mathematics Subject Classification: Nonconvex programming, global optimization (90C26); Nonsmooth analysis (49J52); Set-valued and variational analysis (49J53)
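For orientation only, the basic Bregman forward-backward step underlying the keywords above can be sketched as follows, assuming a composite objective \(\varphi = f + g\) with \(f\) smooth relative to a Legendre kernel \(h\) and a step size \(\gamma > 0\) compatible with the relative-smoothness constant; this is a generic sketch of the iteration, not the linesearch algorithm developed in the paper.
\[
x^{k+1} \in \operatorname*{argmin}_{x} \Big\{ g(x) + \langle \nabla f(x^k),\, x - x^k \rangle + \tfrac{1}{\gamma}\, D_h(x, x^k) \Big\},
\qquad
D_h(x,y) := h(x) - h(y) - \langle \nabla h(y),\, x - y \rangle .
\]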
Related Items
- Block Bregman Majorization Minimization with Extrapolation
- Proximal gradient algorithms under local Lipschitz gradient continuity. A convergence and robustness analysis of PANOC
- Dualities for Non-Euclidean Smoothness and Strong Convexity under the Light of Generalized Conjugacy
- A Bregman stochastic method for nonconvex nonsmooth problem beyond global Lipschitz gradient continuity
- Smoothing unadjusted Langevin algorithms for nonsmooth composite potential functions
- Stochastic composition optimization of functions without Lipschitz continuous gradient
- A Regularized Newton Method for \({\boldsymbol{\ell}}_{q}\)-Norm Composite Optimization Problems
- A class of modified accelerated proximal gradient methods for nonsmooth and nonconvex minimization problems
- An approximate Newton-type proximal method using symmetric rank-one updating formula for minimizing the nonsmooth composite functions
- Nonlinear Forward-Backward Splitting with Projection Correction
- Multi-block Bregman proximal alternating linearized minimization and its application to orthogonal nonnegative matrix factorization
- A block inertial Bregman proximal algorithm for nonsmooth nonconvex problems with application to symmetric nonnegative matrix tri-factorization
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- A Stochastic Semismooth Newton Method for Nonsmooth Nonconvex Optimization
- Bregman Finito/MISO for Nonconvex Regularized Finite Sum Minimization without Lipschitz Gradient Continuity
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- An inertial forward-backward algorithm for the minimization of the sum of two nonconvex functions
- Douglas-Rachford splitting for nonconvex optimization with application to nonconvex feasibility problems
- An inertial Tseng's type proximal algorithm for nonsmooth and nonconvex optimization problems
- Gradient methods for minimizing composite functions
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- Nonlinear error bounds via a change of function
- The Moreau envelope function and proximal mapping in the sense of the Bregman distance
- A regularized Newton method without line search for unconstrained optimization
- Low rank matrix completion by alternating steepest descent methods
- A coordinate gradient descent method for nonsmooth separable minimization
- On gradients of functions definable in o-minimal structures
- On semi- and subanalytic geometry
- Local behavior of an iterative framework for generalized equations with nonisolated solutions
- Properties of the Moreau-Yosida regularization of a piecewise \(C^2\) convex function
- A simplified view of first order methods for optimization
- From error bounds to the complexity of first-order descent methods for convex functions
- A geometric analysis of phase retrieval
- Regularized Newton methods for convex minimization problems with singular solutions
- Geometric categories and o-minimal structures
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Implementable tensor methods in unconstrained convex optimization
- Convergence of non-smooth descent methods using the Kurdyka-Łojasiewicz inequality
- Bregman proximal mappings and Bregman-Moreau envelopes under relative prox-regularity
- Local convergence of the Levenberg-Marquardt method under Hölder metric subregularity
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- Optimal subgradient methods: computational properties for large-scale linear inverse problems
- On linear convergence of non-Euclidean gradient methods without strong convexity and Lipschitz gradient continuity
- Forward-backward quasi-Newton methods for nonsmooth optimization problems
- Further properties of the forward-backward envelope with applications to difference-of-convex programming
- Forward-backward splitting with Bregman distances
- Non-smooth non-convex Bregman minimization: unification and new algorithms
- An envelope for Davis-Yin splitting and strict saddle-point avoidance
- First-order methods almost always avoid strict saddle points
- Behavior of accelerated gradient methods near critical points of nonconvex functions
- Variable Metric Inexact Line-Search-Based Methods for Nonsmooth Optimization
- Majorization-Minimization Procedures and Convergence of SQP Methods for Semi-Algebraic and Tame Programs
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- Clarke Subgradients of Stratifiable Functions
- Characterizations of Łojasiewicz inequalities: Subgradient flows, talweg, convexity
- A generalized proximal point algorithm for certain non-convex minimization problems
- Entropic Proximal Mappings with Applications to Nonlinear Programming
- Quasi-Newton Methods, Motivation and Theory
- Practical Aspects of the Moreau-Yosida Regularization: Theoretical Preliminaries
- Variational Analysis
- Essential smoothness, essential strict convexity, and Legendre functions in Banach spaces
- Regularizing with Bregman-Moreau Envelopes
- First Order Methods Beyond Convexity and Lipschitz Gradient Continuity with Applications to Quadratic Inverse Problems
- Forward-Backward Envelope for the Sum of Two Nonconvex Functions: Further Properties and Nonmonotone Linesearch Algorithms
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems
- On the Quadratic Convergence of the Cubic Regularization Method under a Local Error Bound Condition
- A Globally and Superlinearly Convergent Algorithm for Nonsmooth Convex Minimization
- Generalized Hessian Properties of Regularized Nonsmooth Functions
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- Prox-regular functions in variational analysis
- Generalizations of the Dennis-Moré Theorem
- Convex-Concave Backtracking for Inertial Bregman Proximal Gradient Algorithms in Nonconvex Optimization
- Finding zeros of Hölder metrically subregular mappings via globally convergent Levenberg-Marquardt methods
- SuperMann: A Superlinearly Convergent Algorithm for Finding Fixed Points of Nonexpansive Operators
- A Proximal Minimization Algorithm for Structured Nonconvex and Nonsmooth Problems
- Incremental Majorization-Minimization Optimization with Application to Large-Scale Machine Learning
- Newton-Type Methods for Optimization and Variational Problems
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- Proximité et dualité dans un espace hilbertien
- Convex Analysis
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
- Convex analysis and monotone operator theory in Hilbert spaces