SPIRAL: a superlinearly convergent incremental proximal algorithm for nonconvex finite sum minimization
DOI: 10.1007/s10589-023-00550-8
MaRDI QID: Q6498409
Authors: Pourya Behmandpoor, Puya Latafat, Andreas Themelis, Marc Moonen, Panagiotis Patrinos
Publication date: 7 May 2024
Published in: Computational Optimization and Applications
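For context, the titular method addresses finite-sum problems of the form min_x (1/N) Σ_i f_i(x) + g(x), with smooth summands f_i and a nonsmooth, prox-friendly regularizer g. Below is a minimal, illustrative Python sketch of a generic incremental proximal-gradient loop for this problem class; it is not the SPIRAL algorithm itself (in particular, it omits the quasi-Newton acceleration implied by the title), and the problem data, step size, and function names are assumptions chosen for demonstration.

```python
import numpy as np

# Minimal sketch (NOT the SPIRAL method): a plain incremental
# proximal-gradient loop for min_x (1/N) * sum_i f_i(x) + g(x),
# with f_i(x) = 0.5 * (a_i @ x - b_i)**2 and g(x) = lam * ||x||_1.

def soft_threshold(z, t):
    """Prox of t * ||.||_1, i.e. componentwise soft-thresholding."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def incremental_prox_grad(A, b, lam=0.1, step=0.01, epochs=50):
    """Cycle through the summands, taking one proximal-gradient
    step per component function f_i."""
    x = np.zeros(A.shape[1])
    for _ in range(epochs):
        for a_i, b_i in zip(A, b):
            grad_i = (a_i @ x - b_i) * a_i      # gradient of one summand
            x = soft_threshold(x - step * grad_i, step * lam)
    return x

# Illustrative data: a sparse vector recovered from random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))
x_true = np.zeros(20); x_true[:3] = 1.0
b = A @ x_true
print(incremental_prox_grad(A, b)[:5])
```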
Recommendations
- Nonconvex proximal incremental aggregated gradient method with linear convergence
- Stochastic proximal difference-of-convex algorithm with SPIDER for a class of nonconvex nonsmooth regularized problems
- Linear convergence of proximal incremental aggregated gradient method for nonconvex nonsmooth minimization problems
- Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
- Global convergence analysis of sparse regularized nonconvex optimization problems
Mathematics Subject Classification
- Convex programming (90C25)
- Large-scale problems in mathematical programming (90C06)
- Nonconvex programming, global optimization (90C26)
- Methods of quasi-Newton type (90C53)
- Nonsmooth analysis (49J52)
- Set-valued and variational analysis (49J53)
Cites Work
- The elements of statistical learning. Data mining, inference, and prediction
- Phase retrieval via Wirtinger flow: theory and algorithms
- Title not available
- Gradient methods for minimizing composite functions
- Mirror descent and nonlinear projected subgradient methods for convex optimization
- Parallel stochastic gradient algorithms for large-scale matrix completion
- Convex Analysis
- Incremental majorization-minimization optimization with application to large-scale machine learning
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- Clarke Subgradients of Stratifiable Functions
- Quasi-Newton Methods, Motivation and Theory
- Convergence Analysis of a Proximal-Like Minimization Algorithm Using Bregman Functions
- Gradient Convergence in Gradient Methods with Errors
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming
- A Convergent Incremental Gradient Method with a Constant Step Size
- Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization
- On gradients of functions definable in o-minimal structures
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- Stochastic dual coordinate ascent methods for regularized loss minimization
- An inexact hybrid generalized proximal point algorithm and some new results on the theory of Bregman functions
- On stochastic subgradient mirror-descent algorithm with weighted averaging
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- Minimizing finite sums with the stochastic average gradient
- The Moreau envelope function and proximal mapping in the sense of the Bregman distance
- Lectures on convex optimization
- Surpassing gradient descent provably: a cyclic incremental method with linear convergence rate
- Nonlinear programming
- First order methods beyond convexity and Lipschitz gradient continuity with applications to quadratic inverse problems
- Relatively smooth convex optimization by first-order methods, and applications
- A descent lemma beyond Lipschitz gradient continuity: first-order methods revisited and applications
- A geometric analysis of phase retrieval
- A simplified view of first order methods for optimization
- Local convergence of quasi-Newton methods under metric regularity
- Forward-backward envelope for the sum of two nonconvex functions: further properties and nonmonotone linesearch algorithms
- Solving (most) of a set of quadratic equalities: composite optimization for robust phase retrieval
- IQN: an incremental quasi-Newton method with local superlinear convergence rate
- A Bregman forward-backward linesearch algorithm for nonconvex composite optimization: superlinear convergence to nonisolated local minima
- Why random reshuffling beats stochastic gradient descent
- Global convergence rate of proximal incremental aggregated gradient methods
- Proximal-like incremental aggregated gradient method with linear convergence under Bregman distance growth conditions
- Bregman Finito/MISO for nonconvex regularized finite sum minimization without Lipschitz gradient continuity
- SuperMann: A Superlinearly Convergent Algorithm for Finding Fixed Points of Nonexpansive Operators
- Proximal gradient algorithms under local Lipschitz gradient continuity. A convergence and robustness analysis of PANOC
Cited In (1)