Stochastic algorithms with geometric step decay converge linearly on sharp functions
DOI: 10.1007/S10107-023-02003-W
zbMATH Open: 1547.6506
MaRDI QID: Q6608032
FDO: Q6608032
Authors: Damek Shea Davis, Dmitriy Drusvyatskiy, Vasileios Charisopoulos
Publication date: 19 September 2024
Published in: Mathematical Programming. Series A. Series B
Mathematics Subject Classification:
- Numerical mathematical programming methods (65K05)
- Nonlinear programming (90C30)
- Stochastic programming (90C15)
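
The paper's title refers to stochastic subgradient-type methods whose step size is held fixed for a stretch of iterations and then cut by a constant factor, a schedule that yields linear convergence on sharp functions. Purely as an illustrative sketch, and not the authors' exact algorithm, schedule, or constants, the snippet below runs such a scheme on a sharp least-absolute-deviations objective; the problem data, initial step size, epoch length, and decay factor 0.5 are all assumptions chosen for demonstration.

```python
# Illustrative sketch only: stochastic subgradient method with geometric
# step decay on a sharp objective. All problem data and constants here
# are illustrative assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

d, n = 10, 200
x_star = rng.standard_normal(d)           # ground-truth signal
A = rng.standard_normal((n, d))           # random measurement vectors
b = A @ x_star                            # noiseless linear measurements

# F(x) = (1/n) * sum_i |<a_i, x> - b_i| is sharp around x_star
# (with high probability for Gaussian measurements).
def stoch_subgrad(x, i):
    """Subgradient of the sampled loss f_i(x) = |<a_i, x> - b_i|."""
    return np.sign(A[i] @ x - b[i]) * A[i]

x = rng.standard_normal(d)                # random initialization
step, T = 0.1, 300                        # initial step size, epoch length
for epoch in range(12):
    for _ in range(T):                    # step held fixed within an epoch
        i = rng.integers(n)
        x = x - step * stoch_subgrad(x, i)
    step *= 0.5                           # geometric decay between epochs
    print(f"epoch {epoch:2d}  step {step:.2e}  "
          f"dist to x*: {np.linalg.norm(x - x_star):.3e}")
```

The design choice mirrored from the title is the step schedule: the step is constant within each epoch and decays geometrically across epochs, rather than decaying as 1/k at every iteration.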
Cites Work
- PhaseLift: exact and stable signal recovery from magnitude measurements via convex programming
- PhaseMax: Convex Phase Retrieval via Basis Pursuit
- Variational Analysis
- Adaptive restart for accelerated gradient schemes
- Acceleration of Stochastic Approximation by Averaging
- Title not available
- A Stochastic Approximation Method
- Primal-dual subgradient methods for convex problems
- Convergence rate of incremental subgradient algorithms
- Loss minimization and parameter estimation with heavy tails
- Title not available
- On convergence rates of subgradient optimization methods
- Incremental Majorization-Minimization Optimization with Application to Large-Scale Machine Learning
- Calculus without derivatives
- Prox-regular functions in variational analysis
- Dual averaging methods for regularized stochastic learning and online optimization
- Title not available
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization, II: Shrinking Procedures and Optimal Algorithms
- Phase retrieval: stability and recovery guarantees
- Identifiable Surfaces in Constrained Optimization
- Active Sets, Nonsmoothness, and Sensitivity
- Optimal methods of smooth convex minimization
- Blind Deconvolution Using Convex Programming
- Rapid, robust, and reliable blind deconvolution via nonconvex optimization
- Stochastic Methods for Composite and Weakly Convex Optimization Problems
- Title not available
- Minimizing finite sums with the stochastic average gradient
- Title not available
- Title not available
- New computational guarantees for solving convex optimization problems with first order methods, via a function growth condition measure
- Faster subgradient methods for functions with Hölderian growth
- Subgradient methods for sharp weakly convex functions
- RSG: Beating Subgradient Method without Smoothness and Strong Convexity
- Solving (most) of a set of quadratic equalities: composite optimization for robust phase retrieval
- Stochastic Model-Based Minimization of Weakly Convex Functions
- Phase retrieval via randomized Kaczmarz: theoretical guarantees
Cited In (1)