On optimal universal first-order methods for minimizing heterogeneous sums
Publication: Q6191975
DOI: 10.1007/s11590-023-02060-2
arXiv: 2208.08549
OpenAlex: W4386968310
Publication date: 11 March 2024
Published in: Optimization Letters
Full work available at URL: https://arxiv.org/abs/2208.08549
Cites Work
- Universal gradient methods for convex optimization problems
- On gradients of functions definable in o-minimal structures
- On semi- and subanalytic geometry
- From error bounds to the complexity of first-order descent methods for convex functions
- Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
- A simple nearly optimal restart scheme for speeding up first-order methods
- General Hölder smooth convergence rates follow from specialized rates assuming growth bounds
- The University of Florida sparse matrix collection
- Optimal methods of smooth convex minimization
- Double Exponential Families and Their Use in Generalized Linear Regression
- The linear regression model: Lp norm estimation and the choice of p
- Improved approximation algorithms for maximum cut and satisfiability problems using semidefinite programming
- RSG: Beating Subgradient Method without Smoothness and Strong Convexity
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization I: A Generic Algorithmic Framework
- An Optimal-Storage Approach to Semidefinite Programming Using Approximate Complementarity
- Sharpness, Restart, and Acceleration
- Convergence Rates for Deterministic and Stochastic Subgradient Methods without Lipschitz Continuity
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- Optimal Convergence Rates for the Proximal Bundle Method