Pages that link to "Item:Q4558545"
From MaRDI portal
The following pages link to "Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice" (Q4558545):
Displaying 41 items.
- Accelerated proximal algorithms with a correction term for monotone inclusions (Q832632)
- On variance reduction for stochastic smooth convex optimization with multiplicative noise (Q1739038)
- An optimal randomized incremental gradient method (Q1785198)
- Stochastic quasi-gradient methods: variance reduction via Jacobian sketching (Q2039235)
- Understanding the acceleration phenomenon via high-resolution differential equations (Q2089769)
- Accelerated proximal envelopes: application to componentwise methods (Q2116598)
- On the computational efficiency of catalyst accelerated coordinate descent (Q2117631)
- Accelerating variance-reduced stochastic gradient methods (Q2118092)
- Oracle complexity separation in convex optimization (Q2139268)
- Inexact first-order primal-dual algorithms (Q2181598)
- Accelerated methods for saddle-point problem (Q2214606)
- Accelerated proximal point method for maximally monotone operators (Q2235140)
- Fast gradient descent for convex minimization problems with an oracle producing a \((\delta, L)\)-model of function at the requested point (Q2278192)
- Provable accelerated gradient method for nonconvex low rank optimization (Q2303662)
- Inexact successive quadratic approximation for regularized optimization (Q2419525)
- An accelerated variance reducing stochastic method with Douglas-Rachford splitting (Q2425236)
- Accelerated meta-algorithm for convex optimization problems (Q2656390)
- One-step optimization method for equilibrium problems (Q2673498)
- Revisiting EXTRA for Smooth Distributed Optimization (Q3300767)
- The Approximate Duality Gap Technique: A Unified Theory of First-Order Methods (Q4629338)
- (Q4633055)
- Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization (Q4636997)
- (Q4637046)
- Distributed Learning with Sparse Communications by Identification (Q4959464)
- On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent (Q4969070)
- A Proximal Bundle Variant with Optimal Iteration-Complexity for a Large Range of Prox Stepsizes (Q5013585)
- Convergence of Recursive Stochastic Algorithms Using Wasserstein Divergence (Q5018894)
- (Q5054622)
- Fast convergence of generalized forward-backward algorithms for structured monotone inclusions (Q5091986)
- Bregman Proximal Point Algorithm Revisited: A New Inexact Version and Its Inertial Variant (Q5093643)
- Contracting Proximal Methods for Smooth Convex Optimization (Q5139832)
- (Q5148937)
- (Q5148997)
- (Q5159454)
- (Q5214264)
- An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration (Q5231671)
- Generalized Momentum-Based Methods: A Hamiltonian Perspective (Q5857293)
- Accelerated variance-reduced methods for saddle-point problems (Q6114960)
- Principled analyses and design of first-order methods with inexact proximal operators (Q6165584)
- Adaptive proximal SGD based on new estimating sequences for sparser ERM (Q6196471)
- A proximal-gradient method for problems with overlapping group-sparse regularization: support identification complexity (Q6641006)