Pages that link to "Item:Q5962715"
From MaRDI portal
The following pages link to Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization (Q5962715):
Displaying 50 items.
- On optimal probabilities in stochastic coordinate descent methods (Q315487)
- On the complexity analysis of randomized block-coordinate descent methods (Q494345)
- An extragradient-based alternating direction method for convex minimization (Q525598)
- A new accelerated algorithm for ill-conditioned ridge regression problems (Q725822)
- Stochastic primal dual fixed point method for composite optimization (Q777039)
- Top-k multi-class SVM using multiple features (Q781933)
- Distributed block-diagonal approximation methods for regularized empirical risk minimization (Q782443)
- High-dimensional model recovery from random sketched data by exploring intrinsic sparsity (Q782446)
- The complexity of primal-dual fixed point methods for ridge regression (Q1669015)
- An optimal randomized incremental gradient method (Q1785198)
- Parallel decomposition methods for linearly constrained problems subject to simple bound with application to the SVMs training (Q1790674)
- Generalized stochastic Frank-Wolfe algorithm with stochastic "substitute" gradient for structured convex optimization (Q2020608)
- An accelerated directional derivative method for smooth stochastic convex optimization (Q2029381)
- A unified convergence analysis of stochastic Bregman proximal gradient and extragradient methods (Q2031928)
- Randomized smoothing variance reduction method for large-scale non-smooth convex optimization (Q2033403)
- A stochastic extra-step quasi-Newton method for nonsmooth nonconvex optimization (Q2149551)
- Linear convergence of cyclic SAGA (Q2193004)
- Primal-dual block-proximal splitting for a class of non-convex problems (Q2218923)
- Provable accelerated gradient method for nonconvex low rank optimization (Q2303662)
- Block-proximal methods with spatially adapted acceleration (Q2323015)
- Convergence rates of accelerated proximal gradient algorithms under independent noise (Q2420162)
- An accelerated variance reducing stochastic method with Douglas-Rachford splitting (Q2425236)
- Variance reduction for root-finding problems (Q2689823)
- Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent (Q2832112)
- Accelerated, Parallel, and Proximal Coordinate Descent (Q3449571)
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization (Q3451763)
- (Q4558169)
- On Faster Convergence of Cyclic Block Coordinate Descent-type Methods for Strongly Convex Minimization (Q4558510)
- Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice (Q4558545)
- (Q4558559)
- (Q4558572)
- Accelerated Methods for NonConvex Optimization (Q4571877)
- A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming (Q4596724)
- (Q4633052)
- (Q4633055)
- Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization (Q4636997)
- A General Distributed Dual Coordinate Optimization Framework for Regularized Loss Minimization (Q4637039)
- (Q4637040)
- Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent (Q4638051)
- Surpassing Gradient Descent Provably: A Cyclic Incremental Method with Linear Convergence Rate (Q4641666)
- Random Gradient Extrapolation for Distributed and Stochastic Optimization (Q4687240)
- On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent (Q4969070)
- (Q4969074)
- Spectrum Approximation Beyond Fast Matrix Multiplication: Algorithms and Hardness (Q4993271)
- (Q4998940)
- An Optimal High-Order Tensor Method for Convex Optimization (Q5026443)
- Active Subspace of Neural Networks: Structural Analysis and Universal Attacks (Q5037556)
- (Q5054606)
- On the Convergence of Stochastic Primal-Dual Hybrid Gradient (Q5081780)
- On the Adaptivity of Stochastic Gradient-Based Optimization (Q5114394)