Deterministic and stochastic primal-dual subgradient algorithms for uniformly convex minimization
From MaRDI portal
Publication: 2921184
DOI: 10.1214/10-SSY010
zbMath: 1297.90097
OpenAlex: W1980404857
MaRDI QID: Q2921184
Anatoli B. Juditsky, Yu. E. Nesterov
Publication date: 7 October 2014
Full work available at URL: https://projecteuclid.org/euclid.ssy/1411044992
Keywords: large scale stochastic approximation; non-Euclidean first order algorithms; strongly and uniformly convex optimization
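As context for the problem class named in the title and keywords (nonsmooth minimization of strongly or uniformly convex functions), the sketch below shows a generic projected subgradient method with the classical \(1/(\mu t)\) step size and iterate averaging. It is only an illustration of the setting, not the primal-dual algorithm of Juditsky and Nesterov; all function names and the test problem are illustrative choices.

```python
import numpy as np

# Illustrative sketch only: projected subgradient method with step size
# 1/(mu*t) for a mu-strongly convex, nonsmooth objective. This is NOT the
# primal-dual scheme of the cited paper; it merely shows the problem class.

def subgradient_method(subgrad, project, x0, mu, n_iters=1000):
    """Minimize a mu-strongly convex nonsmooth function over a convex set.

    subgrad(x) -- returns any subgradient of the objective at x
    project(x) -- Euclidean projection onto the feasible set
    x0         -- starting point
    mu         -- strong convexity parameter (assumed known)
    """
    x = np.asarray(x0, dtype=float)
    x_avg = x.copy()
    for t in range(1, n_iters + 1):
        g = subgrad(x)
        x = project(x - g / (mu * t))   # step size 1/(mu*t)
        x_avg += (x - x_avg) / t        # running average of iterates
    return x_avg

# Toy example: f(x) = ||x||_1 + (mu/2)*||x||_2^2 over the unit Euclidean ball.
if __name__ == "__main__":
    mu = 0.5
    subgrad = lambda x: np.sign(x) + mu * x
    project = lambda x: x / max(1.0, np.linalg.norm(x))
    x_bar = subgradient_method(subgrad, project, x0=np.ones(5), mu=mu)
    print(x_bar)  # should be close to the minimizer, which is 0
```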
Related Items (39)
- Gradient-free two-point methods for solving stochastic nonsmooth convex optimization problems with small non-random noises
- Greedy strategies for convex optimization
- Stochastic forward-backward splitting for monotone inclusions
- OSGA: a fast subgradient algorithm with optimal complexity
- New results on subgradient methods for strongly convex optimization problems with a unified analysis
- Simple and Optimal Methods for Stochastic Variational Inequalities, II: Markovian Noise and Policy Evaluation in Reinforcement Learning
- Accelerated Stochastic Algorithms for Convex-Concave Saddle-Point Problems
- Primal-dual mirror descent method for constraint stochastic optimization problems
- Gradient-free methods for non-smooth convex stochastic optimization with heavy-tailed noise on convex compact
- Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization
- Unnamed Item
- Optimal Algorithms for Stochastic Complementary Composite Minimization
- Simple and fast algorithm for binary integer and online linear programming
- First-order methods for convex optimization
- On the Adaptivity of Stochastic Gradient-Based Optimization
- Unifying framework for accelerated randomized methods in convex optimization
- Recent theoretical advances in decentralized distributed convex optimization
- Lower complexity bounds of first-order methods for convex-concave bilinear saddle-point problems
- Universal method for stochastic composite optimization problems
- Solving structured nonsmooth convex optimization with complexity \(\mathcal{O}(\varepsilon^{-1/2})\)
- RSG: Beating Subgradient Method without Smoothness and Strong Convexity
- On problems in the calculus of variations in increasingly elongated domains
- Multistep stochastic mirror descent for risk-averse convex stochastic programs based on extended polyhedral risk measures
- Convergence of stochastic proximal gradient algorithm
- Inexact stochastic mirror descent for two-stage nonlinear stochastic programs
- Fast gradient descent for convex minimization problems with an oracle producing a \((\delta, L)\)-model of function at the requested point
- Some worst-case datasets of deterministic first-order methods for solving binary logistic regression
- An accelerated directional derivative method for smooth stochastic convex optimization
- Stochastic intermediate gradient method for convex problems with stochastic inexact oracle
- Algorithms of robust stochastic optimization based on mirror descent method
- Preface
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- A family of subgradient-based methods for convex optimization problems in a unifying framework
- Online estimation of the asymptotic variance for averaged stochastic gradient algorithms
- Accelerate stochastic subgradient method by leveraging local growth condition
- A dual approach for optimal algorithms in distributed optimization over networks
- A Stochastic Variance Reduced Primal Dual Fixed Point Method for Linearly Constrained Separable Optimization
- Noisy zeroth-order optimization for non-smooth saddle point problems
- Universal intermediate gradient method for convex problems with inexact oracle
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Smooth minimization of non-smooth functions
- Unconstrained recursive importance sampling
- Accelerating the cubic regularization of Newton's method on convex problems
- Characteristic functions of directed graphs and applications to stochastic equilibrium problems
- On uniformly convex functionals
- Uniformly convex and uniformly smooth convex functions
- On nonparametric tests of positivity/monotonicity/convexity
- Confidence level solutions for stochastic programming
- Robust Stochastic Approximation Approach to Stochastic Programming
- Information-Theoretic Lower Bounds on the Oracle Complexity of Stochastic Convex Optimization
- Information-Based Complexity, Feedback and Dynamics in Convex Programming
- Excessive Gap Technique in Nonsmooth Convex Minimization
- On uniformly convex functions