An accelerated directional derivative method for smooth stochastic convex optimization
Publication: 2029381
DOI: 10.1016/j.ejor.2020.08.027 · zbMath: 1487.90524 · arXiv: 1804.02394 · OpenAlex: W2796416676 · MaRDI QID: Q2029381
Eduard Gorbunov, Pavel Dvurechensky, Alexander V. Gasnikov
Publication date: 3 June 2021
Published in: European Journal of Operational Research
Full work available at URL: https://arxiv.org/abs/1804.02394
Mathematics Subject Classification:
- Numerical mathematical programming methods (65K05)
- Convex programming (90C25)
- Derivative-free methods and methods using generalized derivatives (90C56)
- Stochastic programming (90C15)
Related Items (11)
- Oracle complexity separation in convex optimization
- A theoretical and empirical comparison of gradient approximations in derivative-free optimization
- Generalized mirror prox algorithm for monotone variational inequalities: Universality and inexact oracle
- An Accelerated Method for Derivative-Free Smooth Stochastic Convex Optimization
- Gradient-free federated learning methods with \(l_1\) and \(l_2\)-randomization for non-smooth convex stochastic optimization problems
- Non-smooth setting of stochastic decentralized convex optimization problem over time-varying graphs
- First-order methods for convex optimization
- Unifying framework for accelerated randomized methods in convex optimization
- Recent Theoretical Advances in Non-Convex Optimization
- Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems
- A stochastic subspace approach to gradient-free optimization in high dimensions
Uses Software
Cites Work
- Lakes and rivers in the landscape: a quasi-variational inequality approach
- An optimal method for stochastic composite optimization
- Universal gradient methods for convex optimization problems
- An analytical study on peeling of an adhesively bonded joint based on a viscoelastic Bernoulli-Euler beam model
- Gradient-free proximal methods with inexact oracle for convex stochastic nonsmooth optimization problems on the simplex
- Stochastic intermediate gradient method for convex problems with stochastic inexact oracle
- Introductory lectures on convex optimization. A basic course.
- Gradient-free two-point methods for solving stochastic nonsmooth convex optimization problems with small non-random noises
- Global convergence rate analysis of unconstrained optimization methods based on probabilistic models
- Dual approaches to the minimization of strongly convex functionals with a simple structure under affine constraints
- A unified framework for stochastic optimization
- Gradient methods for problems with inexact model of the objective
- Accelerated gradient-free optimization methods with a non-Euclidean proximal operator
- Stochastic online optimization. Single-point and multi-point non-linear multi-armed bandits. Convex and strongly-convex case
- Random gradient-free minimization of convex functions
- Mirror descent and convex optimization problems with non-smooth inequality constraints
- Stochastic intermediate gradient method for convex optimization problems
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Deterministic and stochastic primal-dual subgradient algorithms for uniformly convex minimization
- Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization
- Efficiency of the Accelerated Coordinate Descent Method on Structured Optimization Problems
- Optimal Rates for Zero-Order Convex Optimization: The Power of Two Function Evaluations
- Fast Primal-Dual Gradient Method for Strongly Convex Minimization Problems with Linear Constraints
- Accelerated, Parallel, and Proximal Coordinate Descent
- Coderivative Analysis of Quasi‐variational Inequalities with Applications to Stability and Optimization
- A quasi-variational inequality problem in superconductivity
- Introduction to Derivative-Free Optimization
- Introduction to Stochastic Search and Optimization
- Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods
- About the Power Law of the PageRank Vector Component Distribution. Part 1. Numerical Methods for Finding the PageRank Vector
- About the Power Law of the PageRank Vector Component Distribution. Part 2. The Buckley–Osthus Model, Verification of the Power Law for This Model, and Setup of Real Search Engines
- Katyusha: the first direct acceleration of stochastic gradient methods
- Derivative-free optimization methods
- A simple automatic derivative evaluation program
- An Optimal Algorithm for Bandit and Zero-Order Convex Optimization with Two-Point Feedback
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming
- Stochastic Approximation of Minima with Improved Asymptotic Speed
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
- Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization