Accelerated randomized mirror descent algorithms for composite non-strongly convex optimization
From MaRDI portal
Publication:2420797
DOI: 10.1007/s10957-018-01469-5 · zbMath: 1461.65154 · arXiv: 1605.06892 · OpenAlex: W2884784523 · MaRDI QID: Q2420797
Huan Xu, Cuong Viet Nguyen, Jiashi Feng, Canyi Lu, Le Thi Khanh Hien
Publication date: 7 June 2019
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://arxiv.org/abs/1605.06892
- Numerical mathematical programming methods (65K05)
- Large-scale problems in mathematical programming (90C06)
- Nonlinear programming (90C30)
Related Items
- An aggressive reduction on the complexity of optimization for non-strongly convex objectives
- A stochastic variance reduction algorithm with Bregman distances for structured composite problems
- Stochastic incremental mirror descent algorithms with Nesterov smoothing
- An inexact primal-dual smoothing framework for large-scale non-bilinear saddle point problems
Uses Software
Cites Work
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Gradient methods for minimizing composite functions
- First-order methods of smooth convex optimization with inexact oracle
- Fixed point and Bregman iterative methods for matrix rank minimization
- Introductory lectures on convex optimization. A basic course.
- Error bounds for proximal point subproblems and associated inexact proximal point algorithms
- Accelerated and Inexact Forward-Backward Algorithms
- A Singular Value Thresholding Algorithm for Matrix Completion
- NESTA: A Fast and Accurate First-Order Method for Sparse Recovery
- Accelerated, Parallel, and Proximal Coordinate Descent
- First-Order Methods for Sparse Covariance Selection
- Robust Stochastic Approximation Approach to Stochastic Programming
- Monotone Operators and the Proximal Point Algorithm
- Convergence of Proximal-Like Algorithms
- Numerical methods for nondifferentiable convex optimization
- Katyusha: the first direct acceleration of stochastic gradient methods
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- Total Variation Projection With First Order Schemes
- Interior Gradient and Proximal Methods for Convex and Conic Optimization
- SSVM: A smooth support vector machine for classification