Block Stochastic Gradient Iteration for Convex and Nonconvex Optimization
DOI: 10.1137/140983938 · zbMath: 1342.93125 · arXiv: 1408.2597 · OpenAlex: W2963264932 · MaRDI QID: Q2945126
Publication date: 9 September 2015
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1408.2597
Keywords: nonsmooth optimization; convex optimization; nonconvex optimization; stochastic gradient method; block coordinate updates
MSC classification: Numerical mathematical programming methods (65K05); Convex programming (90C25); Nonconvex programming, global optimization (90C26); Nonlinear programming (90C30); Numerical optimization and variational techniques (65K10); Numerical methods based on necessary conditions (49M05); Stochastic programming (90C15); Optimal stochastic control (93E20); Acceleration of convergence in numerical analysis (65B99)
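For orientation only, below is a minimal sketch of the kind of block stochastic gradient update named in the title and keywords: each iteration samples a stochastic gradient and updates a single coordinate block with a diminishing step size. The least-squares objective, block partition, mini-batch size, and step-size rule are all illustrative assumptions for this sketch, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem (an assumption, not from the paper): least squares
# min_x (1/2n) * ||Ax - b||^2, with x split into coordinate blocks.
n, d, block_size = 200, 20, 5
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.1 * rng.standard_normal(n)

blocks = [np.arange(s, s + block_size) for s in range(0, d, block_size)]
x = np.zeros(d)

for k in range(1, 2001):
    idx = blocks[rng.integers(len(blocks))]  # pick one coordinate block at random
    j = rng.integers(n, size=10)             # mini-batch sample of the data terms
    # Stochastic partial gradient of the least-squares loss w.r.t. the chosen block.
    g = A[j][:, idx].T @ (A[j] @ x - b[j]) / len(j)
    x[idx] -= (1.0 / np.sqrt(k)) * g         # diminishing step; only this block moves
print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

Each iteration touches only one block of variables and one mini-batch of data, which is what makes updates of this type cheap on large problems; the paper's contribution is the convergence analysis of such schemes for convex and nonconvex objectives, not this particular instantiation.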
Cites Work
- Primal-dual subgradient methods for convex problems
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- An optimal method for stochastic composite optimization
- On the complexity analysis of randomized block-coordinate descent methods
- Pegasos: primal estimated sub-gradient solver for SVM
- A coordinate gradient descent method for nonsmooth separable minimization
- New method of stochastic approximation type
- On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators
- On the convergence of the coordinate descent method for convex differentiable minimization
- Asymptotic properties of the Fenchel dual functional and applications to decomposition problems
- Introductory lectures on convex optimization. A basic course.
- A globally convergent algorithm for nonconvex optimization based on block coordinate update
- On the convergence of the block nonlinear Gauss-Seidel method under convex constraints
- Quasi-monotone subgradient methods for nonsmooth convex minimization
- Parallel stochastic gradient algorithms for large-scale matrix completion
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Exact matrix completion via convex optimization
- The Sample Average Approximation Method for Stochastic Discrete Optimization
- Block Coordinate Descent Methods for Semidefinite Programming
- A Unified Convergence Analysis of Block Successive Minimization Methods for Nonsmooth Optimization
- A Block Coordinate Descent Method for Regularized Multiconvex Optimization with Applications to Nonnegative Tensor Factorization and Completion
- Hybrid Deterministic-Stochastic Methods for Data Fitting
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization
- Robust Stochastic Approximation Approach to Stochastic Programming
- Monotone Operators and the Proximal Point Algorithm
- On convergence rates of subgradient optimization methods
- Variational Analysis
- On the Nonasymptotic Convergence of Cyclic Coordinate Descent Methods
- On the Convergence of Block Coordinate Descent Type Methods
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization, II: Shrinking Procedures and Optimal Algorithms
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming
- Asymptotic Distribution of Stochastic Approximation Procedures
- A Stochastic Approximation Method
- On a Stochastic Approximation Method
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization