scientific article; zbMATH DE number 6982977
From MaRDI portal
Publication: 4558559
zbMath: 1475.90044
MaRDI QID: Q4558559
Publication date: 22 November 2018
Full work available at URL: http://jmlr.csail.mit.edu/papers/v18/16-410.html
Title: Katyusha: the first direct acceleration of stochastic gradient methods
Related Items (13)

- Oracle complexity separation in convex optimization
- On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent
- Laplacian smoothing gradient descent
- An aggressive reduction on the complexity of optimization for non-strongly convex objectives
- Nonconvex optimization with inertial proximal stochastic variance reduction gradient
- Block mirror stochastic gradient method for stochastic optimization
- Adaptive proximal SGD based on new estimating sequences for sparser ERM
- SGEM: stochastic gradient with energy and momentum
- Accelerated stochastic variance reduction for a class of convex optimization problems
- Optimization for deep learning: an overview
- Accelerated directional search with non-Euclidean prox-structure
- An Adaptive Gradient Method with Energy and Momentum
- Accelerating variance-reduced stochastic gradient methods
Uses Software
Cites Work
- Smooth minimization of non-smooth functions
- An optimal method for stochastic composite optimization
- On the complexity analysis of randomized block-coordinate descent methods
- Introductory lectures on convex optimization. A basic course.
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- An optimal algorithm for stochastic strongly-convex optimization
- Nearly-Linear Time Positive LP Solver with Faster Convergence Rate
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
- Accelerated Methods for NonConvex Optimization
- Using Optimization to Obtain a Width-Independent, Parallel, Simpler, and Faster Positive SDP Solver
- Optimization Algorithms for Faster Computational Geometry
- Stochastic Primal-Dual Coordinate Method for Regularized Empirical Risk Minimization
- Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent
- Finding approximate local minima faster than gradient descent
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- Incremental Majorization-Minimization Optimization with Application to Large-Scale Machine Learning
- Using Optimization to Break the Epsilon Barrier: A Faster and Simpler Width-Independent Algorithm for Solving Positive Linear Programs in Parallel
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- Prediction by Categorical Features: Generalization Properties and Application to Feature Ranking
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
This page was built for publication: Katyusha: the first direct acceleration of stochastic gradient methods