An acceleration procedure for optimal first-order methods
DOI: 10.1080/10556788.2013.835812
zbMath: 1282.90118
arXiv: 1207.3951
OpenAlex: W2024891446
MaRDI QID: Q5746718
Michel Baes, Michael Bürgisser
Publication date: 7 February 2014
Published in: Optimization Methods and Software
Full work available at URL: https://arxiv.org/abs/1207.3951
MSC classification:
- Semidefinite programming (90C22)
- Convex programming (90C25)
- Large-scale problems in mathematical programming (90C06)
Related Items
- Distance computation of ontology vector for ontology similarity measuring and ontology mapping
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
Cites Work
- Primal-dual subgradient methods for convex problems
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Gradient methods for minimizing composite functions
- Subgradient methods for huge-scale optimization problems
- Convex optimization methods for dimension reduction and coefficient estimation in multivariate linear regression
- Smoothing technique and its applications in semidefinite optimization
- Templates for convex cone problems with applications to sparse signal recovery
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Smooth Optimization with Approximate Gradient
- Robust Stochastic Approximation Approach to Stochastic Programming
- A Spectral Bundle Method for Semidefinite Programming
- Prox-Method with Rate of Convergence O(1/t) for Variational Inequalities with Lipschitz Continuous Monotone Operators and Smooth Convex-Concave Saddle Point Problems