Optimal subgradient algorithms for large-scale convex optimization in simple domains
Publication: 1689457
DOI: 10.1007/s11075-017-0297-x · zbMath: 1411.90261 · arXiv: 1501.01451 · OpenAlex: W1557199071 · Wikidata: Q59607367 · Scholia: Q59607367 · MaRDI QID: Q1689457
Masoud Ahookhosh, Arnold Neumaier
Publication date: 12 January 2018
Published in: Numerical Algorithms
Full work available at URL: https://arxiv.org/abs/1501.01451
nonsmooth optimization; subgradient method; high-dimensional data; structured convex optimization; optimal complexity; first-order black-box information
Analysis of algorithms and problem complexity (68Q25); Numerical mathematical programming methods (65K05); Convex programming (90C25); Abstract computational complexity for mathematical programming problems (90C60); Numerical methods based on nonlinear programming (49M37)
Related Items (7)
Cites Work
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- OSGA: a fast subgradient algorithm with optimal complexity
- Gradient methods for minimizing composite functions
- First-order methods of smooth convex optimization with inexact oracle
- Universal gradient methods for convex optimization problems
- Primal-dual first-order methods with \({\mathcal {O}(1/\varepsilon)}\) iteration-complexity for cone programming
- Introductory lectures on convex optimization. A basic course.
- Solving structured nonsmooth convex optimization with complexity \(\mathcal {O}(\varepsilon ^{-1/2})\)
- Templates for convex cone problems with applications to sparse signal recovery
- New variants of bundle methods
- A first-order primal-dual algorithm for convex problems with applications to imaging
- A double smoothing technique for solving unconstrained nondifferentiable convex optimization problems
- Fine tuning Nesterov's steepest descent algorithm for differentiable convex programming
- On efficiency of nonmonotone Armijo-type line searches
- An optimal subgradient algorithm for large-scale bound-constrained convex optimization
- An optimal subgradient algorithm with subspace search for costly convex optimization problems
- An inexact line search approach using modified nonmonotone strategy for unconstrained optimization
- Regularization Tools version 4.0 for Matlab 7.3
- Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
- Incremental Subgradient Methods for Nondifferentiable Optimization
- Constrained Total Variation Deblurring Models and Fast Algorithms Based on Alternating Direction Method of Multipliers
- Proximal Splitting Methods in Signal Processing
- Smoothing and First Order Methods: A Unified Framework
- Double Smoothing Technique for Large-Scale Linearly Constrained Convex Optimization
- Introduction to Nonsmooth Optimization
- Two-Point Step Size Gradient Methods
- Solving Ill-Conditioned and Singular Linear Systems: A Tutorial on Regularization
- Introduction to Numerical Analysis
- A Nonnegatively Constrained Convex Programming Method for Image Reconstruction
- Gradient Convergence in Gradient Methods with Errors
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- Excessive Gap Technique in Nonsmooth Convex Minimization
- An Optimal Algorithm for Constrained Differentiable Convex Optimization
- A Primal-Dual Splitting Algorithm for Finding Zeros of Sums of Maximal Monotone Operators
- A Douglas--Rachford Type Primal-Dual Method for Solving Inclusions with Mixtures of Composite and Parallel-Sum Type Monotone Operators
- Model Selection and Estimation in Regression with Grouped Variables
- Interior Gradient and Proximal Methods for Convex and Conic Optimization
- Ridge Regression: Biased Estimation for Nonorthogonal Problems