An optimal subgradient algorithm with subspace search for costly convex optimization problems
DOI: 10.1007/s41980-018-0171-1 · zbMath: 1412.90105 · OpenAlex: W2897257647 · MaRDI QID: Q2415906
Masoud Ahookhosh, Arnold Neumaier
Publication date: 23 May 2019
Published in: Bulletin of the Iranian Mathematical Society
Full work available at URL: https://doi.org/10.1007/s41980-018-0171-1
Keywords: nonsmooth optimization; convex optimization; subgradient methods; optimal complexity; first-order black-box information; costly linear operator; multidimensional subspace search
MSC: Analysis of algorithms and problem complexity (68Q25); Numerical mathematical programming methods (65K05); Convex programming (90C25); Abstract computational complexity for mathematical programming problems (90C60); Numerical methods based on nonlinear programming (49M37)
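The keywords describe the first-order black-box setting for nonsmooth convex problems: the solver only queries function values and subgradients. As a purely illustrative sketch of that setting (not the OSGA-type subspace-search algorithm of the paper), the snippet below implements a classical projected subgradient method with a diminishing step size; the test function, oracle, and ball radius `R` are hypothetical choices made here for demonstration.

```python
import numpy as np

def subgradient_method(f, subgrad, x0, steps=500, R=1.0):
    """Generic projected subgradient method for min f(x) over the ball ||x|| <= R.

    f       -- convex objective (used only to track the best value found)
    subgrad -- returns one subgradient of f at x (first-order black-box oracle)
    x0      -- starting point
    """
    x = np.asarray(x0, dtype=float)
    x_best, f_best = x.copy(), f(x)
    for k in range(1, steps + 1):
        g = subgrad(x)
        x = x - (1.0 / np.sqrt(k)) * g       # diminishing step size ~ 1/sqrt(k)
        norm = np.linalg.norm(x)
        if norm > R:                          # project back onto the feasible ball
            x *= R / norm
        fx = f(x)
        if fx < f_best:                       # subgradient steps need not decrease f monotonically
            x_best, f_best = x.copy(), fx
    return x_best, f_best

# Hypothetical example: minimize the nonsmooth function f(x) = ||A x - b||_1
A = np.array([[2.0, 1.0], [1.0, 3.0], [0.0, 1.0]])
b = np.array([1.0, 2.0, 0.5])
f = lambda x: np.abs(A @ x - b).sum()
subgrad = lambda x: A.T @ np.sign(A @ x - b)   # one subgradient of the l1 residual
x_star, f_star = subgradient_method(f, subgrad, np.zeros(2))
print(x_star, f_star)
```

This plain scheme converges at the rate O(1/sqrt(k)); the optimal-complexity subgradient algorithms the paper studies improve on it while using the same black-box oracle.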
Cites Work
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- OSGA: a fast subgradient algorithm with optimal complexity
- Gradient methods for minimizing composite functions
- First-order methods of smooth convex optimization with inexact oracle
- Universal gradient methods for convex optimization problems
- Primal-dual first-order methods with \({\mathcal {O}(1/\varepsilon)}\) iteration-complexity for cone programming
- Pegasos: primal estimated sub-gradient solver for SVM
- A subspace version of the Powell-Yuan trust-region algorithm for equality constrained optimization
- On the limited memory BFGS method for large scale optimization
- Introductory lectures on convex optimization. A basic course.
- Optimal subgradient algorithms for large-scale convex optimization in simple domains
- Solving structured nonsmooth convex optimization with complexity \(\mathcal {O}(\varepsilon ^{-1/2})\)
- Mirror descent and nonlinear projected subgradient methods for convex optimization.
- Fine tuning Nesterov's steepest descent algorithm for differentiable convex programming
- Cutting-plane training of structural SVMs
- Optimal subgradient methods: computational properties for large-scale linear inverse problems
- An optimal subgradient algorithm for large-scale bound-constrained convex optimization
- Coordinate and subspace optimization methods for linear least squares with non-quadratic regularization
- A subspace implementation of quasi-Newton trust region methods for unconstrained optimization
- Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
- Memory gradient method for the minimization of functions
- Study on a supermemory gradient method for the minimization of functions
- Optimization with Sparsity-Inducing Penalties
- Primal Methods are Better than Dual Methods for Solving Overdetermined Linear Systems in the $l_\infty $ Sense?
- Minimization Techniques for Piecewise Differentiable Functions: The $l_1$ Solution to an Overdetermined Linear System
- Solving Ill-Conditioned and Singular Linear Systems: A Tutorial on Regularization
- A Subspace Study on Conjugate Gradient Algorithms
- A Majorize–Minimize Strategy for Subspace Optimization Applied to Image Restoration
- An Optimal Algorithm for Constrained Differentiable Convex Optimization
- Interior Gradient and Proximal Methods for Convex and Conic Optimization