Optimal subgradient algorithms for large-scale convex optimization in simple domains
Abstract: This paper shows that the optimal subgradient algorithm OSGA, proposed by Neumaier in "OSGA: a fast subgradient algorithm with optimal complexity", can be used for solving structured large-scale convex constrained optimization problems. Only first-order information is required, and the optimal complexity bounds for both smooth and nonsmooth problems are attained. More specifically, we consider two classes of problems: (i) a convex objective over a simple closed convex domain, where the orthogonal projection onto this feasible domain is efficiently available; (ii) a convex objective with a simple convex functional constraint. If we equip OSGA with an appropriate prox-function, the OSGA subproblem can be solved either in closed form or by a simple iterative scheme, which is especially important for large-scale problems. We report numerical results for some applications to show the efficiency of the proposed scheme. A software package implementing OSGA for the above domains is available.
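Case (i) in the abstract turns on the orthogonal projection onto the feasible domain being cheap. The following minimal Python sketch (not the paper's OSGA package; the function names, step-size rule, and test problem are illustrative assumptions) shows two such closed-form projections and how a cheap projection enters a generic first-order iteration; OSGA itself replaces the classical update below with its optimal-complexity subproblem over the same domain.

import numpy as np

def project_box(x, lb, ub):
    """Euclidean projection onto the box {z : lb <= z <= ub} (componentwise clip)."""
    return np.clip(x, lb, ub)

def project_ball(x, center, radius):
    """Euclidean projection onto the ball {z : ||z - center|| <= radius}."""
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= radius else center + (radius / n) * d

def projected_subgradient(f, subgrad, project, x0, steps=1000):
    """Generic projected subgradient scheme: x_{k+1} = P_C(x_k - t_k g_k).

    This is the classical O(1/sqrt(k)) method, shown only to illustrate how a
    closed-form projection enters a first-order iteration; it is not OSGA.
    """
    x, best = x0.copy(), x0.copy()
    for k in range(1, steps + 1):
        g = subgrad(x)
        # Diminishing step size t_k = 1 / (||g_k|| * sqrt(k)); the small
        # constant guards against a zero subgradient at an optimum.
        x = project(x - g / (np.linalg.norm(g) * np.sqrt(k) + 1e-12))
        if f(x) < f(best):
            best = x.copy()
    return best

# Hypothetical example: minimize the nonsmooth objective ||Ax - b||_1
# over the simple domain [0, 1]^n, whose projection is a clip.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
f = lambda x: np.abs(A @ x - b).sum()
subgrad = lambda x: A.T @ np.sign(A @ x - b)
x_star = projected_subgradient(f, subgrad,
                               lambda z: project_box(z, 0.0, 1.0),
                               np.full(5, 0.5))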
Recommendations
- An optimal subgradient algorithm for large-scale bound-constrained convex optimization
- OSGA: a fast subgradient algorithm with optimal complexity
- Non-Euclidean restricted memory level method for large-scale convex optimization
- An Optimal Algorithm for Constrained Differentiable Convex Optimization
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
Cites work
- Scientific article; zbMATH DE number 1818892 (no title available)
- Scientific article; zbMATH DE number 3850830 (no title available)
- Scientific article; zbMATH DE number 3790208 (no title available)
- Scientific article; zbMATH DE number 845714 (no title available)
- A Douglas–Rachford Type Primal-Dual Method for Solving Inclusions with Mixtures of Composite and Parallel-Sum Type Monotone Operators
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A Nonnegatively Constrained Convex Programming Method for Image Reconstruction
- A double smoothing technique for solving unconstrained nondifferentiable convex optimization problems
- A first-order primal-dual algorithm for convex problems with applications to imaging
- A primal-dual splitting algorithm for finding zeros of sums of maximal monotone operators
- A survey of nonlinear conjugate gradient methods
- An Optimal Algorithm for Constrained Differentiable Convex Optimization
- An inexact line search approach using modified nonmonotone strategy for unconstrained optimization
- An introduction to total variation for image analysis
- An optimal subgradient algorithm for large-scale bound-constrained convex optimization
- An optimal subgradient algorithm with subspace search for costly convex optimization problems
- Blockwise sparse regression
- Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
- Constrained total variation deblurring models and fast algorithms based on alternating direction method of multipliers
- Double smoothing technique for large-scale linearly constrained convex optimization
- Excessive Gap Technique in Nonsmooth Convex Minimization
- Fine tuning Nesterov's steepest descent algorithm for differentiable convex programming
- First-order methods of smooth convex optimization with inexact oracle
- Gradient Convergence in Gradient Methods with Errors
- Gradient methods for minimizing composite functions
- Incremental subgradient methods for nondifferentiable optimization
- Interior Gradient and Proximal Methods for Convex and Conic Optimization
- Introduction to Numerical Analysis
- Introduction to nonsmooth optimization. Theory, practice and software
- Introductory lectures on convex optimization. A basic course.
- Model Selection and Estimation in Regression with Grouped Variables
- New variants of bundle methods
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- OSGA: a fast subgradient algorithm with optimal complexity
- On efficiency of nonmonotone Armijo-type line searches
- Primal-dual first-order methods with \({\mathcal {O}(1/\varepsilon)}\) iteration-complexity for cone programming
- Proximal splitting methods in signal processing
- Regularization tools version 4.0 for Matlab 7.3
- Ridge Regression: Biased Estimation for Nonorthogonal Problems
- Smooth minimization of non-smooth functions
- Smoothing and first order methods: a unified framework
- Solving Ill-Conditioned and Singular Linear Systems: A Tutorial on Regularization
- Solving structured nonsmooth convex optimization with complexity \(\mathcal {O}(\varepsilon ^{-1/2})\)
- Templates for convex cone problems with applications to sparse signal recovery
- Two-Point Step Size Gradient Methods
- Universal gradient methods for convex optimization problems
Cited in (11 documents)
- Solving structured nonsmooth convex optimization with complexity \(\mathcal {O}(\varepsilon ^{-1/2})\)
- Optimal subgradient methods: computational properties for large-scale linear inverse problems
- Analogues of switching subgradient schemes for relatively Lipschitz-continuous convex programming problems
- Dynamic smoothness parameter for fast gradient methods
- An optimal subgradient algorithm for large-scale bound-constrained convex optimization
- An optimal subgradient algorithm with subspace search for costly convex optimization problems
- Dual subgradient algorithms for large-scale nonsmooth learning problems
- OSGA: a fast subgradient algorithm with optimal complexity
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- Non-Euclidean restricted memory level method for large-scale convex optimization
- Primal-dual subgradient method for huge-scale linear conic problems
MaRDI item: Q1689457