Random minibatch subgradient algorithms for convex problems with functional constraints
From MaRDI portal
Abstract: In this paper we consider non-smooth convex optimization problems with a (possibly infinite) intersection of constraint sets. In contrast to the classical approach, where the constraints are usually represented as an intersection of simple sets that are easy to project onto, here each constraint set is given as the level set of a convex, but not necessarily differentiable, function. For this setting we propose subgradient iterative algorithms with random minibatch feasibility updates. At each iteration, our algorithms take a step aimed only at minimizing the objective function, followed by a step minimizing the feasibility violation over the observed minibatch of constraints. The feasibility updates are performed based on either parallel or sequential random observations of several constraint components. We analyze the convergence behavior of the proposed algorithms when the objective function is restricted strongly convex with bounded subgradients, while the functional constraints are endowed with a bounded first-order black-box oracle. For a diminishing stepsize, we prove sublinear convergence rates for the expected distance of the weighted average of the iterates from the constraint set, as well as for the expected suboptimality of the function values along the weighted averages. These convergence rates are known to be optimal for subgradient methods on this class of problems. Moreover, the rates depend explicitly on the minibatch size and show when minibatching helps a subgradient scheme with random feasibility updates.
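The two-step iteration described in the abstract (an objective subgradient step followed by a random minibatch feasibility step) can be sketched on a toy instance. Everything below is an illustrative assumption rather than the paper's exact method: an \(\ell_1\) objective, linear functional constraints, a Polyak-type step for the sequential feasibility update, a \(1/\sqrt{k}\) stepsize, and a stepsize-weighted iterate average.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance (illustrative assumptions, not from the paper):
# minimize the non-smooth f(x) = ||x - c||_1
# subject to many linear functional constraints g_j(x) = a_j^T x - b_j <= 0.
n, m = 5, 200
A = rng.normal(size=(m, n))
b = A @ np.ones(n) + 0.1          # chosen so that x = ones is strictly feasible
c = 3.0 * np.ones(n)              # unconstrained minimizer (typically infeasible)

def f(x):
    return np.abs(x - c).sum()

x = np.ones(n)                    # start from the known feasible point
batch, iters = 10, 3000
x_avg, w_sum = np.zeros(n), 0.0
for k in range(1, iters + 1):
    mu = 1.0 / np.sqrt(k)                    # diminishing stepsize
    x = x - mu * np.sign(x - c)              # 1) subgradient step on f only
    # 2) sequential minibatch feasibility update: a Polyak-type subgradient
    #    step onto each violated half-space in the sampled minibatch
    for j in rng.choice(m, size=batch, replace=False):
        viol = A[j] @ x - b[j]
        if viol > 0:
            x = x - (viol / (A[j] @ A[j])) * A[j]
    # stepsize-weighted average of the iterates
    x_avg = (w_sum * x_avg + mu * x) / (w_sum + mu)
    w_sum += mu

max_viol = np.maximum(A @ x_avg - b, 0.0).max()
```

The feasibility step exactly projects onto each violated sampled half-space in turn; the parallel variant mentioned in the abstract would instead average the projection directions over the minibatch before applying a single correction.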
Recommendations
- Minibatch stochastic subgradient-based projection algorithms for feasibility problems with convex inequalities
- Random algorithms for convex minimization problems
- An optimal randomized incremental gradient method
- Convergence rate of incremental subgradient algorithms
- Convergence of a stochastic subgradient method with averaging for nonsmooth nonconvex constrained optimization
Cites work
- scientific article; zbMATH DE number 3790208
- scientific article; zbMATH DE number 1328979
- scientific article; zbMATH DE number 1332320
- scientific article; zbMATH DE number 7079312
- scientific article; zbMATH DE number 3282977
- A double smoothing technique for solving unconstrained nondifferentiable convex optimization problems
- A smooth primal-dual optimization framework for nonsmooth composite convex minimization
- Acceleration of Stochastic Approximation by Averaging
- Energy-based sensor network source localization via projection onto convex sets
- Incremental proximal methods for large scale convex optimization
- Introductory lectures on convex optimization. A basic course.
- Iterative regularization with a general penalty term: theory and application to \(L^{1}\) and \(TV\) regularization
- Linear convergence of first order methods for non-strongly convex optimization
- Minimization of unsmooth functionals
- Nonasymptotic convergence of stochastic proximal point methods for constrained convex optimization
- On Projection Algorithms for Solving Convex Feasibility Problems
- Proximal distance algorithms: theory and practice
- Random algorithms for convex minimization problems
- Random algorithms for solving convex inequalities
- Random minibatch subgradient algorithms for convex problems with functional constraints
- Rate of Convergence Analysis of Decomposition Methods Based on the Proximal Method of Multipliers for Convex Minimization
- Robust Stochastic Approximation Approach to Stochastic Programming
- Weak Sharp Minima in Mathematical Programming
Cited in (8)
- General convergence analysis of stochastic first-order methods for composite optimization
- Mini-batch stochastic subgradient for functional constrained optimization
- Random minibatch subgradient algorithms for convex problems with functional constraints
- Faster randomized block Kaczmarz algorithms
- Random algorithms for convex minimization problems
- Minibatch stochastic subgradient-based projection algorithms for feasibility problems with convex inequalities
- Random gradient-free minimization of convex functions
- A stochastic moving ball approximation method for smooth convex constrained minimization
MaRDI item Q2338088