Mini-batch stochastic subgradient for functional constrained optimization
DOI: 10.1080/02331934.2023.2189015
zbMATH Open: 1542.65069
MaRDI QID: Q6565290
FDO: Q6565290
Authors: Nitesh Kumar Singh, Ion Necoara, Vyacheslav Kungurtsev
Publication date: 1 July 2024
Published in: Optimization
Recommendations
- Random minibatch subgradient algorithms for convex problems with functional constraints
- Minibatch stochastic subgradient-based projection algorithms for feasibility problems with convex inequalities
- Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization
- Primal-Dual Stochastic Gradient Method for Convex Programs with Many Functional Constraints
- Stochastic first-order methods for convex and nonconvex functional constrained optimization
Classifications
- Numerical mathematical programming methods (65K05)
- Convex programming (90C25)
- Stochastic programming (90C15)
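The method named in the title combines subgradient steps on a mini-batch of the objective with a feasibility correction for the functional constraints. A minimal illustrative sketch (not the paper's exact algorithm; the problem instance, stepsize constant, and the cheap nonnegativity constraint standing in for general functional constraints are all assumptions made here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem: minimize the nonsmooth average
#   f(x) = (1/N) * sum_i |a_i^T x - b_i|
# subject to x >= 0, where the nonnegativity constraints play the role
# of the functional constraints g_j(x) = -x_j <= 0.
N, n = 200, 5
A = rng.normal(size=(N, n))
x_true = np.abs(rng.normal(size=n))
b = A @ x_true  # consistent data, so the optimal value is 0


def minibatch_subgradient(A, b, batch=10, iters=2000, c=1.0):
    x = np.zeros(A.shape[1])
    avg = np.zeros_like(x)
    for k in range(1, iters + 1):
        step = c / np.sqrt(k)  # classical O(1/sqrt(k)) stepsize
        idx = rng.choice(len(b), size=batch, replace=False)
        # subgradient of the sampled average of |a_i^T x - b_i|
        r = A[idx] @ x - b[idx]
        g = A[idx].T @ np.sign(r) / batch
        x = x - step * g
        # feasibility step: projection onto {x >= 0} is cheap here,
        # standing in for a constraint-subgradient correction step
        x = np.maximum(x, 0.0)
        avg += (x - avg) / k  # running average of the iterates
    return avg


x_hat = minibatch_subgradient(A, b)
print(np.mean(np.abs(A @ x_hat - b)))  # small residual at a feasible point
```

With the averaged iterates and the diminishing stepsize, the residual shrinks at the usual O(1/sqrt(k)) rate for nonsmooth convex problems, while every iterate stays feasible by construction.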
Cites Work
- The solution path of the generalized lasso
- Acceleration of Stochastic Approximation by Averaging
- A Stochastic Approximation Method
- Robust Stochastic Approximation Approach to Stochastic Programming
- Computational complexity of inexact gradient augmented Lagrangian methods: application to constrained MPC
- On optimal probabilities in stochastic coordinate descent methods
- Convergence of stochastic proximal gradient algorithm
- Efficient online and batch learning using forward backward splitting
- Minimization of unsmooth functionals
- Linear convergence of first order methods for non-strongly convex optimization
- Algorithms for Fitting the Constrained Lasso
- Lectures on convex optimization
- Nonasymptotic convergence of stochastic proximal point methods for constrained convex optimization
- RSG: Beating Subgradient Method without Smoothness and Strong Convexity
- The dual and degrees of freedom of linearly constrained generalized Lasso
- General convergence analysis of stochastic first-order methods for composite optimization
- Random minibatch subgradient algorithms for convex problems with functional constraints
Cited In (1)