Linear convergence of prox-SVRG method for separable non-smooth convex optimization problems under bounded metric subregularity
DOI: 10.1007/s10957-021-01978-w
zbMATH Open: 1487.90530
OpenAlex: W4205999994
MaRDI QID: Q2115253 (FDO: Q2115253)
Publication date: 15 March 2022
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/s10957-021-01978-w
Recommendations
- Variable metric proximal stochastic variance reduced gradient methods for nonconvex nonsmooth optimization
- Scientific article, title not available (zbMATH DE number 7404502)
- Asymptotic estimates for \(r\)-Whitney numbers of the second kind
- On the linear convergence of a proximal gradient method for a class of nonsmooth convex minimization problems
- Variational analysis perspective on linear convergence of some first order methods for nonsmooth convex optimization problems
Keywords: calmness; linear convergence; bounded metric subregularity; proximal stochastic variance-reduced gradient; randomized block-coordinate proximal gradient
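For readers unfamiliar with the method this publication analyzes, the following is a minimal sketch of a prox-SVRG iteration in the style of "A Proximal Stochastic Gradient Method with Progressive Variance Reduction" (listed under Cites Work below), applied to a toy lasso problem with a separable non-smooth term. The problem data, step size `eta`, epoch length, and helper names are illustrative assumptions, not taken from the paper.

```python
# Minimal prox-SVRG sketch for min_x (1/2n)||Ax - b||^2 + lam*||x||_1.
# All hyperparameters below are hypothetical choices for illustration.
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def prox_svrg(A, b, lam, eta=0.01, epochs=30, seed=0):
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x_tilde = np.zeros(d)                          # snapshot point
    for _ in range(epochs):
        # full gradient of the smooth part at the snapshot
        full_grad = A.T @ (A @ x_tilde - b) / n
        x = x_tilde.copy()
        for _ in range(2 * n):                     # inner loop length m = 2n (common choice)
            i = rng.integers(n)
            # variance-reduced stochastic gradient:
            # grad_i(x) - grad_i(x_tilde) + full_grad
            v = (A[i] * (A[i] @ x - b[i])
                 - A[i] * (A[i] @ x_tilde - b[i])
                 + full_grad)
            # proximal step on the non-smooth term
            x = soft_threshold(x - eta * v, eta * lam)
        x_tilde = x                                # update snapshot
    return x_tilde

# usage on random data
A = np.random.default_rng(1).normal(size=(100, 20))
x_true = np.zeros(20); x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
x_hat = prox_svrg(A, b, lam=0.1)
```

The correction term built from the snapshot gradient is what drives the variance of the stochastic gradient to zero near a solution; the paper's contribution is a linear convergence guarantee for iterations of this type under a bounded metric subregularity condition rather than strong convexity.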
Cites Work
- Title not available
- Variational Analysis
- Model Selection and Estimation in Regression with Grouped Variables
- The Group Lasso for Logistic Regression
- Convex Analysis
- A coordinate gradient descent method for nonsmooth separable minimization
- Techniques of variational analysis
- Recovery Algorithms for Vector-Valued Data with Joint Sparsity Constraints
- Strongly Regular Generalized Equations
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- Sparse regression using mixed norms
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Optimization in high dimensions via accelerated, parallel, and proximal coordinate descent
- Efficiency of coordinate descent methods on huge-scale optimization problems
- Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds
- Lipschitz Behavior of Solutions to Convex Minimization Problems
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- Stability Theory for Systems of Inequalities. Part I: Linear Systems
- On the Linear Convergence of Descent Methods for Convex Essentially Smooth Minimization
- Characterization of metric regularity of subdifferentials
- Necessary Optimality Conditions for Optimization Problems with Variational Inequality Constraints
- Metric subregularity of the convex subdifferential in Banach spaces
- New Constraint Qualifications for Mathematical Programs with Equilibrium Constraints via Variational Analysis
- Regularity and conditioning of solution mappings in variational analysis
- Random block coordinate descent methods for linearly constrained optimization over networks
- Metric Subregularity of Piecewise Linear Multifunctions and Applications to Piecewise Linear Multiobjective Optimization
- Title not available
- Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
- A unified approach to error bounds for structured convex optimization problems
- Title not available
- Variational analysis perspective on linear convergence of some first order methods for nonsmooth convex optimization problems
Cited In (1)
Uses Software