A sparsity preserving stochastic gradient methods for sparse regression
Publication: 457215
DOI: 10.1007/S10589-013-9633-9 · zbMATH Open: 1401.62129 · OpenAlex: W1965193428 · MaRDI QID: Q457215 · FDO: Q457215
Authors: Qihang Lin, Xi Chen, Javier Peña
Publication date: 26 September 2014
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-013-9633-9
Recommendations
- Sparse recovery by reduced variance stochastic approximation
- On stochastic accelerated gradient with convergence rate of regression learning
- Scientific article (zbMATH DE number 6253925)
- Dual averaging methods for regularized stochastic learning and online optimization
- Sparse online learning via truncated gradient
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Title not available
- Model Selection and Estimation in Regression with Grouped Variables
- Smooth minimization of non-smooth functions
- Introductory lectures on convex optimization. A basic course.
- Acceleration of Stochastic Approximation by Averaging
- Title not available
- A Stochastic Approximation Method
- Primal-dual subgradient methods for convex problems
- Robust Stochastic Approximation Approach to Stochastic Programming
- Title not available
- An accelerated proximal gradient algorithm for nuclear norm regularized linear least squares problems
- Dual averaging methods for regularized stochastic learning and online optimization
- An optimal method for stochastic composite optimization
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization I: A Generic Algorithmic Framework
- Optimal stochastic approximation algorithms for strongly convex stochastic composite optimization. II: Shrinking procedures and optimal algorithms
- Stochastic Estimation of the Maximum of a Regression Function
- Stochastic quasigradient methods and their application to system optimization
- On a Stochastic Approximation Method
- Title not available
- Online learning with samples drawn from non-identical distributions
- Asymptotic Distribution of Stochastic Approximation Procedures
- A method of aggregate stochastic subgradients with on-line stepsize rules for convex stochastic programming problems
- Optimal distributed online prediction using mini-batches
- Randomized smoothing for stochastic optimization
- Trace norm regularization: reformulations, algorithms, and multi-task learning
- Title not available
- Manifold identification in dual averaging for regularized stochastic online learning
Cited In (16)
- Sparse high-dimensional regression: exact scalable algorithms and phase transitions
- On stochastic accelerated gradient with convergence rate of regression learning
- Stochastic forward-backward splitting for monotone inclusions
- On variance reduction for stochastic smooth convex optimization with multiplicative noise
- Gradient flows and randomised thresholding: sparse inversion and classification
- Utilizing second order information in minibatch stochastic variance reduced proximal iterations
- A general framework for fast stagewise algorithms
- Adaptive proximal SGD based on new estimating sequences for sparser ERM
- A Fast Gradient Method for Nonnegative Sparse Regression With Self-Dictionary
- Convergence of stochastic proximal gradient algorithm
- A note on sparse least-squares regression
- Online sparse identification for regression models
- ADMM Algorithmic Regularization Paths for Sparse Statistical Machine Learning
- Sparse recovery by reduced variance stochastic approximation
- On the convergence rate of sparse grid least squares regression
- Title not available