Boosting with structural sparsity: a differential inclusion approach
DOI: 10.1016/j.acha.2017.12.004 · zbMath: 1494.68215 · arXiv: 1704.04833 · OpenAlex: W2624594050 · MaRDI QID: Q2278448
Xinwei Sun, Chendi Huang, Jiechao Xiong, Yuan Yao
Publication date: 5 December 2019
Published in: Applied and Computational Harmonic Analysis
Full work available at URL: https://arxiv.org/abs/1704.04833
Keywords: differential inclusions; consistency; model selection; boosting; linearized Bregman iteration; variable splitting; generalized Lasso; structural sparsity
Mathematics Subject Classification:
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Linear regression; mixed models (62J05)
- Convex programming (90C25)
- Learning and adaptive systems in artificial intelligence (68T05)
- Ordinary differential inclusions (34A60)
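The keyword list names linearized Bregman iteration, the standard discretization of the differential inclusion in the title. The sketch below illustrates that generic iteration for a plain least-squares loss; it is not the authors' structural-sparsity algorithm (which adds variable splitting for a generalized-Lasso penalty), and the parameter defaults `kappa`, `alpha`, and `n_iters` are assumptions chosen only for the demo.

```python
import numpy as np

def soft_threshold(z, t=1.0):
    """Componentwise soft-thresholding: sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def linearized_bregman(A, b, kappa=10.0, alpha=None, n_iters=500):
    """Linearized Bregman iteration: a forward-Euler discretization of
        d/dt (rho(t) + u(t)/kappa) = -grad L(u(t)),  rho(t) in d||u(t)||_1,
    with least-squares loss L(u) = ||A u - b||^2 / (2 n).
    Returns the whole iterate path; early stopping along it selects a model."""
    n, p = A.shape
    if alpha is None:
        # Stability heuristic: alpha * kappa * ||A||_2^2 / n should stay below 2.
        alpha = n / (kappa * np.linalg.norm(A, 2) ** 2)
    z = np.zeros(p)   # z_k = rho_k + u_k / kappa, the "dual" variable
    u = np.zeros(p)
    path = []
    for _ in range(n_iters):
        z -= alpha * (A.T @ (A @ u - b)) / n   # gradient step on z
        u = kappa * soft_threshold(z, 1.0)     # primal iterate stays sparse
        path.append(u.copy())
    return np.array(path)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((100, 200))
    u_true = np.zeros(200)
    u_true[:5] = 3.0
    b = A @ u_true + 0.1 * rng.standard_normal(100)
    path = linearized_bregman(A, b)
    print("support of last iterate:", np.flatnonzero(path[-1])[:10])
```

As `kappa` grows (with `alpha * kappa` kept bounded), the iterates approach the limiting differential inclusion, which is the regime in which consistency and model-selection results for this family of methods are typically stated.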
Cites Work
- Nonlinear total variation based noise removal algorithms
- The Adaptive Lasso and Its Oracle Properties
- The solution path of the generalized lasso
- Sparse recovery via differential inclusions
- Split Bregman method for large scale fused Lasso
- Least angle regression. (With discussion)
- Simultaneous analysis of Lasso and Dantzig selector
- On early stopping in gradient descent learning
- Robust Sparse Analysis Regularization
- The Split Bregman Method for L1-Regularized Problems
- Greed is Good: Algorithmic Results for Sparse Approximation
- Boosting With the $L_2$ Loss
- Sparsity and Smoothness Via the Fused Lasso
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell_1$-Constrained Quadratic Programming (Lasso)
- Orthogonal Matching Pursuit for Sparse Signal Recovery With Noise
- On the Non-Negative Garrotte Estimator
- Bregman Iterative Algorithms for $\ell_1$-Minimization with Applications to Compressed Sensing