Stability of the elastic net estimator
Publication: 895982
DOI: 10.1016/j.jco.2015.07.002
zbMath: 1330.62280
OpenAlex: W2188896417
MaRDI QID: Q895982
Publication date: 11 December 2015
Published in: Journal of Complexity
Full work available at URL: https://doi.org/10.1016/j.jco.2015.07.002
Classification
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Linear regression; mixed models (62J05)
- Applications of Markov chains and discrete-time Markov processes on general state spaces (social mobility, learning theory, industrial processes, etc.) (60J20)
- Random matrices (algebraic aspects) (15B52)
- Convergence of probability measures (60B10)
- Neural nets and related approaches to inference from stochastic processes (62M45)
Related Items
- A new globally convergent algorithm for non-Lipschitz \(\ell_{p}-\ell_q\) minimization
- On the grouping effect of the \(\ell_{1-2}\) models
Cites Work
- A mathematical introduction to compressive sensing
- The restricted isometry property for time-frequency structured random matrices
- Accelerated Bregman method for linearly constrained \(\ell _1-\ell _2\) minimization
- Restricted isometry property of matrices with independent columns and neighborly polytopes by random sampling
- Explicit constructions of RIP matrices and related problems
- Linear convergence of iterative soft-thresholding
- Elastic-net regularization in learning theory
- A simple proof of the restricted isometry property for random matrices
- A remark on the Lasso and the Dantzig selector
- Stable recovery of analysis based approaches
- Simultaneous analysis of Lasso and Dantzig selector
- Stability and robustness of \(\ell_1\)-minimizations with Weibull matrices and redundant dictionaries
- On the ``degrees of freedom'' of the lasso
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- On grouping effect of elastic net
- Augmented \(\ell_1\) and Nuclear-Norm Models with a Globally Linearly Convergent Algorithm
- Elastic-Net Regularization: Iterative Algorithms and Asymptotic Behavior of Solutions
- On sparse reconstruction from Fourier and Gaussian measurements
- Sparse regularization with \(\ell^q\) penalty term
- Decoding by Linear Programming
- Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
- Compressive Sensing by Random Convolution
- Elastic-net regularization: error estimates and active set methods
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Uncertainty principles and ideal atomic decomposition
- Sparse Approximation Property and Stable Recovery of Sparse Signals From Noisy Measurements
- Stein Unbiased GrAdient estimator of the Risk (SUGAR) for Multiple Parameter Selection
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- Toeplitz Compressed Sensing Matrices With Applications to Sparse Channel Estimation
- Regularization and Variable Selection Via the Elastic Net
- Stable signal recovery from incomplete and inaccurate measurements
- Ridge Regression: Applications to Nonorthogonal Problems
- Compressed sensing
- The elements of statistical learning. Data mining, inference, and prediction
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers