An ℓ1-oracle inequality for the Lasso in multivariate finite mixture of multivariate Gaussian regression models
From MaRDI portal
Publication:2786498
DOI: 10.1051/ps/2015011 · zbMath: 1392.62179 · arXiv: 1410.4682 · OpenAlex: W2286729967 · MaRDI QID: Q2786498
Publication date: 12 February 2016
Published in: ESAIM: Probability and Statistics
Full work available at URL: https://arxiv.org/abs/1410.4682
Ridge regression; shrinkage estimators (Lasso) (62J07) ⋮ Classification and discrimination; cluster analysis (statistical aspects) (62H30)
Related Items (2)
Approximations of conditional probability density functions in Lebesgue spaces via mixture of experts models ⋮ A non-asymptotic approach for model selection via penalization in high-dimensional mixture of experts models
Cites Work
- Exponential screening and optimal rates of sparse estimation
- \(\ell_{1}\)-penalization for mixture regression models
- Concentration inequalities and model selection. École d'Été de Probabilités de Saint-Flour XXXIII -- 2003.
- Estimation of dependences based on empirical data. Transl. from the Russian by Samuel Kotz
- Least angle regression. (With discussion)
- Weak convergence and empirical processes. With applications to statistics
- On the conditions used to prove oracle results for the Lasso
- The Lasso as an \(\ell _{1}\)-ball model selection procedure
- The adaptive and the thresholded Lasso for potentially misspecified models (and a lower bound for the Lasso)
- Simultaneous analysis of Lasso and Dantzig selector
- An ℓ1-oracle inequality for the Lasso in finite mixture Gaussian regression models