\(L_1\)-penalization in functional linear regression with subgaussian design
From MaRDI portal
DOI: 10.5802/jep.11 · zbMath: 1308.62143 · arXiv: 1307.8137 · OpenAlex: W2963582384 · MaRDI QID: Q487731
Stanislav Minsker, Vladimir I. Koltchinskii
Publication date: 23 January 2015
Published in: Journal de l'École Polytechnique -- Mathématiques
Full work available at URL: https://arxiv.org/abs/1307.8137
Mathematics Subject Classification:
- 62J07 Ridge regression; shrinkage estimators (Lasso)
- 62J05 Linear regression; mixed models
- 62G05 Nonparametric estimation
Related Items
- Worst possible sub-directions in high-dimensional models
- Lasso in infinite dimension: application to variable selection in functional multivariate linear regression
- Adapting to unknown noise level in sparse deconvolution
Cites Work
- Statistics for high-dimensional data. Methods, theory and applications.
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008.
- Empirical processes with a bounded \(\psi_1\) diameter
- The Dantzig selector and sparsity oracle inequalities
- A reproducing kernel Hilbert space approach to functional linear regression
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Functional linear regression that's interpretable
- Sparsity in penalized empirical risk minimization
- Prediction in functional linear regression
- The restricted isometry property and its implications for compressed sensing
- Smoothing splines estimators for functional linear regression
- Sparse recovery in convex hulls via entropy penalization
- A tail inequality for suprema of unbounded empirical processes with applications to Markov chains
- Functional data analysis
- Multivariate integration and approximation for random fields satisfying Sacks-Ylvisaker conditions
- Weak convergence and empirical processes. With applications to statistics
- \(\ell _{1}\)-regularized linear regression: persistence and oracle inequalities
- The Lasso as an \(\ell _{1}\)-ball model selection procedure
- Generalized functional linear models
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- Sparsity oracle inequalities for the Lasso
- Tail bounds via generic chaining
- How Correlations Influence Lasso Prediction
- The Generic Chaining
- Towards a Mathematical Theory of Super‐resolution
- Stable signal recovery from incomplete and inaccurate measurements
- Designs for Regression Problems with Correlated Errors