Worst possible sub-directions in high-dimensional models
From MaRDI portal
Publication: 268764
DOI: 10.1016/j.jmva.2015.09.018
zbMath: 1334.62133
arXiv: 1403.7023
OpenAlex: W1819596675
MaRDI QID: Q268764
Publication date: 15 April 2016
Published in: Journal of Multivariate Analysis
Full work available at URL: https://arxiv.org/abs/1403.7023
Related Items
- An introduction to recent advances in high/infinite dimensional statistics
- The benefit of group sparsity in group inference with de-biased scaled group Lasso
- Confidence intervals for high-dimensional inverse covariance estimation
- Honest confidence regions and optimality in high-dimensional precision matrix estimation
Cites Work
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Confidence intervals for high-dimensional inverse covariance estimation
- A partial overview of the theory of statistics with functional data
- Inference for functional data with applications
- Variable selection in infinite-dimensional problems
- \(L_1\)-penalization in functional linear regression with subgaussian design
- Statistics for high-dimensional data. Methods, theory and applications.
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Factor models and variable selection in high-dimensional regression analysis
- Generic chaining and the \(\ell _{1}\)-penalty
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators
- Functional data analysis.
- Nonparametric functional data analysis. Theory and practice.
- High-dimensional graphs and variable selection with the Lasso
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Hypothesis Testing in High-Dimensional Regression Under the Gaussian Random Design Model: Asymptotic Theory
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Inference on Treatment Effects after Selection among High-Dimensional Controls
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Uniform post-selection inference for least absolute deviation regression and other Z-estimation problems
- Weakly decomposable regularization penalties and structured sparsity
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers