Detection boundary in sparse regression
Publication: 1952112
DOI: 10.1214/10-EJS589
zbMATH: 1329.62314
arXiv: 1009.1706
MaRDI QID: Q1952112
Nicolas Verzelen, Alexandre B. Tsybakov, Yuri I. Ingster
Publication date: 27 May 2013
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1009.1706
Keywords: sparsity; high-dimensional regression; minimax hypothesis testing; detection boundary; sparse vectors
62G10: Nonparametric hypothesis testing
62G20: Asymptotic properties of nonparametric inference
62J05: Linear regression; mixed models
62G05: Nonparametric estimation
62C20: Minimax procedures in statistical decision theory
Related Items
- Adaptive Global Testing for Functional Linear Models
- Global and Simultaneous Hypothesis Testing for High-Dimensional Logistic Regression Models
- Comments on: ``High-dimensional simultaneous inference with the bootstrap''
- Moderate-Dimensional Inferences on Quadratic Functionals in Ordinary Least Squares
- Honest Confidence Sets for High-Dimensional Regression by Projection and Shrinkage
- Group Inference in High Dimensions with Applications to Hierarchical Testing
- Higher Criticism for Discriminating Word-Frequency Tables and Testing Authorship
- Sparse Sliced Inverse Regression Via Lasso
- Higher criticism for large-scale inference, especially for rare and weak effects
- Global testing against sparse alternatives in time-frequency analysis
- Statistical significance in high-dimensional linear models
- Optimal detection of sparse principal components in high dimension
- Detection boundary and higher criticism approach for rare and weak genetic effects
- Global testing under sparse alternatives: ANOVA, multiple comparisons and the higher criticism
- Heritability estimation in high dimensional sparse linear mixed models
- On signal detection and confidence sets for low rank inference problems
- To the memory of Yu. I. Ingster
- Significance testing in non-sparse high-dimensional linear models
- Hypothesis testing for high-dimensional multinomials: a selective review
- Powerful test based on conditional effects for genome-wide screening
- Detection thresholds for the \(\beta\)-model on sparse graphs
- Change detection via affine and quadratic detectors
- Detectability of nonparametric signals: higher criticism versus likelihood ratio
- Combinatorial inference for graphical models
- Adaptive estimation of the sparsity in the Gaussian vector model
- Adaptive estimation of high-dimensional signal-to-noise ratios
- A novel detection scheme with multiple observations for sparse signal based on likelihood ratio test with sparse estimation
- Global testing against sparse alternatives under Ising models
- Optimality and sub-optimality of PCA. I: Spiked random matrix models
- Minimax risks for sparse regressions: ultra-high dimensional phenomenons
- Detecting a vector based on linear measurements
- Detection of sparse additive functions
- Optimal sparsity testing in linear regression model
- A Bayesian-motivated test for high-dimensional linear regression models with fixed design matrix
- Optimal adaptivity of signed-polygon statistics for network testing
- Testing degree corrections in stochastic block models
- The all-or-nothing phenomenon in sparse linear regression
- Two-sample testing of high-dimensional linear regression coefficients via complementary sketching
- High-dimensional asymptotics of likelihood ratio tests in the Gaussian sequence model under convex constraints
- Testability of high-dimensional linear models with nonsparse structures
- Thresholding tests based on affine Lasso to achieve non-asymptotic nominal level and high power under sparse and dense alternatives in high dimension
- Higher criticism to compare two large frequency tables, with sensitivity to possible rare and weak differences
- Minimax rate of testing in sparse linear regression
- TFisher: a powerful truncation and weighting procedure for combining \(p\)-values
- Adaptive confidence sets in shape restricted regression
- Optimal testing for planted satisfiability problems
- A global homogeneity test for high-dimensional linear regression
- Hypothesis testing for densities and high-dimensional multinomials: sharp local minimax rates
- Signal detection via Phi-divergences for general mixtures
- Hypothesis testing for high-dimensional sparse binary regression
- Distribution-free tests for sparse heterogeneous mixtures
- Accuracy assessment for high-dimensional linear regression
- Confidence sets in sparse regression
- Community detection in dense random networks
- Estimation of the \(\ell_2\)-norm and testing in sparse linear regression with unknown variance
- Most powerful test against a sequence of high dimensional local alternatives
- Model Selection for Classification with a Large Number of Classes
Cites Work
- Innovated higher criticism for detecting sparse signals in correlated noise
- Global testing under sparse alternatives: ANOVA, multiple comparisons and the higher criticism
- Some problems of hypothesis testing leading to infinitely divisible distributions
- Nonparametric goodness-of-fit testing under Gaussian models
- Adaptive detection of a signal of growing dimension. I
- Adaptive detection of a signal of growing dimension. II
- Higher criticism for detecting sparse heterogeneous mixtures.
- Minimax risks for sparse regressions: ultra-high dimensional phenomenons
- Goodness-of-fit tests for high-dimensional Gaussian linear models
- Simultaneous analysis of Lasso and Dantzig selector
- Goodness-of-fit tests via phi-divergences
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Estimation and confidence sets for sparse normal mixtures
- Detection of a signal of known shape in a multichannel system
- Higher criticism thresholding: Optimal feature selection when useful features are rare and weak
- Classification of sparse high-dimensional vectors
- Feature selection by higher criticism thresholding achieves the optimal phase diagram
- Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Distilled Sensing: Adaptive Sampling for Sparse Detection and Estimation
- Compressed sensing