Tight conditions for consistency of variable selection in the context of high dimensionality
From MaRDI portal
Publication:741803
DOI: 10.1214/12-AOS1046 · zbMath: 1373.62154 · arXiv: 1106.4293 · MaRDI QID: Q741803
Arnak S. Dalalyan, Laetitia Comminges
Publication date: 15 September 2014
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1106.4293
Nonparametric regression and quantile regression (62G08)
Estimation in multivariate analysis (62H12)
Asymptotic properties of nonparametric inference (62G20)
Linear regression; mixed models (62J05)
Related Items (24)
All-in-one robust estimator of the Gaussian mean ⋮ A nonparametric procedure for linear and nonlinear variable screening ⋮ Nonlinear Variable Selection via Deep Neural Networks ⋮ Estimating linear functionals of a sparse family of Poisson means ⋮ GRID: a variable selection and structure discovery method for high dimensional nonparametric regression ⋮ Bias-corrected inference for multivariate nonparametric regression: model selection and oracle property ⋮ High-dimensional estimation with geometric constraints ⋮ Fundamental limits of exact support recovery in high dimensions ⋮ Minimax testing of a composite null hypothesis defined via a quadratic functional in the model of regression ⋮ Statistical inference in compound functional models ⋮ Kernel Meets Sieve: Post-Regularization Confidence Bands for Sparse Additive Model ⋮ Adaptive variable selection in nonparametric sparse regression ⋮ Learning general sparse additive models from point queries in high dimensions ⋮ Sharp variable selection of a sparse submatrix in a high-dimensional noisy matrix ⋮ Nonparametric Statistics and High/Infinite Dimensional Data ⋮ Randomized maximum-contrast selection: subagging for large-scale regression ⋮ Variable selection with Hamming loss ⋮ Tight conditions for consistency of variable selection in the context of high dimensionality ⋮ Variable selection consistency of Gaussian process regression ⋮ Sparse nonparametric model for regression with functional covariate ⋮ Selection Consistency of Generalized Information Criterion for Sparse Logistic Model ⋮ Multidimensional linear functional estimation in sparse Gaussian models and robust estimation of the mean ⋮ Minimax-optimal nonparametric regression in high dimensions ⋮ Optimal detection of the feature matching map in presence of noise and outliers
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- Minimal conditions for consistent variable selection in high dimension
- Bayes and empirical-Bayes multiplicity adjustment in the variable-selection problem
- Sparsity in multiple kernel learning
- Oracle inequalities and optimal inference under group sparsity
- Iterative feature selection in least square regression estimation
- Tight conditions for consistency of variable selection in the context of high dimensionality
- Asymptotic statistical equivalence for scalar ergodic diffusions
- High-dimensional variable selection
- Optimal adaptive estimation of a quadratic functional
- Estimation and detection of high-variable functions from Sloan-Woźniakowski space
- Asymptotic equivalence for nonparametric regression with multivariate and random design
- High-dimensional Ising model selection using \(\ell _{1}\)-regularized logistic regression
- The benefit of group sparsity
- The composite absolute penalties family for grouped and hierarchical variable selection
- Estimating the dimension of a model
- Asymptotic equivalence of nonparametric regression and white noise
- Equivalence theory for density estimation, Poisson processes and Gaussian white noise with drift
- Adaptive estimation of a quadratic functional by model selection.
- Minimax risks for sparse regressions: ultra-high dimensional phenomenons
- Detection of sparse additive functions
- Selection of variables and dimension reduction in high-dimensional non-parametric regression
- Dimension reduction and variable selection in case control studies via regularized likelihood optimization
- Estimation and detection of functions from anisotropic Sobolev classes
- Support union recovery in high-dimensional multivariate regression
- Rodeo: Sparse, greedy nonparametric regression
- Lattice points in high-dimensional spheres
- Feature selection by higher criticism thresholding achieves the optimal phase diagram
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting
- Nonconcave Penalized Likelihood With NP-Dimensionality
- Minimax-optimal rates for sparse additive models over kernel classes via convex programming
- Model Selection and Estimation in Regression with Grouped Variables
- Some Comments on \(C_p\)
- Introduction to nonparametric estimation
This page was built for publication: Tight conditions for consistency of variable selection in the context of high dimensionality