On asymptotically optimal confidence regions and tests for high-dimensional models
DOI: 10.1214/14-AOS1221 · zbMATH Open: 1305.62259 · arXiv: 1303.0518 · OpenAlex: W3099550161 · MaRDI QID: Q95759 · FDO: Q95759
Ruben Dezeure, Sara van de Geer, Peter Bühlmann, Ya'acov Ritov
Publication date: 1 June 2014
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1303.0518
Recommendations
- Confidence intervals for low dimensional parameters in high dimensional linear models
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- A general theory of hypothesis tests and confidence regions for sparse high dimensional models
- Statistical inference for the optimal approximating model
- A unified theory of confidence regions and testing for high-dimensional estimating equations
Keywords: lasso; generalized linear model; linear model; central limit theorem; multiple testing; semiparametric efficiency; sparsity
MSC classification: 62F12 (Asymptotic properties of parametric estimators); 62F25 (Parametric tolerance and confidence regions); 62J07 (Ridge regression; shrinkage estimators (Lasso)); 62J12 (Generalized linear models (logistic models))
Cites Work
- Title not available
- Statistics for high-dimensional data. Methods, theory and applications.
- Lasso-type recovery of sparse representations for high-dimensional data
- On the conditions used to prove oracle results for the Lasso
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- Restricted eigenvalue properties for correlated Gaussian designs
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- p-Values for High-Dimensional Regression
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Title not available
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Title not available
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Sparse inverse covariance estimation with the graphical lasso
- Persistence in high-dimensional linear predictor selection and the virtue of overparametrization
- Honest confidence regions for nonparametric regression
- Asymptotics for Lasso-type estimators.
- Least squares after model selection in high-dimensional sparse models
- Sign-constrained least squares estimation for high-dimensional regression
- Sparsity oracle inequalities for the Lasso
- Confidence sets in sparse regression
- Boosting for high-dimensional linear models
- Hypothesis Testing in High-Dimensional Regression Under the Gaussian Random Design Model: Asymptotic Theory
- Nemirovski's Inequalities Revisited
- Bootstrapping Lasso Estimators
- Scaled sparse linear regression
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- Valid post-selection inference
- The Group Lasso for Logistic Regression
- Rates of convergence of the adaptive LASSO estimators to the oracle distribution and higher order refinements by the bootstrap
- Statistical significance in high-dimensional linear models
- Root-N-Consistent Semiparametric Regression
- New concentration inequalities for suprema of empirical processes
- A central limit theorem applicable to robust regression estimators
- Quasi-likelihood and/or robust estimation in high dimensions
- Accuracy guaranties for \(\ell_{1}\) recovery of block-sparse signals
- High-dimensional variable selection
- On the distribution of penalized maximum likelihood estimators: the LASSO, SCAD, and thresholding
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Confidence sets based on sparse estimators are necessarily large
Cited In (showing first 100 items)
- Statistical inference in sparse high-dimensional additive models
- Maximum-type tests for high-dimensional regression coefficients using Wilcoxon scores
- More powerful genetic association testing via a new statistical framework for integrative genomics
- Mathematical foundations of machine learning. Abstracts from the workshop held March 21--27, 2021 (hybrid meeting)
- Confidence Intervals for Sparse Penalized Regression With Random Designs
- Inference in high dimensional linear measurement error models
- iFusion: Individualized Fusion Learning
- High-dimensional inference for linear model with correlated errors
- Accuracy assessment for high-dimensional linear regression
- A unified theory of confidence regions and testing for high-dimensional estimating equations
- Title not available
- Estimation of semiparametric regression model with right-censored high-dimensional data
- Projection-based Inference for High-dimensional Linear Models
- Inference for High-Dimensional Censored Quantile Regression
- Regularization and the small-ball method. I: Sparse recovery
- Discussion of “A Scale-Free Approach for False Discovery Rate Control in Generalized Linear Models” by Dai, Lin, Xing, and Liu
- Discussion on: “A Scale-Free Approach for False Discovery Rate Control in Generalized Linear Models” by Dai, Lin, Xing, and Liu
- Discussion of “A Scale-Free Approach for False Discovery Rate Control in Generalized Linear Models”
- Statistical Inference, Learning and Models in Big Data
- Distributed testing and estimation under sparse high dimensional models
- Debiasing the Lasso: optimal sample size for Gaussian designs
- On the post selection inference constant under restricted isometry properties
- Generalized linear models with structured sparsity estimators
- Optimal sparsity testing in linear regression model
- Online rules for control of false discovery rate and false discovery exceedance
- Penalized Regression for Multiple Types of Many Features With Missing Data
- Composite quantile regression for massive datasets
- Robust machine learning by median-of-means: theory and practice
- A Sequential Significance Test for Treatment by Covariate Interactions
- Title not available
- Uniformly valid confidence sets based on the Lasso
- SONIC: social network analysis with influencers and communities
- Efficient estimation of smooth functionals in Gaussian shift models
- Confidence intervals for the means of the selected populations
- Universality of regularized regression estimators in high dimensions
- Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation
- Sparse linear models and \(l_1\)-regularized 2SLS with high-dimensional endogenous regressors and instruments
- Exploiting Disagreement Between High-Dimensional Variable Selectors for Uncertainty Visualization
- Statistical Inference for High-Dimensional Generalized Linear Models With Binary Outcomes
- Asymptotically efficient estimation of smooth functionals of covariance operators
- Linear Hypothesis Testing in Dense High-Dimensional Linear Models
- Title not available
- Uniformly valid inference based on the Lasso in linear mixed models
- Second-order Stein: SURE for SURE and other applications in high-dimensional inference
- Robust inference for high‐dimensional single index models
- Confidence regions for entries of a large precision matrix
- High-dimensional inference robust to outliers with ℓ1-norm penalization
- Variable selection and debiased estimation for single‐index expectile model
- Inference for high dimensional linear models with error-in-variables
- The robust desparsified lasso and the focused information criterion for high-dimensional generalized linear models
- Honest Confidence Sets for High-Dimensional Regression by Projection and Shrinkage
- High-dimensional inference in misspecified linear models
- UNIFORM INFERENCE IN HIGH-DIMENSIONAL DYNAMIC PANEL DATA MODELS WITH APPROXIMATELY SPARSE FIXED EFFECTS
- Confidence intervals for high-dimensional Cox models
- Variable selection in expectile regression
- Rejoinder on: “High-dimensional simultaneous inference with the bootstrap”
- Comments on: “High-dimensional simultaneous inference with the bootstrap”
- Valid Post-Selection Inference in High-Dimensional Approximately Sparse Quantile Regression Models
- Testing regression coefficients in high-dimensional and sparse settings
- Generalized M-estimators for high-dimensional Tobit I models
- Additive model selection
- Poststratification fusion learning in longitudinal data analysis
- The Lasso with general Gaussian designs with applications to hypothesis testing
- Doubly debiased Lasso: high-dimensional inference under hidden confounding
- Communication-Efficient Distributed Statistical Inference
- The de-biased group Lasso estimation for varying coefficient models
- Semiparametric efficiency bounds for high-dimensional models
- Capturing Spike Variability in Noisy Izhikevich Neurons Using Point Process Generalized Linear Models
- Confidence intervals for sparse precision matrix estimation via Lasso penalized D-trace loss
- Tests for Coefficients in High-dimensional Additive Hazard Models
- Asymptotic normality of robust \(M\)-estimators with convex penalty
- Relaxing the assumptions of knockoffs by conditioning
- Design of c-optimal experiments for high-dimensional linear models
- Confidence sets for high-dimensional empirical linear prediction (HELP) models with dependent error structure
- Optimal designs in sparse linear models
- Optimal linear discriminators for the discrete choice model in growing dimensions
- Inference for high-dimensional instrumental variables regression
- Nearly optimal Bayesian shrinkage for high-dimensional regression
- Inference under Fine-Gray competing risks model with high-dimensional covariates
- Rate optimal estimation and confidence intervals for high-dimensional regression with missing covariates
- Time-dependent Poisson reduced rank models for political text data analysis
- Covariate-adjusted inference for differential analysis of high-dimensional networks
- The likelihood ratio test in high-dimensional logistic regression is asymptotically a rescaled Chi-square
- Title not available
- Inference for low-rank tensors -- no need to debias
- Testability of high-dimensional linear models with nonsparse structures
- Bootstrap inference for penalized GMM estimators with oracle properties
- Network classification with applications to brain connectomics
- Recent advances in statistical methodologies in evaluating program for high-dimensional data
- Constructing confidence intervals for the signals in sparse phase retrieval
- High-dimensional sufficient dimension reduction through principal projections
- Hypothesis Testing for Network Data with Power Enhancement
- Title not available
- Title not available
- Kernel Meets Sieve: Post-Regularization Confidence Bands for Sparse Additive Model
- Rejoinder to “A Tuning-Free Robust and Efficient Approach to High-Dimensional Regression”
- Directed graphs and variable selection in large vector autoregressive models
- Inference without compatibility: using exponential weighting for inference on a parameter of a linear model
- Model selection with mixed variables on the Lasso path
- Efficient distributed estimation of high-dimensional sparse precision matrix for transelliptical graphical models