On asymptotically optimal confidence regions and tests for high-dimensional models
DOI: 10.1214/14-AOS1221 · zbMATH Open: 1305.62259 · arXiv: 1303.0518 · OpenAlex: W3099550161 · MaRDI QID: Q95759
Sara van de Geer, Peter Bühlmann, Ya'acov Ritov, Ruben Dezeure
Publication date: 1 June 2014
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1303.0518
Recommendations
- Confidence intervals for low dimensional parameters in high dimensional linear models
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- A general theory of hypothesis tests and confidence regions for sparse high dimensional models
- Statistical inference for the optimal approximating model
- A unified theory of confidence regions and testing for high-dimensional estimating equations
Keywords: lasso · generalized linear model · linear model · central limit theorem · multiple testing · semiparametric efficiency · sparsity
MSC: Asymptotic properties of parametric estimators (62F12) · Parametric tolerance and confidence regions (62F25) · Ridge regression; shrinkage estimators (Lasso) (62J07) · Generalized linear models (logistic models) (62J12)
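The paper's central construction, the de-sparsified (debiased) Lasso, can be sketched in a few lines. The sketch below is illustrative only: the plain coordinate-descent solver, the tuning values `lam` and `lam_node`, and the crude residual-based noise-variance estimate are assumptions of this example, not the authors' implementation.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=300):
    """Minimise (1/(2n))||y - Xb||^2 + lam*||b||_1 by coordinate descent."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n          # per-column curvature X_j'X_j/n
    r = y.copy()                               # running residual y - Xb
    for _ in range(n_iter):
        for j in range(p):
            if col_sq[j] == 0.0:
                continue
            r += X[:, j] * b[j]                # remove coordinate j's contribution
            rho = X[:, j] @ r / n
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r -= X[:, j] * b[j]
    return b

def debiased_ci(X, y, j, lam, lam_node, z_alpha=1.96):
    """De-sparsified Lasso confidence interval for beta_j.

    Uses a nodewise Lasso of X_j on the other columns to build the
    relaxed-projection residual z_j, then corrects the Lasso's shrinkage bias.
    """
    n, p = X.shape
    b = lasso_cd(X, y, lam)
    others = [k for k in range(p) if k != j]
    gamma = lasso_cd(X[:, others], X[:, j], lam_node)
    z = X[:, j] - X[:, others] @ gamma         # nodewise residual z_j
    denom = z @ X[:, j]
    b_deb = b[j] + z @ (y - X @ b) / denom     # bias-corrected estimate
    sigma2 = np.mean((y - X @ b) ** 2)         # crude noise-variance estimate
    se = np.sqrt(sigma2) * np.linalg.norm(z) / abs(denom)
    return b_deb - z_alpha * se, b_deb + z_alpha * se

# Toy example: n = 200, p = 50, sparse truth with beta_0 = 2.
rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[0] = 2.0
y = X @ beta + rng.standard_normal(n)
lo, hi = debiased_ci(X, y, 0, lam=0.15, lam_node=0.15)
```

Despite the Lasso estimate of `beta[0]` being shrunk toward zero, the debiased estimate `(lo + hi) / 2` is asymptotically normal around the truth, which is what licenses the Gaussian interval above.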
Cites Work
- Statistics for high-dimensional data. Methods, theory and applications.
- Lasso-type recovery of sparse representations for high-dimensional data
- On the conditions used to prove oracle results for the Lasso
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- Restricted eigenvalue properties for correlated Gaussian designs
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- p-Values for High-Dimensional Regression
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Sparse inverse covariance estimation with the graphical lasso
- Persistence in high-dimensional linear predictor-selection and the virtue of overparametrization
- Honest confidence regions for nonparametric regression
- Asymptotics for Lasso-type estimators.
- Least squares after model selection in high-dimensional sparse models
- Sign-constrained least squares estimation for high-dimensional regression
- Sparsity oracle inequalities for the Lasso
- Confidence sets in sparse regression
- Boosting for high-dimensional linear models
- Hypothesis Testing in High-Dimensional Regression Under the Gaussian Random Design Model: Asymptotic Theory
- Nemirovski's Inequalities Revisited
- Bootstrapping Lasso Estimators
- Scaled sparse linear regression
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- Valid post-selection inference
- The Group Lasso for Logistic Regression
- Rates of convergence of the adaptive LASSO estimators to the oracle distribution and higher order refinements by the bootstrap
- Statistical significance in high-dimensional linear models
- Root-N-Consistent Semiparametric Regression
- New concentration inequalities for suprema of empirical processes
- A central limit theorem applicable to robust regression estimators
- Quasi-likelihood and/or robust estimation in high dimensions
- Accuracy guaranties for \(\ell_{1}\) recovery of block-sparse signals
- High-dimensional variable selection
- On the distribution of penalized maximum likelihood estimators: the LASSO, SCAD, and thresholding
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Confidence sets based on sparse estimators are necessarily large
Cited In (first 100 items)
- Confidence intervals for sparse precision matrix estimation via Lasso penalized D-trace loss
- Tests for Coefficients in High-dimensional Additive Hazard Models
- Asymptotic normality of robust \(M\)-estimators with convex penalty
- Relaxing the assumptions of knockoffs by conditioning
- Design of c-optimal experiments for high-dimensional linear models
- Confidence sets for high-dimensional empirical linear prediction (HELP) models with dependent error structure
- Optimal designs in sparse linear models
- Optimal linear discriminators for the discrete choice model in growing dimensions
- Inference for high-dimensional instrumental variables regression
- Nearly optimal Bayesian shrinkage for high-dimensional regression
- Inference under Fine-Gray competing risks model with high-dimensional covariates
- Rate optimal estimation and confidence intervals for high-dimensional regression with missing covariates
- Time-dependent Poisson reduced rank models for political text data analysis
- Covariate-adjusted inference for differential analysis of high-dimensional networks
- The likelihood ratio test in high-dimensional logistic regression is asymptotically a rescaled Chi-square
- Inference for low-rank tensors -- no need to debias
- Testability of high-dimensional linear models with nonsparse structures
- Bootstrap inference for penalized GMM estimators with oracle properties
- Network classification with applications to brain connectomics
- Recent advances in statistical methodologies in evaluating program for high-dimensional data
- Constructing confidence intervals for the signals in sparse phase retrieval
- High-dimensional sufficient dimension reduction through principal projections
- Hypothesis Testing for Network Data with Power Enhancement
- Kernel Meets Sieve: Post-Regularization Confidence Bands for Sparse Additive Model
- Rejoinder to “A Tuning-Free Robust and Efficient Approach to High-Dimensional Regression”
- Directed graphs and variable selection in large vector autoregressive models
- Inference without compatibility: using exponential weighting for inference on a parameter of a linear model
- Model selection with mixed variables on the Lasso path
- Efficient distributed estimation of high-dimensional sparse precision matrix for transelliptical graphical models
- High-dimensional variable selection via low-dimensional adaptive learning
- Multicarving for high-dimensional post-selection inference
- Online inference in high-dimensional generalized linear models with streaming data
- Efficient estimation of linear functionals of principal components
- Inference on the change point under a high dimensional sparse mean shift
- Lasso meets horseshoe: a survey
- The distribution of the Lasso: uniform control over sparse balls and adaptive parameter tuning
- Predictor ranking and false discovery proportion control in high-dimensional regression
- Statistical inference for Cox proportional hazards models with a diverging number of covariates
- In defense of the indefensible: a very naïve approach to high-dimensional inference
- An overview of tests on high-dimensional means
- Linear hypothesis testing for high dimensional generalized linear models
- Doubly robust tests of exposure effects under high‐dimensional confounding
- Inference for high-dimensional varying-coefficient quantile regression
- Scale calibration for high-dimensional robust regression
- A convex programming solution based debiased estimator for quantile with missing response and high-dimensional covariables
- Distributed adaptive Huber regression
- Confidence intervals for parameters in high-dimensional sparse vector autoregression
- Some perspectives on inference in high dimensions
- Debiasing convex regularized estimators and interval estimation in linear models
- Composite versus model-averaged quantile regression
- Structural inference in sparse high-dimensional vector autoregressions
- A weak‐signal‐assisted procedure for variable selection and statistical inference with an informative subsample
- Network differential connectivity analysis
- Spatially relaxed inference on high-dimensional linear models
- De-biasing the Lasso with degrees-of-freedom adjustment
- Robust post-selection inference of high-dimensional mean regression with heavy-tailed asymmetric or heteroskedastic errors
- A selective review of statistical methods using calibration information from similar studies
- Debiased Inference on Treatment Effect in a High-Dimensional Model
- SLOPE-adaptive variable selection via convex optimization
- Adaptive estimation of high-dimensional signal-to-noise ratios
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- Discussion of big Bayes stories and BayesBag
- Variable selection for high dimensional Gaussian copula regression model: an adaptive hypothesis testing procedure
- Hierarchical inference for genome-wide association studies: a view on methodology with software
- Greedy algorithms for prediction
- Bootstrap based inference for sparse high-dimensional time series models
- Selective inference with a randomized response
- High-dimensional inference for personalized treatment decision
- Uniformly valid confidence intervals post-model-selection
- Bootstrapping and sample splitting for high-dimensional, assumption-lean inference
- Kernel-penalized regression for analysis of microbiome data
- On the asymptotic variance of the debiased Lasso
- Testing a single regression coefficient in high dimensional linear models
- Uniformly valid post-regularization confidence regions for many functional parameters in z-estimation framework
- Confidence sets in sparse regression
- Nonparametric inference via bootstrapping the debiased estimator
- Convergence rates of least squares regression estimators with heavy-tailed errors
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- A two-step method for estimating high-dimensional Gaussian graphical models
- Confidence intervals for high-dimensional partially linear single-index models
- Testing for high-dimensional network parameters in auto-regressive models
- A statistical mechanics approach to de-biasing and uncertainty estimation in LASSO for random measurements
- Geometric inference for general high-dimensional linear inverse problems
- Consistent and conservative model selection with the adaptive Lasso in stationary and nonstationary autoregressions
- Oracle inequalities, variable selection and uniform inference in high-dimensional correlated random effects panel data models
- Detecting rare and faint signals via thresholding maximum likelihood estimators
- Asymptotic inference for high-dimensional data
- On the uniform convergence of empirical norms and inner products, with application to causal inference
- A Bootstrap Lasso + Partial Ridge Method to Construct Confidence Intervals for Parameters in High-dimensional Sparse Linear Models
- Model-assisted inference for treatment effects using regularized calibrated estimation with high-dimensional data
- A Projection Based Conditional Dependence Measure with Applications to High-dimensional Undirected Graphical Models
- Test of significance for high-dimensional longitudinal data
- Confidence intervals for high-dimensional inverse covariance estimation
- A penalized approach to covariate selection through quantile regression coefficient models
- Non-asymptotic error controlled sparse high dimensional precision matrix estimation