On asymptotically optimal confidence regions and tests for high-dimensional models
From MaRDI portal
Abstract: We propose a general method for constructing confidence intervals and statistical tests for single or low-dimensional components of a large parameter vector in a high-dimensional model. It can easily be adjusted for multiplicity, taking the dependence among tests into account. For linear models, our method is essentially the same as in Zhang and Zhang [J. R. Stat. Soc. Ser. B Stat. Methodol. 76 (2014) 217-242]: we analyze its asymptotic properties and establish its asymptotic optimality in terms of semiparametric efficiency. Our method naturally extends to generalized linear models with convex loss functions. We develop the corresponding theory, which includes a careful analysis for Gaussian, sub-Gaussian and bounded correlated designs.
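The construction the abstract refers to (as in Zhang and Zhang) is often called the de-sparsified or debiased Lasso: a one-step bias correction of the Lasso estimate using a score vector from a nodewise regression. A minimal numpy-only sketch for a confidence interval on a single coefficient is given below; the ISTA solver, the fixed penalty levels `lam` and `lam_j`, the known noise level `sigma`, and all variable names are illustrative assumptions, not details taken from the paper.

```python
# Hedged sketch of a de-sparsified (debiased) Lasso confidence interval
# for one coefficient beta_j. Assumptions (not from the paper): ISTA as
# the Lasso solver, fixed penalties lam and lam_j, known noise level sigma.
import numpy as np

def lasso_ista(X, y, lam, n_iter=500):
    """Minimise (1/2n)||y - Xb||^2 + lam*||b||_1 by iterative soft-thresholding."""
    n, p = X.shape
    b = np.zeros(p)
    # Step size = 1 / Lipschitz constant of the smooth part.
    step = n / (np.linalg.norm(X, 2) ** 2)
    for _ in range(n_iter):
        u = b - step * (X.T @ (X @ b - y) / n)          # gradient step
        b = np.sign(u) * np.maximum(np.abs(u) - step * lam, 0.0)  # soft-threshold
    return b

def desparsified_ci(X, y, j, lam, lam_j, sigma, z=1.96):
    """Approximate 95% CI for beta_j via the de-sparsified Lasso."""
    n, p = X.shape
    beta = lasso_ista(X, y, lam)
    # Nodewise Lasso: regress column j on the remaining columns to build
    # the score vector z_j (the "relaxed projection" residual).
    mask = np.arange(p) != j
    gamma = lasso_ista(X[:, mask], X[:, j], lam_j)
    z_j = X[:, j] - X[:, mask] @ gamma
    # One-step bias correction of the Lasso estimate of beta_j.
    b_hat = beta[j] + z_j @ (y - X @ beta) / (z_j @ X[:, j])
    se = sigma * np.linalg.norm(z_j) / abs(z_j @ X[:, j])
    return b_hat - z * se, b_hat + z * se

# Toy example: sparse truth with one active coefficient.
rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[0] = 2.0
y = X @ beta_true + rng.standard_normal(n)
lo, hi = desparsified_ci(X, y, j=0, lam=0.1, lam_j=0.1, sigma=1.0)
print(f"95% CI for beta_1: [{lo:.2f}, {hi:.2f}]")
```

In practice `sigma` is unknown and would itself be estimated (e.g. via the scaled or square-root Lasso cited below), and the penalty levels would be chosen by cross-validation or by the theoretical rate rather than fixed constants.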
Recommendations
- Confidence intervals for low dimensional parameters in high dimensional linear models
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- A general theory of hypothesis tests and confidence regions for sparse high dimensional models
- Statistical inference for the optimal approximating model
- A unified theory of confidence regions and testing for high-dimensional estimating equations
Cites work
- scientific article; zbMATH DE number 5957408
- scientific article; zbMATH DE number 845714
- scientific article; zbMATH DE number 3103824
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- A central limit theorem applicable to robust regression estimators
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Accuracy guaranties for \(\ell_{1}\) recovery of block-sparse signals
- Asymptotics for Lasso-type estimators.
- Boosting for high-dimensional linear models
- Bootstrapping Lasso estimators
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Confidence sets based on sparse estimators are necessarily large
- Confidence sets in sparse regression
- High-dimensional generalized linear models and the lasso
- High-dimensional graphs and variable selection with the Lasso
- High-dimensional variable selection
- Honest confidence regions for nonparametric regression
- Hypothesis Testing in High-Dimensional Regression Under the Gaussian Random Design Model: Asymptotic Theory
- Lasso-type recovery of sparse representations for high-dimensional data
- Least squares after model selection in high-dimensional sparse models
- Nemirovski's inequalities revisited
- New concentration inequalities for suprema of empirical processes
- On asymptotically optimal confidence regions and tests for high-dimensional models
- On the conditions used to prove oracle results for the Lasso
- On the distribution of penalized maximum likelihood estimators: the LASSO, SCAD, and thresholding
- Persistence in high-dimensional linear predictor-selection and the virtue of overparametrization
- Quasi-likelihood and/or robust estimation in high dimensions
- Rates of convergence of the adaptive LASSO estimators to the oracle distribution and higher order refinements by the bootstrap
- Restricted eigenvalue properties for correlated Gaussian designs
- Root-N-Consistent Semiparametric Regression
- Scaled sparse linear regression
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Sign-constrained least squares estimation for high-dimensional regression
- Simultaneous analysis of Lasso and Dantzig selector
- Sparse inverse covariance estimation with the graphical lasso
- Sparsity oracle inequalities for the Lasso
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Statistical significance in high-dimensional linear models
- Statistics for high-dimensional data. Methods, theory and applications.
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- The Group Lasso for Logistic Regression
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Valid post-selection inference
- \(p\)-values for high-dimensional regression
Cited in
- A bootstrap Lasso+partial ridge method to construct confidence intervals for parameters in high-dimensional sparse linear models
- Inference for High-Dimensional Linear Mixed-Effects Models: A Quasi-Likelihood Approach
- On signal detection and confidence sets for low rank inference problems
- SLOPE-adaptive variable selection via convex optimization
- Statistical inference in sparse high-dimensional additive models
- Maximum-type tests for high-dimensional regression coefficients using Wilcoxon scores
- Adaptive estimation of high-dimensional signal-to-noise ratios
- Discussion of big Bayes stories and BayesBag
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- High-dimensional robust inference for censored linear models
- More powerful genetic association testing via a new statistical framework for integrative genomics
- Confidence intervals for sparse precision matrix estimation via Lasso penalized D-trace loss
- Variable selection for high dimensional Gaussian copula regression model: an adaptive hypothesis testing procedure
- A regularization-based adaptive test for high-dimensional GLMs
- Distributed debiased estimation of high-dimensional partially linear models with jumps
- Hierarchical false discovery rate control for high-dimensional survival analysis with interactions
- High-dimensional model averaging for quantile regression
- Hierarchical inference for genome-wide association studies: a view on methodology with software
- Efficient multiple change point detection for high‐dimensional generalized linear models
- Mathematical foundations of machine learning. Abstracts from the workshop held March 21--27, 2021 (hybrid meeting)
- Greedy algorithms for prediction
- False Discovery Rate Control via Data Splitting
- Elastic-net Regularized High-dimensional Negative Binomial Regression: Consistency and Weak Signal Detection
- A statistical mechanics approach to de-biasing and uncertainty estimation in Lasso for random measurements
- A Scale-Free Approach for False Discovery Rate Control in Generalized Linear Models
- Asymptotic normality of robust M-estimators with convex penalty
- Inference in high dimensional linear measurement error models
- The EAS approach to variable selection for multivariate response data in high-dimensional settings
- Generalized matrix decomposition regression: estimation and inference for two-way structured data
- Debiased Lasso for stratified Cox models with application to the national kidney transplant data
- A tuning-free efficient test for marginal linear effects in high-dimensional quantile regression
- High-dimensional inference for linear model with correlated errors
- Relaxing the assumptions of knockoffs by conditioning
- Are Latent Factor Regression and Sparse Regression Adequate?
- Confidence intervals for low dimensional parameters in high dimensional linear models
- Accuracy assessment for high-dimensional linear regression
- Integrative Factor Regression and Its Inference for Multimodal Data Analysis
- A unified theory of confidence regions and testing for high-dimensional estimating equations
- Design of c-optimal experiments for high-dimensional linear models
- A Random Projection Approach to Hypothesis Tests in High-Dimensional Single-Index Models
- Higher-Order Least Squares: Assessing Partial Goodness of Fit of Linear Causal Models
- Estimation of Linear Functionals in High-Dimensional Linear Models: From Sparsity to Nonsparsity
- Inference in High-Dimensional Online Changepoint Detection
- Large-Scale Two-Sample Comparison of Support Sets
- Test of Significance for High-Dimensional Thresholds with Application to Individualized Minimal Clinically Important Difference
- Uniform inference in high-dimensional dynamic panel data models with approximately sparse fixed effects
- Bootstrap based inference for sparse high-dimensional time series models
- Panel data quantile regression with grouped fixed effects
- Distribution-free predictive inference for regression
- On lower bounds for the bias-variance trade-off
- Distributionally robust and generalizable inference
- Selective inference with a randomized response
- Confidence sets for high-dimensional empirical linear prediction (HELP) models with dependent error structure
- Online Debiasing for Adaptively Collected High-Dimensional Data With Applications to Time Series Analysis
- Inferences in panel data with interactive effects using large covariance matrices
- High-dimensional inference for personalized treatment decision
- Uniformly valid confidence intervals post-model-selection
- scientific article; zbMATH DE number 7370643
- Optimal designs in sparse linear models
- Markov Neighborhood Regression for High-Dimensional Inference
- Ill-posed estimation in high-dimensional models with instrumental variables
- The costs and benefits of uniformly valid causal inference with high-dimensional nuisance parameters
- Inference for high‐dimensional linear models with locally stationary error processes
- Optimal linear discriminators for the discrete choice model in growing dimensions
- Post-selection Inference of High-dimensional Logistic Regression Under Case–Control Design
- Inference on the best policies with many covariates
- Retire: robust expectile regression in high dimensions
- Reprint: Statistical inference for linear mediation models with high-dimensional mediators and application to studying stock reaction to COVID-19 pandemic
- Sparse generalized Yule-Walker estimation for large spatio-temporal autoregressions with an application to NO\(_2\) satellite data
- Optimal decorrelated score subsampling for generalized linear models with massive data
- Inference for high-dimensional instrumental variables regression
- Statistical inference for linear mediation models with high-dimensional mediators and application to studying stock reaction to COVID-19 pandemic
- Bootstrapping and sample splitting for high-dimensional, assumption-lean inference
- Rejoinder
- Fixed effects testing in high-dimensional linear mixed models
- Testing a single regression coefficient in high dimensional linear models
- Kernel-penalized regression for analysis of microbiome data
- Estimation of semiparametric regression model with right-censored high-dimensional data
- Regularization and the small-ball method. I: Sparse recovery
- Valid post-selection inference in high-dimensional approximately sparse quantile regression models
- On the asymptotic variance of the debiased Lasso
- Projection-based Inference for High-dimensional Linear Models
- Nearly optimal Bayesian shrinkage for high-dimensional regression
- Inference under Fine-Gray competing risks model with high-dimensional covariates
- Rate optimal estimation and confidence intervals for high-dimensional regression with missing covariates
- Time-dependent Poisson reduced rank models for political text data analysis
- Inference for High-Dimensional Censored Quantile Regression
- Covariate-adjusted inference for differential analysis of high-dimensional networks
- Finite sample performance of linear least squares estimation
- FDR control and power analysis for high-dimensional logistic regression via Stabkoff
- Poisson subsampling-based estimation for growing-dimensional expectile regression in massive data
- Distributed testing and estimation under sparse high dimensional models
- Uniformly valid post-regularization confidence regions for many functional parameters in z-estimation framework
- Simultaneous inference for pairwise graphical models with generalized score matching
- Nonparametric inference via bootstrapping the debiased estimator
- Discussion of “A Scale-Free Approach for False Discovery Rate Control in Generalized Linear Models” by Dai, Lin, Xing, and Liu
- Discussion on: “A Scale-Free Approach for False Discovery Rate Control in Generalized Linear Models” by Dai, Lin, Xing, Liu
- Discussion of “A Scale-Free Approach for False Discovery Rate Control in Generalized Linear Models”
- Debiasing the Lasso: optimal sample size for Gaussian designs
- Confidence sets in sparse regression