On asymptotically optimal confidence regions and tests for high-dimensional models

From MaRDI portal
Publication:95759

DOI: 10.1214/14-aos1221
zbMath: 1305.62259
arXiv: 1303.0518
OpenAlex: W3099550161
MaRDI QID: Q95759

Sara van de Geer, Peter Bühlmann, Ya'acov Ritov, Ruben Dezeure

Publication date: 1 June 2014

Published in: The Annals of Statistics

Full work available at URL: https://arxiv.org/abs/1303.0518



Related Items

Greedy algorithms for prediction, Worst possible sub-directions in high-dimensional models, Covariate-adjusted inference for differential analysis of high-dimensional networks, Statistical inference in sparse high-dimensional additive models, Lasso-driven inference in time and space, Significance testing in non-sparse high-dimensional linear models, Maximum-type tests for high-dimensional regression coefficients using Wilcoxon scores, Mathematical foundations of machine learning. Abstracts from the workshop held March 21--27, 2021 (hybrid meeting), Testability of high-dimensional linear models with nonsparse structures, Inference for low-rank tensors -- no need to debias, Confidence intervals for high-dimensional partially linear single-index models, Recent advances in statistical methodologies in evaluating program for high-dimensional data, On the post selection inference constant under restricted isometry properties, Exact post-selection inference, with application to the Lasso, A unified theory of confidence regions and testing for high-dimensional estimating equations, Constructing confidence intervals for the signals in sparse phase retrieval, High-dimensional sufficient dimension reduction through principal projections, De-biasing the Lasso with degrees-of-freedom adjustment, Doubly robust semiparametric inference using regularized calibrated estimation with high-dimensional data, The benefit of group sparsity in group inference with de-biased scaled group Lasso, Geometric inference for general high-dimensional linear inverse problems, Oracle inequalities, variable selection and uniform inference in high-dimensional correlated random effects panel data models, Testing a single regression coefficient in high dimensional linear models, Regression analysis for microbiome compositional data, Kernel-penalized regression for analysis of microbiome data, Distributed testing and estimation under sparse high dimensional models, Sparse linear models and 
\(l_1\)-regularized 2SLS with high-dimensional endogenous regressors and instruments, High-dimensional inference for personalized treatment decision, Variable selection for high dimensional Gaussian copula regression model: an adaptive hypothesis testing procedure, Confidence regions for entries of a large precision matrix, Controlling the false discovery rate via knockoffs, Confidence intervals for the means of the selected populations, High-dimensional simultaneous inference with the bootstrap, Rejoinder on: ``High-dimensional simultaneous inference with the bootstrap, SLOPE-adaptive variable selection via convex optimization, Asymptotically honest confidence regions for high dimensional parameters by the desparsified conservative Lasso, High-dimensional inference in misspecified linear models, Generalized M-estimators for high-dimensional Tobit I models, Robust inference on average treatment effects with possibly more covariates than observations, Additive model selection, ROCKET: robust confidence intervals via Kendall's tau for transelliptical graphical models, Uniformly valid post-regularization confidence regions for many functional parameters in z-estimation framework, Debiasing the Lasso: optimal sample size for Gaussian designs, Adaptive estimation of high-dimensional signal-to-noise ratios, Online rules for control of false discovery rate and false discovery exceedance, Regularization and the small-ball method. 
I: Sparse recovery, Selective inference with a randomized response, Detecting rare and faint signals via thresholding maximum likelihood estimators, Uniformly valid confidence sets based on the Lasso, Nonparametric inference via bootstrapping the debiased estimator, Group Inference in High Dimensions with Applications to Hierarchical Testing, Time-dependent Poisson reduced rank models for political text data analysis, Rate optimal estimation and confidence intervals for high-dimensional regression with missing covariates, Inference under Fine-Gray competing risks model with high-dimensional covariates, Two-directional simultaneous inference for high-dimensional models, Testing Endogeneity with High Dimensional Covariates, Model-assisted inference for treatment effects using regularized calibrated estimation with high-dimensional data, On asymptotically optimal confidence regions and tests for high-dimensional models, Lasso Inference for High-Dimensional Time Series, desla, Distribution-Free Predictive Inference For Regression, Transfer Learning under High-dimensional Generalized Linear Models, Confidence intervals for high-dimensional inverse covariance estimation, High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}, Inference in high dimensional linear measurement error models, Exact adaptive confidence intervals for linear regression coefficients, An Automated Approach Towards Sparse Single-Equation Cointegration Modelling, Optimal sparsity testing in linear regression model, Bootstrap based inference for sparse high-dimensional time series models, Inference without compatibility: using exponential weighting for inference on a parameter of a linear model, Model selection with mixed variables on the Lasso path, A Projection Based Conditional Dependence Measure with Applications to High-dimensional Undirected Graphical Models, Efficient estimation of smooth functionals in Gaussian shift models, Efficient distributed estimation 
of high-dimensional sparse precision matrix for transelliptical graphical models, Flexible and Interpretable Models for Survival Data, Semiparametric efficiency bounds for high-dimensional models, Non-asymptotic error controlled sparse high dimensional precision matrix estimation, High-dimensional variable selection via low-dimensional adaptive learning, Multicarving for high-dimensional post-selection inference, Asymptotic normality and optimalities in estimation of large Gaussian graphical models, Honest confidence regions and optimality in high-dimensional precision matrix estimation, Gaussian graphical model estimation with false discovery rate control, The de-biased group Lasso estimation for varying coefficient models, Second-order Stein: SURE for SURE and other applications in high-dimensional inference, The distribution of the Lasso: uniform control over sparse balls and adaptive parameter tuning, An overview of tests on high-dimensional means, Optimal linear discriminators for the discrete choice model in growing dimensions, Inference for high-dimensional varying-coefficient quantile regression, Scale calibration for high-dimensional robust regression, High-dimensional inference for linear model with correlated errors, In defense of the indefensible: a very naïve approach to high-dimensional inference, Some perspectives on inference in high dimensions, Distributed adaptive Huber regression, A convex programming solution based debiased estimator for quantile with missing response and high-dimensional covariables, Confidence intervals for parameters in high-dimensional sparse vector autoregression, Spatially relaxed inference on high-dimensional linear models, Network differential connectivity analysis, Asymptotic normality of robust \(M\)-estimators with convex penalty, Design of c-optimal experiments for high-dimensional linear models, Discussion of big Bayes stories and BayesBag, Discussion on ‘A review of distributed statistical inference’, A selective 
review of statistical methods using calibration information from similar studies, The robust desparsified lasso and the focused information criterion for high-dimensional generalized linear models, Markov Neighborhood Regression for High-Dimensional Inference, Testing Shape Constraints in Lasso Regularized Joinpoint Regression, Projection-based Inference for High-dimensional Linear Models, Inference for high dimensional linear models with error-in-variables, Exploiting Disagreement Between High-Dimensional Variable Selectors for Uncertainty Visualization, New Tests for High-Dimensional Linear Regression Based on Random Projection, Statistical Inference for High-Dimensional Models via Recursive Online-Score Estimation, Targeted Inference Involving High-Dimensional Data Using Nuisance Penalized Regression, On high-dimensional Poisson models with measurement error: hypothesis testing for nonlinear nonconvex optimization, Doubly robust tests of exposure effects under high‐dimensional confounding, Statistical inference for Cox proportional hazards models with a diverging number of covariates, Generalized linear models with structured sparsity estimators, Efficient multiple change point detection for high‐dimensional generalized linear models, Inducement of population sparsity, A structured brain‐wide and genome‐wide association study using ADNI PET images, Statistical Inference, Learning and Models in Big Data, Inference on heterogeneous treatment effects in high‐dimensional dynamic panels under weak dependence, Automatic bias correction for testing in high‐dimensional linear models, Score Tests With Incomplete Covariates and High-Dimensional Auxiliary Variables, Uniformly valid inference based on the Lasso in linear mixed models, Variable selection and debiased estimation for single‐index expectile model, Poststratification fusion learning in longitudinal data analysis, A weak‐signal‐assisted procedure for variable selection and statistical inference with an 
informative subsample, Uniformly valid inference for partially linear high-dimensional single-index models, A Scale-Free Approach for False Discovery Rate Control in Generalized Linear Models, Discussion of “A Scale-Free Approach for False Discovery Rate Control in Generalized Linear Models” by Dai, Lin, Xing, and Liu, Discussion on: “A Scale-Free Approach for False Discovery Rate Control in Generalized Linear Models” by Dai, Lin, Xing, Liu, Discussion of “A Scale-Free Approach for False Discovery Rate Control in Generalized Linear Models”, Orthogonalized Kernel Debiased Machine Learning for Multimodal Data Analysis, Sparse Topic Modeling: Computational Efficiency, Near-Optimal Algorithms, and Statistical Inference, CEDAR: Communication Efficient Distributed Analysis for Regressions, Debiased lasso for generalized linear models with a diverging number of covariates, Penalized Regression for Multiple Types of Many Features With Missing Data, Weak Signal Identification and Inference in Penalized Likelihood Models for Categorical Responses, A multitest procedure for testing MTP2 for Gaussian distributions, Deconfounding and Causal Regularisation for Stability and External Validity, POST-SELECTION INFERENCE IN THREE-DIMENSIONAL PANEL DATA, Double bias correction for high-dimensional sparse additive hazards regression with covariate measurement errors, Estimation of semiparametric regression model with right-censored high-dimensional data, Simultaneous test for linear model via projection, Controlling False Discovery Rate Using Gaussian Mirrors, Model-Assisted Uniformly Honest Inference for Optimal Treatment Regimes in High Dimension, Treatment Effect Estimation Under Additive Hazards Models With High-Dimensional Confounding, Honest Confidence Sets for High-Dimensional Regression by Projection and Shrinkage, Scalable and efficient inference via CPE, Statistical Inference for High-Dimensional Generalized Linear Models With Binary Outcomes, Inference for High-Dimensional 
Linear Mixed-Effects Models: A Quasi-Likelihood Approach, Individual Data Protected Integrative Regression Analysis of High-Dimensional Heterogeneous Data, Integrative Factor Regression and Its Inference for Multimodal Data Analysis, Tuning parameter selection for penalized estimation via \(R^2\), Debiasing convex regularized estimators and interval estimation in linear models, High-dimensional robust inference for censored linear models, The EAS approach to variable selection for multivariate response data in high-dimensional settings, Directed graphs and variable selection in large vector autoregressive models, Generalized matrix decomposition regression: estimation and inference for two-way structured data, Debiased Lasso for stratified Cox models with application to the national kidney transplant data, A tuning-free efficient test for marginal linear effects in high-dimensional quantile regression, Robust inference for high‐dimensional single index models, Online inference in high-dimensional generalized linear models with streaming data, False Discovery Rate Control via Data Splitting, Distributionally robust and generalizable inference, Inference for high‐dimensional linear models with locally stationary error processes, Post-selection Inference of High-dimensional Logistic Regression Under Case–Control Design, Sparse generalized Yule-Walker estimation for large spatio-temporal autoregressions with an application to NO\(_2\) satellite data, Spike-and-Slab Group Lassos for Grouped Regression and Sparse Generalized Additive Models, A Sparse Random Projection-Based Test for Overall Qualitative Treatment Effects, iFusion: Individualized Fusion Learning, Sparse Identification and Estimation of Large-Scale Vector AutoRegressive Moving Averages, Discussion, Moderate-Dimensional Inferences on Quadratic Functionals in Ordinary Least Squares, UNIFORM INFERENCE IN HIGH-DIMENSIONAL DYNAMIC PANEL DATA MODELS WITH APPROXIMATELY SPARSE FIXED EFFECTS, 
Confidence Intervals for Sparse Penalized Regression With Random Designs, Communication-efficient estimation of high-dimensional quantile regression, A Bootstrap Lasso + Partial Ridge Method to Construct Confidence Intervals for Parameters in High-dimensional Sparse Linear Models, THE FACTOR-LASSO AND K-STEP BOOTSTRAP APPROACH FOR INFERENCE IN HIGH-DIMENSIONAL ECONOMIC APPLICATIONS, Comment on “A Tuning-Free Robust and Efficient Approach to High-Dimensional Regression”, Rejoinder to “A Tuning-Free Robust and Efficient Approach to High-Dimensional Regression”, Fixed Effects Testing in High-Dimensional Linear Mixed Models, Kernel Meets Sieve: Post-Regularization Confidence Bands for Sparse Additive Model, A Sequential Significance Test for Treatment by Covariate Interactions, Capturing Spike Variability in Noisy Izhikevich Neurons Using Point Process Generalized Linear Models, Permutation testing in high-dimensional linear models: an empirical investigation, A significance test for graph‐constrained estimation, Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation, Optimal Estimation of Genetic Relatedness in High-Dimensional Linear Models, Communication-Efficient Distributed Statistical Inference, Valid Post-Selection Inference in High-Dimensional Approximately Sparse Quantile Regression Models, Comments on: ``High-dimensional simultaneous inference with the bootstrap, Bootstrap inference for penalized GMM estimators with oracle properties, An upper bound for functions of estimators in high dimensions, High-dimensional statistical inference via DATE, Partitioned Approach for High-dimensional Confidence Intervals with Large Split Sizes, Regularized projection score estimation of treatment effects in high-dimensional quantile regression, Elastic-net Regularized High-dimensional Negative Binomial Regression: Consistency and Weak Signal Detection, Hypothesis Testing for 
Network Data with Power Enhancement, Hypothesis testing for high-dimensional multivariate regression with false discovery rate control, Scalable inference for high-dimensional precision matrix, The asymptotic distribution of the MLE in high-dimensional logistic models: arbitrary covariance, Post-model-selection inference in linear regression models: an integrated review, Meta-analytic Gaussian network aggregation, Objective Bayesian edge screening and structure selection for Ising networks, Thresholding tests based on affine Lasso to achieve non-asymptotic nominal level and high power under sparse and dense alternatives in high dimension, Doubly debiased Lasso: high-dimensional inference under hidden confounding, A generalized likelihood-based Bayesian approach for scalable joint regression and covariance selection in high dimensions, More powerful genetic association testing via a new statistical framework for integrative genomics, Confidence intervals for high-dimensional Cox models, Single-index composite quantile regression for ultra-high-dimensional data, Variable selection in expectile regression, A unifying framework of high-dimensional sparse estimation with difference-of-convex (DC) regularizations, Inferences in panel data with interactive effects using large covariance matrices, A penalized approach to covariate selection through quantile regression coefficient models, Asymptotically efficient estimation of smooth functionals of covariance operators, Robust post-selection inference of high-dimensional mean regression with heavy-tailed asymmetric or heteroskedastic errors, Statistical inference for model parameters in stochastic gradient descent, Uniformly valid confidence intervals post-model-selection, Efficient estimation of linear functionals of principal components, Composite quantile regression for massive datasets, Accuracy assessment for high-dimensional linear regression, SONIC: social network analysis 
with influencers and communities, Hierarchical inference for genome-wide association studies: a view on methodology with software, Tests for Coefficients in High-dimensional Additive Hazard Models, Predictor ranking and false discovery proportion control in high-dimensional regression, Inference for high-dimensional instrumental variables regression, Estimation of a multiplicative correlation structure in the large dimensional case, Debiasing the debiased Lasso with bootstrap, Detangling robustness in high dimensions: composite versus model-averaged estimation, Nearly optimal Bayesian shrinkage for high-dimensional regression, The costs and benefits of uniformly valid causal inference with high-dimensional nuisance parameters, Double-estimation-friendly inference for high-dimensional misspecified models, Statistical proof? The problem of irreproducibility, Robust machine learning by median-of-means: theory and practice, A two-step method for estimating high-dimensional Gaussian graphical models, Statistical inference via conditional Bayesian posteriors in high-dimensional linear regression, Goodness-of-Fit Tests for High Dimensional Linear Models, Confidence intervals for sparse precision matrix estimation via Lasso penalized D-trace loss, Communication-efficient sparse composite quantile regression for distributed data, Most powerful test against a sequence of high dimensional local alternatives, Structural inference in sparse high-dimensional vector autoregressions, Confidence sets in sparse regression, Test of significance for high-dimensional longitudinal data, Relaxing the assumptions of knockoffs by conditioning, Inference on the change point under a high dimensional sparse mean shift, On the uniform convergence of empirical norms and inner products, with application to causal inference, Ill-posed estimation in high-dimensional models with instrumental variables, A High‐dimensional Focused 
Information Criterion, A statistical mechanics approach to de-biasing and uncertainty estimation in LASSO for random measurements, On High-Dimensional Constrained Maximum Likelihood Inference, Debiased Inference on Treatment Effect in a High-Dimensional Model, Finite sample performance of linear least squares estimation, Variable importance assessments and backward variable selection for multi-sample problems, Testing regression coefficients in high-dimensional and sparse settings, Inference of large modified Poisson-type graphical models: application to RNA-seq data in childhood atopic asthma studies, Pivotal estimation via square-root lasso in nonparametric regression, The likelihood ratio test in high-dimensional logistic regression is asymptotically a rescaled Chi-square, CONSISTENT AND CONSERVATIVE MODEL SELECTION WITH THE ADAPTIVE LASSO IN STATIONARY AND NONSTATIONARY AUTOREGRESSIONS, Network classification with applications to brain connectomics, Testing for high-dimensional network parameters in auto-regressive models, Variable selection via adaptive false negative control in linear regression, Bootstrapping and sample splitting for high-dimensional, assumption-lean inference, ROS regression: integrating regularization with optimal scaling regression, Lasso meets horseshoe: a survey, Distributed simultaneous inference in generalized linear models via confidence distribution, On rank estimators in increasing dimensions, Testing for Inequality Constraints in Singular Models by Trimming or Winsorizing the Variance Matrix, Optimal designs in sparse linear models, Perturbation bootstrap in adaptive Lasso, Convergence rates of least squares regression estimators with heavy-tailed errors, Composite versus model-averaged quantile regression, Sparse Poisson regression with penalized weighted score function, Nonsparse Learning with Latent Variables, On the asymptotic 
variance of the debiased Lasso, Linear hypothesis testing for high dimensional generalized linear models, Inter-Subject Analysis: A Partial Gaussian Graphical Model Approach, Global and Simultaneous Hypothesis Testing for High-Dimensional Logistic Regression Models, Panel data quantile regression with grouped fixed effects, Linear Hypothesis Testing in Dense High-Dimensional Linear Models, Retire: robust expectile regression in high dimensions, Inference on the best policies with many covariates, Reprint: Statistical inference for linear mediation models with high-dimensional mediators and application to studying stock reaction to COVID-19 pandemic, Optimal decorrelated score subsampling for generalized linear models with massive data, Statistical inference for linear mediation models with high-dimensional mediators and application to studying stock reaction to COVID-19 pandemic, Inference for sparse linear regression based on the leave-one-covariate-out solution path, Inference for High-Dimensional Censored Quantile Regression, Online Debiasing for Adaptively Collected High-Dimensional Data With Applications to Time Series Analysis, Two-stage communication-efficient distributed sparse M-estimation with missing data, High-dimensional inference robust to outliers with ℓ1-norm penalization, Neighborhood-based cross fitting approach to treatment effects with high-dimensional data, On lower bounds for the bias-variance trade-off, Universality of regularized regression estimators in high dimensions, The Lasso with general Gaussian designs with applications to hypothesis testing, Statistical performance of quantile tensor regression with convex regularization, Robust Signal Recovery for High-Dimensional Linear Log-Contrast Models with Compositional Covariates, StarTrek: combinatorial variable selection with false discovery rate control, Communication‐efficient low‐dimensional parameter estimation and inference for high‐dimensional Lp$$ {L}^p $$‐quantile regression, On 
estimation of nonparametric regression models with autoregressive and moving average errors, Estimation and inference in sparse multivariate regression and conditional Gaussian graphical models under an unbalanced distributed setting


Uses Software


Cites Work