Confidence Intervals and Hypothesis Testing for High-Dimensional Regression

From MaRDI portal

zbMath: 1319.62145 · arXiv: 1306.3171 · MaRDI QID: Q2934105

Adel Javanmard, Andrea Montanari

Publication date: 8 December 2014

Full work available at URL: https://arxiv.org/abs/1306.3171



Related Items

A review of distributed statistical inference
The robust desparsified lasso and the focused information criterion for high-dimensional generalized linear models
Markov Neighborhood Regression for High-Dimensional Inference
IFAA: Robust Association Identification and Inference for Absolute Abundance in Microbiome Analyses
Partitioned Approach for High-dimensional Confidence Intervals with Large Split Sizes
Regularized projection score estimation of treatment effects in high-dimensional quantile regression
Distributed Sufficient Dimension Reduction for Heterogeneous Massive Data
Scalable inference for high-dimensional precision matrix
High-Dimensional Inference for Cluster-Based Graphical Models
Projection-based Inference for High-dimensional Linear Models
More powerful genetic association testing via a new statistical framework for integrative genomics
Hypothesis Testing in High-Dimensional Instrumental Variables Regression With an Application to Genomics Data
Confidence intervals for high-dimensional Cox models
A penalized approach to covariate selection through quantile regression coefficient models
Inference for high dimensional linear models with error-in-variables
Penalised robust estimators for sparse and high-dimensional linear models
Penalized expectile regression: an alternative to penalized quantile regression
Asymptotic Theory of \(\boldsymbol{\ell}_1\)-Regularized PDE Identification from a Single Noisy Trajectory
New Tests for High-Dimensional Linear Regression Based on Random Projection
Statistical Inference for High-Dimensional Models via Recursive Online-Score Estimation
Targeted Inference Involving High-Dimensional Data Using Nuisance Penalized Regression
Statistical inference for Cox proportional hazards models with a diverging number of covariates
Statistical Inference, Learning and Models in Big Data
Inference on heterogeneous treatment effects in high‐dimensional dynamic panels under weak dependence
Automatic bias correction for testing in high‐dimensional linear models
Variable selection and debiased estimation for single‐index expectile model
A weak‐signal‐assisted procedure for variable selection and statistical inference with an informative subsample
A Scale-Free Approach for False Discovery Rate Control in Generalized Linear Models
Discussion of “A Scale-Free Approach for False Discovery Rate Control in Generalized Linear Models”
Sparse Topic Modeling: Computational Efficiency, Near-Optimal Algorithms, and Statistical Inference
CEDAR: Communication Efficient Distributed Analysis for Regressions
Debiased lasso for generalized linear models with a diverging number of covariates
Weak Signal Identification and Inference in Penalized Likelihood Models for Categorical Responses
POST-SELECTION INFERENCE IN THREE-DIMENSIONAL PANEL DATA
Double bias correction for high-dimensional sparse additive hazards regression with covariate measurement errors
Simultaneous test for linear model via projection
Model-Assisted Uniformly Honest Inference for Optimal Treatment Regimes in High Dimension
Honest Confidence Sets for High-Dimensional Regression by Projection and Shrinkage
Scalable and efficient inference via CPE
Debiased machine learning of set-identified linear models
The costs and benefits of uniformly valid causal inference with high-dimensional nuisance parameters
Statistical inference via conditional Bayesian posteriors in high-dimensional linear regression
Goodness-of-Fit Tests for High Dimensional Linear Models
Confidence intervals for sparse precision matrix estimation via Lasso penalized D-trace loss
Communication-efficient sparse composite quantile regression for distributed data
Distributed Bayesian posterior voting strategy for massive data
iFusion: Individualized Fusion Learning
Most powerful test against a sequence of high dimensional local alternatives
Bias corrected regularization kernel method in ranking
Confidence sets in sparse regression
Sparse Identification and Estimation of Large-Scale Vector AutoRegressive Moving Averages
Discussion
UNIFORM INFERENCE IN HIGH-DIMENSIONAL DYNAMIC PANEL DATA MODELS WITH APPROXIMATELY SPARSE FIXED EFFECTS
A Large-Scale Constrained Joint Modeling Approach for Predicting User Activity, Engagement, and Churn With Application to Freemium Mobile Games
Confidence Intervals for Sparse Penalized Regression With Random Designs
Communication-efficient estimation of high-dimensional quantile regression
A Bootstrap Lasso + Partial Ridge Method to Construct Confidence Intervals for Parameters in High-dimensional Sparse Linear Models
Comment on “A Tuning-Free Robust and Efficient Approach to High-Dimensional Regression”
Fixed Effects Testing in High-Dimensional Linear Mixed Models
Kernel Meets Sieve: Post-Regularization Confidence Bands for Sparse Additive Model
A High‐dimensional Focused Information Criterion
A statistical mechanics approach to de-biasing and uncertainty estimation in LASSO for random measurements
On High-Dimensional Constrained Maximum Likelihood Inference
Bootstrapping and sample splitting for high-dimensional, assumption-lean inference
Lasso meets horseshoe: a survey
OR Forum—An Algorithmic Approach to Linear Regression
Optimal designs in sparse linear models
Optimal Estimation of Genetic Relatedness in High-Dimensional Linear Models
Composite versus model-averaged quantile regression
Nonsparse Learning with Latent Variables
On the asymptotic variance of the debiased Lasso
A knockoff filter for high-dimensional selective inference
Inter-Subject Analysis: A Partial Gaussian Graphical Model Approach
Global and Simultaneous Hypothesis Testing for High-Dimensional Logistic Regression Models
Comments on: “High-dimensional simultaneous inference with the bootstrap”
Linear Hypothesis Testing in Dense High-Dimensional Linear Models
Discussion: “A significance test for the lasso”
Discussion: “A significance test for the lasso”
Discussion: “A significance test for the lasso”
Discussion: “A significance test for the lasso”
Discussion: “A significance test for the lasso”
High-dimensional statistical inference via DATE
Statistical Inference for High-Dimensional Generalized Linear Models With Binary Outcomes
Kernel Ordinary Differential Equations
Inference for High-Dimensional Linear Mixed-Effects Models: A Quasi-Likelihood Approach
Individual Data Protected Integrative Regression Analysis of High-Dimensional Heterogeneous Data
Integrative Factor Regression and Its Inference for Multimodal Data Analysis
Tuning parameter selection for penalized estimation via \(R^2\)
Debiasing convex regularized estimators and interval estimation in linear models
High-dimensional robust inference for censored linear models
Directed graphs and variable selection in large vector autoregressive models
Generalized matrix decomposition regression: estimation and inference for two-way structured data
Debiased Lasso for stratified Cox models with application to the national kidney transplant data
A tuning-free efficient test for marginal linear effects in high-dimensional quantile regression
Robust inference for high‐dimensional single index models
Online inference in high-dimensional generalized linear models with streaming data
False Discovery Rate Control via Data Splitting
Inference for high‐dimensional linear models with locally stationary error processes
Post-selection Inference of High-dimensional Logistic Regression Under Case–Control Design
Retire: robust expectile regression in high dimensions
Optimal decorrelated score subsampling for generalized linear models with massive data
Inference for sparse linear regression based on the leave-one-covariate-out solution path
Inference for High-Dimensional Censored Quantile Regression
Online Debiasing for Adaptively Collected High-Dimensional Data With Applications to Time Series Analysis
High-dimensional inference robust to outliers with ℓ1-norm penalization
Neighborhood-based cross fitting approach to treatment effects with high-dimensional data
Universality of regularized regression estimators in high dimensions
The Lasso with general Gaussian designs with applications to hypothesis testing
Rank-Based Greedy Model Averaging for High-Dimensional Survival Data
StarTrek: combinatorial variable selection with false discovery rate control
Communication‐efficient low‐dimensional parameter estimation and inference for high‐dimensional \(L^p\)‐quantile regression
Moderate-Dimensional Inferences on Quadratic Functionals in Ordinary Least Squares
LIC criterion for optimal subset selection in distributed interval estimation
Powerful knockoffs via minimizing reconstructability
Worst possible sub-directions in high-dimensional models
Covariate-adjusted inference for differential analysis of high-dimensional networks
Statistical inference in sparse high-dimensional additive models
Significance testing in non-sparse high-dimensional linear models
Familywise error rate control via knockoffs
Mathematical foundations of machine learning. Abstracts from the workshop held March 21--27, 2021 (hybrid meeting)
Testability of high-dimensional linear models with nonsparse structures
Inference for low-rank tensors -- no need to debias
Confidence intervals for high-dimensional partially linear single-index models
Recent advances in statistical methodologies in evaluating program for high-dimensional data
Exact post-selection inference, with application to the Lasso
Sparse matrix linear models for structured high-throughput data
A unified theory of confidence regions and testing for high-dimensional estimating equations
Semi-supervised empirical risk minimization: using unlabeled data to improve prediction
High-dimensional sufficient dimension reduction through principal projections
De-biasing the Lasso with degrees-of-freedom adjustment
Doubly robust semiparametric inference using regularized calibrated estimation with high-dimensional data
The asymptotic distribution of the MLE in high-dimensional logistic models: arbitrary covariance
Post-model-selection inference in linear regression models: an integrated review
Meta-analytic Gaussian network aggregation
The benefit of group sparsity in group inference with de-biased scaled group Lasso
Geometric inference for general high-dimensional linear inverse problems
Thresholding tests based on affine Lasso to achieve non-asymptotic nominal level and high power under sparse and dense alternatives in high dimension
Regression analysis for microbiome compositional data
AIC for the Lasso in generalized linear models
Doubly debiased Lasso: high-dimensional inference under hidden confounding
Distributed testing and estimation under sparse high dimensional models
Hierarchical correction of \(p\)-values via an ultrametric tree running Ornstein-Uhlenbeck process
Sparse linear models and \(l_1\)-regularized 2SLS with high-dimensional endogenous regressors and instruments
Single-index composite quantile regression for ultra-high-dimensional data
A unifying framework of high-dimensional sparse estimation with difference-of-convex (DC) regularizations
Variable selection for high dimensional Gaussian copula regression model: an adaptive hypothesis testing procedure
Robust post-selection inference of high-dimensional mean regression with heavy-tailed asymmetric or heteroskedastic errors
Statistical inference for model parameters in stochastic gradient descent
Efficient estimation of linear functionals of principal components
Hierarchical inference for genome-wide association studies: a view on methodology with software
Controlling the false discovery rate via knockoffs
Inference for high-dimensional instrumental variables regression
Debiasing the debiased Lasso with bootstrap
Detangling robustness in high dimensions: composite versus model-averaged estimation
High-dimensional simultaneous inference with the bootstrap
Rejoinder on: “High-dimensional simultaneous inference with the bootstrap”
SLOPE-adaptive variable selection via convex optimization
Statistical unfolding of elementary particle spectra: empirical Bayes estimation and bias-corrected uncertainty quantification
Asymptotically honest confidence regions for high dimensional parameters by the desparsified conservative Lasso
Relaxing the assumptions of knockoffs by conditioning
Analysis of phenotypic- and estimated breeding values (EBV) to dissect the genetic architecture of complex traits in a Scots pine three-generation pedigree design
Ill-posed estimation in high-dimensional models with instrumental variables
High-dimensional inference in misspecified linear models
Robust regression with compositional covariates
Innovated scalable efficient inference for ultra-large graphical models
ROCKET: robust confidence intervals via Kendall's tau for transelliptical graphical models
Uniformly valid post-regularization confidence regions for many functional parameters in z-estimation framework
Debiasing the Lasso: optimal sample size for Gaussian designs
Adaptive estimation of high-dimensional signal-to-noise ratios
Online rules for control of false discovery rate and false discovery exceedance
On sequential confidence estimation of parameters of stochastic dynamical systems with conditionally Gaussian noises
Selective inference with a randomized response
Uniformly valid confidence sets based on the Lasso
Usage of the GO estimator in high dimensional linear models
A significance test for the lasso
Discussion: “A significance test for the lasso”
Rejoinder: “A significance test for the lasso”
Nonparametric inference via bootstrapping the debiased estimator
Group Inference in High Dimensions with Applications to Hierarchical Testing
Rate optimal estimation and confidence intervals for high-dimensional regression with missing covariates
A global homogeneity test for high-dimensional linear regression
Testing Endogeneity with High Dimensional Covariates
Model-assisted inference for treatment effects using regularized calibrated estimation with high-dimensional data
On asymptotically optimal confidence regions and tests for high-dimensional models
Lasso Inference for High-Dimensional Time Series
Distribution-Free Predictive Inference For Regression
Confidence intervals for high-dimensional inverse covariance estimation
Testing for high-dimensional network parameters in auto-regressive models
Variable selection via adaptive false negative control in linear regression
High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
Inference in high dimensional linear measurement error models
Optimal sparsity testing in linear regression model
Inference without compatibility: using exponential weighting for inference on a parameter of a linear model
Flexible and Interpretable Models for Survival Data
Semiparametric efficiency bounds for high-dimensional models
Non-asymptotic error controlled sparse high dimensional precision matrix estimation
Multicarving for high-dimensional post-selection inference
Honest confidence regions and optimality in high-dimensional precision matrix estimation
The de-biased group Lasso estimation for varying coefficient models
Second-order Stein: SURE for SURE and other applications in high-dimensional inference
The distribution of the Lasso: uniform control over sparse balls and adaptive parameter tuning
Augmented minimax linear estimation
Scale calibration for high-dimensional robust regression
High-dimensional inference for linear model with correlated errors
In defense of the indefensible: a very naïve approach to high-dimensional inference
Some perspectives on inference in high dimensions
A convex programming solution based debiased estimator for quantile with missing response and high-dimensional covariables
Confidence intervals for parameters in high-dimensional sparse vector autoregression
Spatially relaxed inference on high-dimensional linear models
Network differential connectivity analysis
Asymptotic normality of robust \(M\)-estimators with convex penalty
Design of c-optimal experiments for high-dimensional linear models
High-dimensional linear models with many endogenous variables