Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models

Publication:5743269


DOI: 10.1111/rssb.12026
zbMath: 1411.62196
arXiv: 1110.2563
OpenAlex: W2069119359
MaRDI QID: Q5743269

Authors: Cun-Hui Zhang, Stephanie S. Zhang

Publication date: 9 May 2019

Published in: Journal of the Royal Statistical Society Series B: Statistical Methodology

Full work available at URL: https://arxiv.org/abs/1110.2563



Related Items

Doubly robust tests of exposure effects under high‐dimensional confounding, Statistical inference for Cox proportional hazards models with a diverging number of covariates, A new test for high‐dimensional regression coefficients in partially linear models, Inducement of population sparsity, Assessing mediating effects of high‐dimensional microbiome measurements in dietary intervention studies, Inference on heterogeneous treatment effects in high‐dimensional dynamic panels under weak dependence, Score Tests With Incomplete Covariates and High-Dimensional Auxiliary Variables, Variable selection and debiased estimation for single‐index expectile model, Poststratification fusion learning in longitudinal data analysis, A weak‐signal‐assisted procedure for variable selection and statistical inference with an informative subsample, Assessing the Most Vulnerable Subgroup to Type II Diabetes Associated with Statin Usage: Evidence from Electronic Health Record Data, A Scale-Free Approach for False Discovery Rate Control in Generalized Linear Models, Discussion of “A Scale-Free Approach for False Discovery Rate Control in Generalized Linear Models” by Dai, Lin, Xing, and Liu, Discussion of “A Scale-Free Approach for False Discovery Rate Control in Generalized Linear Models”, Orthogonalized Kernel Debiased Machine Learning for Multimodal Data Analysis, Sparse Topic Modeling: Computational Efficiency, Near-Optimal Algorithms, and Statistical Inference, CEDAR: Communication Efficient Distributed Analysis for Regressions, Debiased lasso for generalized linear models with a diverging number of covariates, Penalized Regression for Multiple Types of Many Features With Missing Data, Weak Signal Identification and Inference in Penalized Likelihood Models for Categorical Responses, Deconfounding and Causal Regularisation for Stability and External Validity, POST-SELECTION INFERENCE IN THREE-DIMENSIONAL PANEL DATA, Double bias correction for high-dimensional sparse additive hazards regression with covariate measurement errors, Simultaneous test for linear model via projection, Controlling False Discovery Rate Using Gaussian Mirrors, Model-Assisted Uniformly Honest Inference for Optimal Treatment Regimes in High Dimension, Honest Confidence Sets for High-Dimensional Regression by Projection and Shrinkage, Scalable and efficient inference via CPE, Debiased machine learning of set-identified linear models, Statistical Inference for High-Dimensional Generalized Linear Models With Binary Outcomes, Kernel Ordinary Differential Equations, Inference for High-Dimensional Linear Mixed-Effects Models: A Quasi-Likelihood Approach, Testing Mediation Effects Using Logic of Boolean Matrices, Individual Data Protected Integrative Regression Analysis of High-Dimensional Heterogeneous Data, Integrative Factor Regression and Its Inference for Multimodal Data Analysis, Tuning parameter selection for penalized estimation via \(R^2\), Debiasing convex regularized estimators and interval estimation in linear models, High-dimensional robust inference for censored linear models, Directed graphs and variable selection in large vector autoregressive models, Generalized matrix decomposition regression: estimation and inference for two-way structured data, Debiased Lasso for stratified Cox models with application to the national kidney transplant data, Robust inference for high‐dimensional single index models, Online inference in high-dimensional generalized linear models with streaming data, False Discovery Rate Control via Data 
Splitting, Distributionally robust and generalizable inference, Inference for high‐dimensional linear models with locally stationary error processes, Post-selection Inference of High-dimensional Logistic Regression Under Case–Control Design, Retire: robust expectile regression in high dimensions, Inference on the best policies with many covariates, Reprint: Statistical inference for linear mediation models with high-dimensional mediators and application to studying stock reaction to COVID-19 pandemic, Optimal decorrelated score subsampling for generalized linear models with massive data, Statistical inference for linear mediation models with high-dimensional mediators and application to studying stock reaction to COVID-19 pandemic, Inference for sparse linear regression based on the leave-one-covariate-out solution path, Inference for High-Dimensional Censored Quantile Regression, Online Debiasing for Adaptively Collected High-Dimensional Data With Applications to Time Series Analysis, High-dimensional inference robust to outliers with ℓ1-norm penalization, Neighborhood-based cross fitting approach to treatment effects with high-dimensional data, On lower bounds for the bias-variance trade-off, Universality of regularized regression estimators in high dimensions, The Lasso with general Gaussian designs with applications to hypothesis testing, Communication‐efficient low‐dimensional parameter estimation and inference for high‐dimensional \(L^p\)‐quantile regression, A penalised bootstrap estimation procedure for the explained Gini coefficient, Estimation and inference in sparse multivariate regression and conditional Gaussian graphical models under an unbalanced distributed setting, Discussion, Moderate-Dimensional Inferences on Quadratic Functionals in Ordinary Least Squares, LIC criterion for optimal subset selection in distributed interval estimation, The robust desparsified lasso and the focused information criterion for high-dimensional generalized linear models, Markov Neighborhood Regression for High-Dimensional Inference, IFAA: Robust Association Identification and Inference for Absolute Abundance in Microbiome Analyses, Partitioned Approach for High-dimensional Confidence Intervals with Large Split Sizes, Regularized projection score estimation of treatment effects in high-dimensional quantile regression, Elastic-net Regularized High-dimensional Negative Binomial Regression: Consistency and Weak Signal Detection, Hypothesis testing for high-dimensional multivariate regression with false discovery rate control, Distributed Sufficient Dimension Reduction for Heterogeneous Massive Data, F-test and z-test for high-dimensional regression models with a factor structure, Scalable inference for high-dimensional precision matrix, High-Dimensional Inference for Cluster-Based Graphical Models, Projection-based Inference for High-dimensional Linear Models, Hypothesis Testing in High-Dimensional Instrumental Variables Regression With an Application to Genomics Data, Confidence intervals for high-dimensional Cox models, Inferences in panel data with interactive effects using large covariance matrices, A penalized approach to covariate selection through quantile regression coefficient models, Inference for high dimensional linear models with error-in-variables, Exploiting Disagreement Between High-Dimensional Variable Selectors for Uncertainty Visualization, Conditional Test for Ultrahigh Dimensional Linear Regression Coefficients, Penalized 
expectile regression: an alternative to penalized quantile regression, New Tests for High-Dimensional Linear Regression Based on Random Projection, Statistical Inference for High-Dimensional Models via Recursive Online-Score Estimation, Predictor ranking and false discovery proportion control in high-dimensional regression, Targeted Inference Involving High-Dimensional Data Using Nuisance Penalized Regression, Inference for high-dimensional instrumental variables regression, Debiasing the debiased Lasso with bootstrap, Nearly optimal Bayesian shrinkage for high-dimensional regression, A resampling approach for confidence intervals in linear time-series models after model selection, The costs and benefits of uniformly valid causal inference with high-dimensional nuisance parameters, Double-estimation-friendly inference for high-dimensional misspecified models, Robust machine learning by median-of-means: theory and practice, A two-step method for estimating high-dimensional Gaussian graphical models, Statistical inference via conditional Bayesian posteriors in high-dimensional linear regression, Goodness-of-Fit Tests for High Dimensional Linear Models, Confidence intervals for sparse precision matrix estimation via Lasso penalized D-trace loss, Communication-efficient sparse composite quantile regression for distributed data, A Sparse Random Projection-Based Test for Overall Qualitative Treatment Effects, iFusion: Individualized Fusion Learning, Structural inference in sparse high-dimensional vector autoregressions, \(R\)-optimal designs for trigonometric regression models, Bias corrected regularization kernel method in ranking, UNIFORM INFERENCE IN HIGH-DIMENSIONAL DYNAMIC PANEL DATA MODELS WITH APPROXIMATELY SPARSE FIXED EFFECTS, Test of significance for high-dimensional longitudinal data, Relaxing the assumptions of knockoffs by conditioning, Confidence Intervals for Sparse Penalized Regression With Random Designs, A Bootstrap Lasso + Partial Ridge Method to Construct Confidence Intervals for Parameters in High-dimensional Sparse Linear Models, THE FACTOR-LASSO AND K-STEP BOOTSTRAP APPROACH FOR INFERENCE IN HIGH-DIMENSIONAL ECONOMIC APPLICATIONS, Ill-posed estimation in high-dimensional models with instrumental variables, Comment on “A Tuning-Free Robust and Efficient Approach to High-Dimensional Regression”, Fixed Effects Testing in High-Dimensional Linear Mixed Models, Kernel Meets Sieve: Post-Regularization Confidence Bands for Sparse Additive Model, A High‐dimensional Focused Information Criterion, A statistical mechanics approach to de-biasing and uncertainty estimation in LASSO for random measurements, A Bayesian approach with generalized ridge estimation for high-dimensional regression and testing, On High-Dimensional Constrained Maximum Likelihood Inference, Debiased Inference on Treatment Effect in a High-Dimensional Model, Variable importance assessments and backward variable selection for multi-sample problems, A Sequential Significance Test for Treatment by Covariate Interactions, Variable selection in the Box-Cox power transformation model, Innovated scalable efficient inference for ultra-large graphical models, Inference of large modified Poisson-type graphical models: application to RNA-seq data in childhood atopic asthma studies, Permutation testing in high-dimensional linear models: an empirical investigation, Testing 
for high-dimensional network parameters in auto-regressive models, Variable selection via adaptive false negative control in linear regression, Bootstrapping and sample splitting for high-dimensional, assumption-lean inference, Lasso meets horseshoe: a survey, Distributed simultaneous inference in generalized linear models via confidence distribution, On rank estimators in increasing dimensions, Weak signals in high‐dimensional regression: Detection, estimation and prediction, Optimal designs in sparse linear models, Optimal Estimation of Genetic Relatedness in High-Dimensional Linear Models, Perturbation bootstrap in adaptive Lasso, Valid Post-Selection Inference in High-Dimensional Approximately Sparse Quantile Regression Models, Nonsparse Learning with Latent Variables, On the asymptotic variance of the debiased Lasso, Posterior asymptotic normality for an individual coordinate in high-dimensional linear regression, Spectral analysis of high-dimensional time series, A knockoff filter for high-dimensional selective inference, Inter-Subject Analysis: A Partial Gaussian Graphical Model Approach, Global and Simultaneous Hypothesis Testing for High-Dimensional Logistic Regression Models, Linear Hypothesis Testing in Dense High-Dimensional Linear Models, Bootstrap inference for penalized GMM estimators with oracle properties, High-dimensional statistical inference via DATE, Worst possible sub-directions in high-dimensional models, Covariate-adjusted inference for differential analysis of high-dimensional networks, Statistical inference in sparse high-dimensional additive models, Lasso-driven inference in time and space, Significance testing in non-sparse high-dimensional linear models, Variable selection in high-dimensional linear model with possibly asymmetric errors, Mathematical foundations of machine learning. 
Abstracts from the workshop held March 21--27, 2021 (hybrid meeting), Testability of high-dimensional linear models with nonsparse structures, Inference for low-rank tensors -- no need to debias, Bayesian high-dimensional semi-parametric inference beyond sub-Gaussian errors, Confidence intervals for high-dimensional partially linear single-index models, Recent advances in statistical methodologies in evaluating program for high-dimensional data, On the post selection inference constant under restricted isometry properties, A unified theory of confidence regions and testing for high-dimensional estimating equations, Constructing confidence intervals for the signals in sparse phase retrieval, High-dimensional sufficient dimension reduction through principal projections, De-biasing the Lasso with degrees-of-freedom adjustment, Doubly robust semiparametric inference using regularized calibrated estimation with high-dimensional data, The asymptotic distribution of the MLE in high-dimensional logistic models: arbitrary covariance, Post-model-selection inference in linear regression models: an integrated review, Thresholding tests based on affine Lasso to achieve non-asymptotic nominal level and high power under sparse and dense alternatives in high dimension, Oracle inequalities, variable selection and uniform inference in high-dimensional correlated random effects panel data models, Testing a single regression coefficient in high dimensional linear models, Doubly debiased Lasso: high-dimensional inference under hidden confounding, Ridge regression revisited: debiasing, thresholding and bootstrap, Kernel-penalized regression for analysis of microbiome data, Distributed testing and estimation under sparse high dimensional models, Hierarchical correction of \(p\)-values via an ultrametric tree running Ornstein-Uhlenbeck process, Sparse linear models and \(l_1\)-regularized 2SLS with high-dimensional endogenous regressors and instruments, Single-index composite quantile regression for ultra-high-dimensional data, High-dimensional inference for personalized treatment decision, A unifying framework of high-dimensional sparse estimation with difference-of-convex (DC) regularizations, Gene set priorization guided by regulatory networks with p-values through kernel mixed model, Robust post-selection inference of high-dimensional mean regression with heavy-tailed asymmetric or heteroskedastic errors, Statistical inference for model parameters in stochastic gradient descent, Uniformly valid confidence intervals post-model-selection, Efficient estimation of linear functionals of principal components, Hierarchical inference for genome-wide association studies: a view on methodology with software, R-optimal designs for multi-factor models with heteroscedastic errors, Confidence intervals for the means of the selected populations, High-dimensional simultaneous inference with the bootstrap, Rejoinder on: ``High-dimensional simultaneous inference with the bootstrap, Testing covariates in high dimension linear regression with latent factors, Asymptotically honest confidence regions for high dimensional parameters by the desparsified conservative Lasso, Combinatorial inference for graphical models, Generalized M-estimators for high-dimensional Tobit I models, Robust inference on average treatment effects with possibly more covariates than observations, Additive model selection, ROCKET: robust confidence intervals via Kendall's tau for transelliptical graphical models, Uniformly valid post-regularization 
confidence regions for many functional parameters in z-estimation framework, Debiasing the Lasso: optimal sample size for Gaussian designs, Adaptive estimation of high-dimensional signal-to-noise ratios, Online rules for control of false discovery rate and false discovery exceedance, Regularization and the small-ball method. I: Sparse recovery, Selective inference with a randomized response, Detecting rare and faint signals via thresholding maximum likelihood estimators, Beyond support in two-stage variable selection, Beyond Gaussian approximation: bootstrap for maxima of sums of independent random vectors, Nonparametric inference via bootstrapping the debiased estimator, Group Inference in High Dimensions with Applications to Hierarchical Testing, Time-dependent Poisson reduced rank models for political text data analysis, Solution paths for the generalized Lasso with applications to spatially varying coefficients regression, Rate optimal estimation and confidence intervals for high-dimensional regression with missing covariates, Inference under Fine-Gray competing risks model with high-dimensional covariates, Two-directional simultaneous inference for high-dimensional models, Testing Endogeneity with High Dimensional Covariates, Model-assisted inference for treatment effects using regularized calibrated estimation with high-dimensional data, Lasso Inference for High-Dimensional Time Series, Distribution-Free Predictive Inference For Regression, High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}, Inference in high dimensional linear measurement error models, Exact adaptive confidence intervals for linear regression coefficients, Optimal sparsity testing in linear regression model, Bootstrap based inference for sparse high-dimensional time series models, Inference without compatibility: using exponential weighting for inference on a parameter of a linear model, Model selection with mixed variables on the Lasso path, A Projection Based Conditional Dependence Measure with Applications to High-dimensional Undirected Graphical Models, Efficient estimation of smooth functionals in Gaussian shift models, Flexible and Interpretable Models for Survival Data, Semiparametric efficiency bounds for high-dimensional models, High-dimensional variable selection via low-dimensional adaptive learning, Multicarving for high-dimensional post-selection inference, Honest confidence regions and optimality in high-dimensional precision matrix estimation, The de-biased group Lasso estimation for varying coefficient models, Second-order Stein: SURE for SURE and other applications in high-dimensional inference, The distribution of the Lasso: uniform control over sparse balls and adaptive parameter tuning, An overview of tests on high-dimensional means, A Bayesian-motivated test for high-dimensional linear regression models with fixed design matrix, Augmented minimax linear estimation, Inference for high-dimensional varying-coefficient quantile regression, High-dimensional inference for linear model with correlated errors, In defense of the indefensible: a very naïve approach to high-dimensional inference, Some perspectives on inference in high dimensions, Distributed adaptive Huber regression, A convex programming solution based debiased estimator for quantile with missing response and high-dimensional covariables, Confidence intervals for parameters in high-dimensional sparse vector autoregression, Spatially relaxed inference on high-dimensional linear models, Network 
differential connectivity analysis, Asymptotic normality of robust \(M\)-estimators with convex penalty, Design of c-optimal experiments for high-dimensional linear models


Uses Software


Cites Work