Valid post-selection inference


DOI: 10.1214/12-AOS1077 · zbMath: 1267.62080 · arXiv: 1306.1059 · OpenAlex: W2009462809 · MaRDI QID: Q355109

Kai Zhang, Andreas Buja, Linda Zhao, Lawrence D. Brown, Richard A. Berk

Publication date: 24 July 2013

Published in: The Annals of Statistics

Full work available at URL: https://arxiv.org/abs/1306.1059



Related Items

Inference for low‐ and high‐dimensional inhomogeneous Gibbs point processes
Uniformly valid inference based on the Lasso in linear mixed models
Mixed-effect models with trees
Filtering the Rejection Set While Preserving False Discovery Rate Control
Controlling False Discovery Rate Using Gaussian Mirrors
A Normality Test for High-dimensional Data Based on the Nearest Neighbor Approach
Scalable and efficient inference via CPE
Kernel Ordinary Differential Equations
Heterogeneous heterogeneity by default: Testing categorical moderators in mixed‐effects meta‐analysis
False Discovery Rate Control via Data Splitting
Distributionally robust and generalizable inference
An evolutionary estimation procedure for generalized semilinear regression trees
Penalized estimation of a class of single‐index varying‐coefficient models for integrative genomic analysis
Bounds in \(L^1\) Wasserstein distance on the normal approximation of general M-estimators
Forward-selected panel data approach for program evaluation
Inference for High-Dimensional Censored Quantile Regression
Variable Selection for Global Fréchet Regression
Neighborhood-based cross fitting approach to treatment effects with high-dimensional data
Empirical likelihood based tests for detecting the presence of significant predictors in marginal quantile regression
Post-selection inference via algorithmic stability
Approximate Selective Inference via Maximum Likelihood
Conformal Prediction Credibility Intervals
The robust desparsified lasso and the focused information criterion for high-dimensional generalized linear models
On the impact of model selection on predictor identification and parameter inference
Markov Neighborhood Regression for High-Dimensional Inference
Optimal configurations of lines and a statistical application
Regularized projection score estimation of treatment effects in high-dimensional quantile regression
On the post selection inference constant under restricted isometry properties
Exact post-selection inference, with application to the Lasso
Bayesian Inference Is Unaffected by Selection: Fact or Fiction?
Post-model-selection inference in linear regression models: an integrated review
Assumption Lean Regression
Selective inference after likelihood- or test-based model selection in linear models
Least-Square Approximation for a Distributed System
Projection-based Inference for High-dimensional Linear Models
Uniform asymptotic inference and the bootstrap after model selection
Estimation and Inference of Heterogeneous Treatment Effects using Random Forests
Testing for Neglected Nonlinearity Using Regularized Artificial Neural Networks
Estimation of selected parameters
Valid post-selection inference
Rates of convergence of the adaptive LASSO estimators to the oracle distribution and higher order refinements by the bootstrap
Uniformly valid confidence intervals post-model-selection
Post-selection inference of generalized linear models based on the lasso and the elastic net
Constraints versus Priors
Rejoinder on: "Hierarchical inference for genome-wide association studies: a view on methodology with software"
Controlling the false discovery rate via knockoffs
Targeted Inference Involving High-Dimensional Data Using Nuisance Penalized Regression
Optimal finite sample post-selection confidence distributions in generalized linear models
Integrative Bayesian Models Using Post-Selective Inference: A Case Study in Radiogenomics
Cellwise outlier detection with false discovery rate control
A structured brain‐wide and genome‐wide association study using ADNI PET images
A nonparametric sequential learning procedure for estimating the pure premium
Unlucky Number 13? Manipulating Evidence Subject to Snooping
Score Tests With Incomplete Covariates and High-Dimensional Auxiliary Variables
Selective inference after feature selection via multiscale bootstrap
Focused model selection for linear mixed models with an application to whale ecology
Models as approximations. I. Consequences illustrated with linear regression
Models as approximations. II. A model-free theory of parametric regression
Larry Brown's contributions to parametric inference, decision theory and foundations: a survey
Statistical theory powering data science
The costs and benefits of uniformly valid causal inference with high-dimensional nuisance parameters
Statistical proof? The problem of irreproducibility
Rejoinder on: "High-dimensional simultaneous inference with the bootstrap"
Post hoc confidence bounds on false positives using reference families
Statistical learning and selective inference
SLOPE-adaptive variable selection via convex optimization
A simulation based method for assessing the statistical significance of logistic regression models after common variable selection procedures
Asymptotically honest confidence regions for high dimensional parameters by the desparsified conservative Lasso
High-dimensional CLT: improvements, non-uniform extensions and large deviations
Valid post-selection inference in model-free linear regression
A Bootstrap Lasso + Partial Ridge Method to Construct Confidence Intervals for Parameters in High-dimensional Sparse Linear Models
Penalized likelihood and multiple testing
On the least-squares model averaging interval estimator
Robust inference on average treatment effects with possibly more covariates than observations
FANOK: Knockoffs in Linear Time
Likelihood Ratio Test in Multivariate Linear Regression: from Low to High Dimension
Weighted-average least squares estimation of generalized linear models
Exact post-selection inference for the generalized Lasso path
Logistic regression: from art to science
Uniformly valid confidence sets based on the Lasso
Simultaneous high-probability bounds on the false discovery proportion in structured, regression and online settings
On asymptotically optimal confidence regions and tests for high-dimensional models
Lasso Inference for High-Dimensional Time Series
Distribution-Free Predictive Inference For Regression
Confidence intervals for high-dimensional inverse covariance estimation
High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
Bootstrapping and sample splitting for high-dimensional, assumption-lean inference
An Automated Approach Towards Sparse Single-Equation Cointegration Modelling
Only closed testing procedures are admissible for controlling false discovery proportions
Spatial Variable Selection and An Application to Virginia Lyme Disease Emergence
Multicarving for high-dimensional post-selection inference
Selective inference for latent block models
Sparse estimation of Cox proportional hazards models via approximated information criteria
Inference for \(L_2\)-boosting
Exploration of the variability of variable selection based on distances between bootstrap sample results
Selective inference via marginal screening for high dimensional classification
Bayesian Semiparametric Functional Mixed Models for Serially Correlated Functional Data, With Application to Glaucoma Data
Excess Optimism: How Biased is the Apparent Error of an Estimator Tuned by SURE?
Selective inference for additive and linear mixed models
Informative goodness-of-fit for multivariate distributions
In defense of the indefensible: a very naïve approach to high-dimensional inference
Some perspectives on inference in high dimensions
Confidence intervals for parameters in high-dimensional sparse vector autoregression
Selection-Corrected Statistical Inference for Region Detection With High-Throughput Assays
Spatially relaxed inference on high-dimensional linear models
A knockoff filter for high-dimensional selective inference
Log-Linear Bayesian Additive Regression Trees for Multinomial Logistic and Count Regression Models
On the Length of Post-Model-Selection Confidence Intervals Conditional on Polyhedral Constraints
On Hodges' superefficiency and merits of oracle property in model selection
Frequentist model averaging in structural equation modelling
Conditional selective inference for robust regression and outlier detection using piecewise-linear homotopy continuation
A Multi-resolution Theory for Approximating Infinite-p-Zero-n: Transitional Inference, Individualized Predictions, and a World Without Bias-Variance Tradeoff
Robust Q-Learning
Confidence Sets Based on Thresholding Estimators in High-Dimensional Gaussian Regression Models
Optimal model averaging for divergent-dimensional Poisson regressions
The Perils of Balance Testing in Experimental Design: Messy Analyses of Clean Data
Statistical Inference Enables Bad Science; Statistical Thinking Enables Good Science
Selection of mixed copula for association modeling with tied observations
On various confidence intervals post-model-selection
High-dimensional statistical inference via DATE


Uses Software


Cites Work