Exact post-selection inference, with application to the Lasso



DOI: 10.1214/15-AOS1371
zbMath: 1341.62061
arXiv: 1311.6238
MaRDI QID: Q292865

Jason D. Lee, Dennis L. Sun, Yuekai Sun, Jonathan E. Taylor

Publication date: 9 June 2016

Published in: The Annals of Statistics

Full work available at URL: https://arxiv.org/abs/1311.6238



Related Items

Partitioned Approach for High-dimensional Confidence Intervals with Large Split Sizes
Regularized projection score estimation of treatment effects in high-dimensional quantile regression
A scalable surrogate \(L_0\) sparse regression method for generalized linear models with applications to large scale data
Mathematical foundations of machine learning. Abstracts from the workshop held March 21--27, 2021 (hybrid meeting)
On the post selection inference constant under restricted isometry properties
Post-model-selection inference in linear regression models: an integrated review
High-Dimensional Inference for Cluster-Based Graphical Models
Assumption Lean Regression
Demystifying the bias from selective inference: a revisit to Dawid's treatment selection problem
Regression analysis for microbiome compositional data
AIC for the Lasso in generalized linear models
Selective inference after likelihood- or test-based model selection in linear models
Ridge regression revisited: debiasing, thresholding and bootstrap
Least-Square Approximation for a Distributed System
Projection-based Inference for High-dimensional Linear Models
Uniform asymptotic inference and the bootstrap after model selection
Degrees of freedom for piecewise Lipschitz estimators
Scalable methods for Bayesian selective inference
Data shared Lasso: a novel tool to discover uplift
A penalized approach to covariate selection through quantile regression coefficient models
An improved algorithm for high-dimensional continuous threshold expectile model with variance heterogeneity
Exploiting Disagreement Between High-Dimensional Variable Selectors for Uncertainty Visualization
Uniformly valid confidence intervals post-model-selection
Conditional Test for Ultrahigh Dimensional Linear Regression Coefficients
On the distribution, model selection properties and uniqueness of the Lasso estimator in low and high dimensions
Post-selection inference of generalized linear models based on the lasso and the elastic net
Exponentially tilted likelihood inference on growing dimensional unconditional moment models
Predictor ranking and false discovery proportion control in high-dimensional regression
Optimal finite sample post-selection confidence distributions in generalized linear models
The LASSO on latent indices for regression modeling with ordinal categorical predictors
Confidence intervals for the means of the selected populations
Selective inference after feature selection via multiscale bootstrap
Exact statistical inference for the Wasserstein distance by selective inference. Selective inference for the Wasserstein distance
Debiasing the debiased Lasso with bootstrap
Models as approximations. I. Consequences illustrated with linear regression
The costs and benefits of uniformly valid causal inference with high-dimensional nuisance parameters
Statistical proof? The problem of irreproducibility
Rejoinder on: "High-dimensional simultaneous inference with the bootstrap"
Lasso guarantees for \(\beta \)-mixing heavy-tailed time series
Post hoc confidence bounds on false positives using reference families
Valid post-selection inference in model-free linear regression
Relaxing the assumptions of knockoffs by conditioning
A Large-Scale Constrained Joint Modeling Approach for Predicting User Activity, Engagement, and Churn With Application to Freemium Mobile Games
Confidence Intervals for Sparse Penalized Regression With Random Designs
A Bootstrap Lasso + Partial Ridge Method to Construct Confidence Intervals for Parameters in High-dimensional Sparse Linear Models
Robust regression: an inferential method for determining which independent variables are most important
FANOK: Knockoffs in Linear Time
Marginal false discovery rate for a penalized transformation survival model
Exact post-selection inference for the generalized Lasso path
Efficient least angle regression for identification of linear-in-the-parameters models
ROCKET: robust confidence intervals via Kendall's tau for transelliptical graphical models
Selective inference with a randomized response
Uniformly valid confidence sets based on the Lasso
Solution paths for the generalized Lasso with applications to spatially varying coefficients regression
Two-directional simultaneous inference for high-dimensional models
Testing the simplifying assumption in high-dimensional vine copulas
Lasso Inference for High-Dimensional Time Series
Distribution-Free Predictive Inference For Regression
Network classification with applications to brain connectomics
Testing for high-dimensional network parameters in auto-regressive models
High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
A General Framework for Estimation and Inference From Clusters of Features
Bootstrapping and sample splitting for high-dimensional, assumption-lean inference
Exact adaptive confidence intervals for linear regression coefficients
An Automated Approach Towards Sparse Single-Equation Cointegration Modelling
Inference without compatibility: using exponential weighting for inference on a parameter of a linear model
Model selection with mixed variables on the Lasso path
Flexible and Interpretable Models for Survival Data
Inference after estimation of breaks
Spatial Variable Selection and An Application to Virginia Lyme Disease Emergence
On rank estimators in increasing dimensions
Statistical methods for replicability assessment
Multicarving for high-dimensional post-selection inference
Selective inference for latent block models
Inference for \(L_2\)-boosting
Special feature: Information theory and statistics
Selective inference via marginal screening for high dimensional classification
Integrative methods for post-selection inference under convex constraints
Valid Inference Corrected for Outlier Removal
Regularized matrix-variate logistic regression with response subject to misclassification
Excess Optimism: How Biased is the Apparent Error of an Estimator Tuned by SURE?
Confidence intervals centred on bootstrap smoothed estimators
Selective inference for additive and linear mixed models
Inference for high-dimensional varying-coefficient quantile regression
In defense of the indefensible: a very naïve approach to high-dimensional inference
Some perspectives on inference in high dimensions
Network differential connectivity analysis
A knockoff filter for high-dimensional selective inference
Linear hypothesis testing for high dimensional generalized linear models
On the Length of Post-Model-Selection Confidence Intervals Conditional on Polyhedral Constraints
Conditional selective inference for robust regression and outlier detection using piecewise-linear homotopy continuation
Post-Selection Inference Following Aggregate Level Hypothesis Testing in Large-Scale Genomic Data
Conditional calibration for false discovery rate control under dependence
Markov Neighborhood Regression for High-Dimensional Inference
Testing Shape Constraints in Lasso Regularized Joinpoint Regression
Statistical Inference for High-Dimensional Models via Recursive Online-Score Estimation
Integrative Bayesian Models Using Post-Selective Inference: A Case Study in Radiogenomics
Variable Selection in Regression-Based Estimation of Dynamic Treatment Regimes
Cellwise outlier detection with false discovery rate control
Lasso regularization within the LocalGLMnet architecture
Automatic bias correction for testing in high-dimensional linear models
Score Tests With Incomplete Covariates and High-Dimensional Auxiliary Variables
Inference for low- and high-dimensional inhomogeneous Gibbs point processes
Uniformly valid inference based on the Lasso in linear mixed models
A weak-signal-assisted procedure for variable selection and statistical inference with an informative subsample
Discussion on: "A Scale-Free Approach for False Discovery Rate Control in Generalized Linear Models" by Dai, Lin, Zing, Liu
Debiased lasso for generalized linear models with a diverging number of covariates
More Powerful Selective Inference for the Graph Fused Lasso
Double bias correction for high-dimensional sparse additive hazards regression with covariate measurement errors
Controlling False Discovery Rate Using Gaussian Mirrors
A Normality Test for High-dimensional Data Based on the Nearest Neighbor Approach
Scalable and efficient inference via CPE
Tuning parameter selection for penalized estimation via \(R^2\)
Selective inference for clustering with unknown variance
False Discovery Rate Control via Data Splitting
Distributionally robust and generalizable inference
Selective Inference for Hierarchical Clustering
Inference for High-Dimensional Censored Quantile Regression
Derandomizing Knockoffs
Neighborhood-based cross fitting approach to treatment effects with high-dimensional data
Post-selection inference via algorithmic stability
Carving model-free inference
Approximate Selective Inference via Maximum Likelihood
Selection-Corrected Statistical Inference for Region Detection With High-Throughput Assays
Comments on: "High-dimensional simultaneous inference with the bootstrap"
A Multi-resolution Theory for Approximating Infinite-p-Zero-n: Transitional Inference, Individualized Predictions, and a World Without Bias-Variance Tradeoff
Comments on: "Hierarchical inference for genome-wide association studies: a view on methodology with software"
Statistical Inference Enables Bad Science; Statistical Thinking Enables Good Science
High-dimensional statistical inference via DATE


Uses Software


Cites Work