Exact post-selection inference, with application to the Lasso
Publication: Q292865
DOI: 10.1214/15-AOS1371
zbMATH Open: 1341.62061
MaRDI QID: Q292865
Authors: Jason D. Lee, Dennis L. Sun, Yuekai Sun, Jonathan Taylor
Publication date: 9 June 2016
Published in: The Annals of Statistics
Abstract: We develop a general approach to valid inference after model selection. At the core of our framework is a result that characterizes the distribution of a post-selection estimator conditioned on the selection event. We specialize the approach to model selection by the lasso to form valid confidence intervals for the selected coefficients and test whether all relevant variables have been included in the model.
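The core result described in the abstract is that, conditional on the selection event (which for the lasso is a polyhedron {A y <= b}), a linear functional eta^T y of the response follows a Gaussian distribution truncated to an interval that can be computed explicitly. The sketch below illustrates this idea on a deliberately simple toy selection rule, not the full lasso machinery of the paper; the function names and the example event {y_1 >= 1} are illustrative choices, and the formulas follow the polyhedral-conditioning construction described in the paper.

```python
import numpy as np
from scipy.stats import norm

def truncation_interval(A, b, eta, y, Sigma):
    """Given a polyhedral selection event {A y <= b} and a contrast eta,
    return [vlo, vup] such that eta^T y, conditional on the event,
    is a Gaussian truncated to that interval."""
    Sigma_eta = Sigma @ eta
    c = Sigma_eta / (eta @ Sigma_eta)   # direction along which eta^T y moves y
    z = y - c * (eta @ y)               # component of y independent of eta^T y
    Ac, Az = A @ c, A @ z
    resid = b - Az
    # Constraints with Ac < 0 bound eta^T y from below; Ac > 0 from above.
    vlo = np.max(resid[Ac < 0] / Ac[Ac < 0]) if np.any(Ac < 0) else -np.inf
    vup = np.min(resid[Ac > 0] / Ac[Ac > 0]) if np.any(Ac > 0) else np.inf
    return vlo, vup

def truncated_gauss_pvalue(x, mu, sd, vlo, vup):
    """One-sided p-value P(T >= x) for T ~ N(mu, sd^2) truncated to [vlo, vup]."""
    num = norm.cdf((vup - mu) / sd) - norm.cdf((x - mu) / sd)
    den = norm.cdf((vup - mu) / sd) - norm.cdf((vlo - mu) / sd)
    return num / den

# Toy selection event: we only look at the first coordinate if y_1 >= 1,
# written as -y_1 <= -1. Test H0: E[y_1] = 0 conditional on having selected it.
A = np.array([[-1.0, 0.0]])
b = np.array([-1.0])
eta = np.array([1.0, 0.0])
y = np.array([1.5, 0.3])
vlo, vup = truncation_interval(A, b, eta, y, np.eye(2))  # truncation is [1, inf)
p = truncated_gauss_pvalue(eta @ y, 0.0, 1.0, vlo, vup)  # ~0.42, not the naive 0.07
```

The contrast with the naive (unconditional) p-value of about 0.07 for an observation of 1.5 shows why conditioning on selection matters: the selection rule alone pushes eta^T y above 1, so a moderately large observed value is unsurprising.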
Full work available at URL: https://arxiv.org/abs/1311.6238
Mathematics Subject Classification
- Exact distribution theory in statistics (62E15)
- Parametric hypothesis testing (62F03)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
Cites Work
- Least angle regression. (With discussion)
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Regularization and Variable Selection Via the Elastic Net
- Testing Statistical Hypotheses
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Asymptotics for Lasso-type estimators.
- A significance test for the lasso
- Valid post-selection inference
- Model selection and inference: facts and fiction
- Asymptotics of selective inference
- Can one estimate the conditional distribution of post-model-selection estimators?
- Conditional properties of statistical procedures
- The Lasso problem and uniqueness
- The positive false discovery rate: A Bayesian interpretation and the \(q\)-value
- Selective inference in complex research
- Confidence sets based on penalized maximum likelihood estimators in Gaussian regression
- Rejoinder: "A significance test for the lasso"
- Drop-the-losers design: binomial case
- Uniform asymptotic inference and the bootstrap after model selection
- A note on data-splitting for the evaluation of significance levels
- Selection Adjusted Confidence Intervals With More Power to Determine the Sign
- False Discovery Rate–Adjusted Multiple Confidence Intervals for Selected Parameters
Cited In (first 100 items)
- The Lasso on latent indices for regression modeling with ordinal categorical predictors
- Variable Selection in Regression-Based Estimation of Dynamic Treatment Regimes
- Relaxing the assumptions of knockoffs by conditioning
- Conditional calibration for false discovery rate control under dependence
- FANOK: knockoffs in linear time
- The costs and benefits of uniformly valid causal inference with high-dimensional nuisance parameters
- Post hoc confidence bounds on false positives using reference families
- Confidence intervals centred on bootstrap smoothed estimators
- Valid Inference Corrected for Outlier Removal
- Solution paths for the generalized Lasso with applications to spatially varying coefficients regression
- Marginal false discovery rate for a penalized transformation survival model
- On the length of post-model-selection confidence intervals conditional on polyhedral constraints
- Network classification with applications to brain connectomics
- Post-model-selection inference in linear regression models: an integrated review
- Testing shape constraints in Lasso regularized joinpoint regression
- Post-selection inference via algorithmic stability
- A multi-resolution theory for approximating infinite-\(p\)-zero-\(n\): transitional inference, individualized predictions, and a world without bias-variance tradeoff
- Ridge regression revisited: debiasing, thresholding and bootstrap
- Inference without compatibility: using exponential weighting for inference on a parameter of a linear model
- Model selection with mixed variables on the Lasso path
- Inference after estimation of breaks
- Multicarving for high-dimensional post-selection inference
- Statistical methods for replicability assessment
- Prediction regions through inverse regression
- Excess optimism: how biased is the apparent error of an estimator tuned by SURE?
- Integrative methods for post-selection inference under convex constraints
- Predictor ranking and false discovery proportion control in high-dimensional regression
- In defense of the indefensible: a very naïve approach to high-dimensional inference
- Optimal finite sample post-selection confidence distributions in generalized linear models
- Exact statistical inference for the Wasserstein distance by selective inference
- Selective inference after feature selection via multiscale bootstrap
- On rank estimators in increasing dimensions
- Post-regularization inference for time-varying nonparanormal graphical models
- Tuning parameter selection for penalized estimation via \(R^2\)
- Linear hypothesis testing for high dimensional generalized linear models
- Comments on: "Hierarchical inference for genome-wide association studies: a view on methodology with software"
- Inference for high-dimensional varying-coefficient quantile regression
- Some perspectives on inference in high dimensions
- Debiasing the debiased Lasso with bootstrap
- Models as approximations. I. Consequences illustrated with linear regression
- Special feature: Information theory and statistics
- A weak‐signal‐assisted procedure for variable selection and statistical inference with an informative subsample
- Network differential connectivity analysis
- Inference for \(L_2\)-boosting
- Conditional selective inference for robust regression and outlier detection using piecewise-linear homotopy continuation
- Selection-corrected statistical inference for region detection with high-throughput assays
- Inference for low‐ and high‐dimensional inhomogeneous Gibbs point processes
- A regularization-based adaptive test for high-dimensional GLMs
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- Inference After Model Selection
- Mathematical foundations of machine learning. Abstracts from the workshop held March 21--27, 2021 (hybrid meeting)
- A scalable surrogate \(L_0\) sparse regression method for generalized linear models with applications to large scale data
- Data shared Lasso: a novel tool to discover uplift
- Selective inference for latent block models
- Selective inference with a randomized response
- Distribution-free predictive inference for regression
- Uniformly valid confidence intervals post-model-selection
- Asymptotics of selective inference
- Lasso guarantees for \(\beta \)-mixing heavy-tailed time series
- Testing regression coefficients after model selection through sign restrictions
- Valid post-selection inference in high-dimensional approximately sparse quantile regression models
- Projection-based Inference for High-dimensional Linear Models
- Bootstrapping and sample splitting for high-dimensional, assumption-lean inference
- Inference for High-Dimensional Censored Quantile Regression
- A knockoff filter for high-dimensional selective inference
- Discussion on: "A Scale-Free Approach for False Discovery Rate Control in Generalized Linear Models" by Dai, Lin, Xing, Liu
- Exact post-selection inference for the generalized Lasso path
- On the distribution, model selection properties and uniqueness of the Lasso estimator in low and high dimensions
- Selective inference for additive and linear mixed models
- On the post selection inference constant under restricted isometry properties
- Post-selection inference for \(\ell_1\)-penalized likelihood models
- Testing for high-dimensional network parameters in auto-regressive models
- Selective inference with unknown variance via the square-root Lasso
- Post-selection inference following aggregate level hypothesis testing in large-scale genomic data
- Selective inference after likelihood- or test-based model selection in linear models
- Uniformly valid confidence sets based on the Lasso
- Confidence intervals for the means of the selected populations
- Statistical learning and selective inference
- Confidence intervals for sparse penalized regression with random designs
- Demystifying the bias from selective inference: a revisit to Dawid's treatment selection problem
- AIC for the Lasso in generalized linear models
- Exponentially tilted likelihood inference on growing dimensional unconditional moment models
- Degrees of freedom for piecewise Lipschitz estimators
- Exploiting Disagreement Between High-Dimensional Variable Selectors for Uncertainty Visualization
- Weak signal identification and inference in penalized model selection
- Scalable methods for Bayesian selective inference
- Regularized matrix-variate logistic regression with response subject to misclassification
- Post-selection point and interval estimation of signal sizes in Gaussian samples
- Uniformly valid inference based on the Lasso in linear mixed models
- Conditional predictive inference post model selection
- Valid post-selection inference in model-free linear regression
- Uniform asymptotic inference and the bootstrap after model selection
- Testing the simplifying assumption in high-dimensional vine copulas
- Valid confidence intervals for post-model-selection predictors
- Least-Square Approximation for a Distributed System
- A penalized approach to covariate selection through quantile regression coefficient models