Exact post-selection inference, with application to the Lasso
From MaRDI portal
Publication: 292865
Abstract: We develop a general approach to valid inference after model selection. At the core of our framework is a result that characterizes the distribution of a post-selection estimator conditioned on the selection event. We specialize the approach to model selection by the lasso to form valid confidence intervals for the selected coefficients and test whether all relevant variables have been included in the model.
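The construction the abstract summarizes can be sketched on synthetic data: for the lasso with a fixed penalty, conditioning on the selected active set and signs yields a polyhedral event \(\{Ay \le b\}\), and inference on a selected coefficient reduces to a truncated normal. The sketch below is illustrative, not the authors' code; the design matrix, the fixed penalty \(\lambda = 2\), and the known noise level \(\sigma = 1\) are assumptions made for the example.

```python
import numpy as np
from scipy.stats import truncnorm
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, sigma, lam = 100, 10, 1.0, 2.0
X = rng.standard_normal((n, p))
X /= np.linalg.norm(X, axis=0)            # unit-norm columns
beta = np.zeros(p)
beta[:2] = 5.0
y = X @ beta + sigma * rng.standard_normal(n)

# sklearn minimizes (1/(2n))||y - Xw||^2 + alpha*||w||_1, so alpha = lam/n
fit = Lasso(alpha=lam / n, fit_intercept=False, tol=1e-12,
            max_iter=50_000).fit(X, y)
M = np.flatnonzero(fit.coef_)             # selected active set
s = np.sign(fit.coef_[M])                 # selected signs

XM, Xm = X[:, M], np.delete(X, M, axis=1)
XMi = np.linalg.inv(XM.T @ XM)
PM = XM @ XMi @ XM.T                      # projection onto active columns

# Selection event {A y <= b}: inactive KKT bounds plus active sign conditions
A0 = Xm.T @ (np.eye(n) - PM) / lam
u = Xm.T @ XM @ XMi @ s
A = np.vstack([A0, -A0, -np.diag(s) @ XMi @ XM.T])
b = np.concatenate([1 - u, 1 + u, -lam * np.diag(s) @ XMi @ s])

# Truncated-normal test of H0: eta'mu = 0 for the first selected coefficient
eta = XM @ XMi[:, 0]
obs, sd = eta @ y, sigma * np.linalg.norm(eta)
c = eta / (eta @ eta)                     # Sigma = sigma^2 * I here
resid, Ac = A @ (y - c * obs), A @ c
neg, pos = Ac < -1e-10, Ac > 1e-10        # rows that bound eta'y from below/above
vlo = np.max((b[neg] - resid[neg]) / Ac[neg], initial=-np.inf)
vup = np.min((b[pos] - resid[pos]) / Ac[pos], initial=np.inf)
F = truncnorm.cdf(obs / sd, vlo / sd, vup / sd)
pval = 2 * min(F, 1 - F)                  # selective two-sided p-value
print("selected:", M, " p-value:", round(float(pval), 4))
```

Because the truncation limits depend on the data only through the part of \(y\) orthogonal to \(\eta\), the resulting p-value is exactly uniform under the null conditional on the selection event, which is the key property the paper establishes.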
Cites work
- Scientific article (zbMATH DE number 3117973; title unavailable)
- Scientific article (zbMATH DE number 1906319; title unavailable)
- Scientific article (zbMATH DE number 845714; title unavailable)
- A note on data-splitting for the evaluation of significance levels
- A significance test for the lasso
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Asymptotics for Lasso-type estimators
- Asymptotics of selective inference
- Can one estimate the conditional distribution of post-model-selection estimators?
- Conditional properties of statistical procedures
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Confidence sets based on penalized maximum likelihood estimators in Gaussian regression
- Drop-the-losers design: binomial case
- False Discovery Rate–Adjusted Multiple Confidence Intervals for Selected Parameters
- Least angle regression. (With discussion)
- Model selection and inference: facts and fiction
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Regularization and Variable Selection Via the Elastic Net
- Rejoinder: "A significance test for the lasso"
- Selection Adjusted Confidence Intervals With More Power to Determine the Sign
- Selective inference in complex research
- Testing Statistical Hypotheses
- The Lasso problem and uniqueness
- The positive false discovery rate: A Bayesian interpretation and the \(q\)-value
- Uniform asymptotic inference and the bootstrap after model selection
- Valid post-selection inference
Cited in (only showing first 100 items)
- Testing shape constraints in Lasso regularized joinpoint regression
- Special feature: Information theory and statistics
- Comments on: "Hierarchical inference for genome-wide association studies: a view on methodology with software"
- The Lasso on latent indices for regression modeling with ordinal categorical predictors
- Inference without compatibility: using exponential weighting for inference on a parameter of a linear model
- Model selection with mixed variables on the Lasso path
- In defense of the indefensible: a very naïve approach to high-dimensional inference
- Inference after estimation of breaks
- Multicarving for high-dimensional post-selection inference
- Statistical methods for replicability assessment
- Network classification with applications to brain connectomics
- Excess optimism: how biased is the apparent error of an estimator tuned by SURE?
- Conditional selective inference for robust regression and outlier detection using piecewise-linear homotopy continuation
- A weak‐signal‐assisted procedure for variable selection and statistical inference with an informative subsample
- Post-regularization inference for time-varying nonparanormal graphical models
- Integrative methods for post-selection inference under convex constraints
- Tuning parameter selection for penalized estimation via \(R^2\)
- Predictor ranking and false discovery proportion control in high-dimensional regression
- Network differential connectivity analysis
- Linear hypothesis testing for high dimensional generalized linear models
- Post-model-selection inference in linear regression models: an integrated review
- Post-selection inference via algorithmic stability
- Conditional calibration for false discovery rate control under dependence
- A multi-resolution theory for approximating infinite-\(p\)-zero-\(n\): transitional inference, individualized predictions, and a world without bias-variance tradeoff
- The costs and benefits of uniformly valid causal inference with high-dimensional nuisance parameters
- Exact statistical inference for the Wasserstein distance by selective inference
- Selective inference after feature selection via multiscale bootstrap
- Scientific article (zbMATH DE number 7750675; title unavailable)
- Debiasing the debiased Lasso with bootstrap
- Inference for high-dimensional varying-coefficient quantile regression
- Inference for \(L_2\)-boosting
- Some perspectives on inference in high dimensions
- Selection-corrected statistical inference for region detection with high-throughput assays
- Post hoc confidence bounds on false positives using reference families
- Marginal false discovery rate for a penalized transformation survival model
- Models as approximations. I. Consequences illustrated with linear regression
- Relaxing the assumptions of knockoffs by conditioning
- Variable Selection in Regression-Based Estimation of Dynamic Treatment Regimes
- On rank estimators in increasing dimensions
- FANOK: knockoffs in linear time
- On the length of post-model-selection confidence intervals conditional on polyhedral constraints
- Optimal finite sample post-selection confidence distributions in generalized linear models
- Confidence intervals centred on bootstrap smoothed estimators
- Solution paths for the generalized Lasso with applications to spatially varying coefficients regression
- Valid Inference Corrected for Outlier Removal
- Ridge regression revisited: debiasing, thresholding and bootstrap
- Prediction regions through inverse regression
- A penalized approach to covariate selection through quantile regression coefficient models
- Post-selection inference following aggregate level hypothesis testing in large-scale genomic data
- Asymptotics of selective inference
- Uniform asymptotic inference and the bootstrap after model selection
- Valid post-selection inference in model-free linear regression
- A bootstrap Lasso+partial ridge method to construct confidence intervals for parameters in high-dimensional sparse linear models
- Uniformly valid confidence sets based on the Lasso
- Lasso guarantees for \(\beta \)-mixing heavy-tailed time series
- Scalable methods for Bayesian selective inference
- Statistical learning and selective inference
- A scalable surrogate \(L_0\) sparse regression method for generalized linear models with applications to large scale data
- Selective inference for additive and linear mixed models
- Rejoinder on: "High-dimensional simultaneous inference with the bootstrap"
- Inference for High-Dimensional Censored Quantile Regression
- Selective inference with a randomized response
- On the post selection inference constant under restricted isometry properties
- Exploiting Disagreement Between High-Dimensional Variable Selectors for Uncertainty Visualization
- Uniformly valid inference based on the Lasso in linear mixed models
- Exact adaptive confidence intervals for linear regression coefficients
- Comments on: "High-dimensional simultaneous inference with the bootstrap"
- Partitioned Approach for High-dimensional Confidence Intervals with Large Split Sizes
- A knockoff filter for high-dimensional selective inference
- Regularized matrix-variate logistic regression with response subject to misclassification
- An automated approach towards sparse single-equation cointegration modelling
- Exponentially tilted likelihood inference on growing dimensional unconditional moment models
- Inference for low‐ and high‐dimensional inhomogeneous Gibbs point processes
- Flexible and Interpretable Models for Survival Data
- A regularization-based adaptive test for high-dimensional GLMs
- Mathematical foundations of machine learning. Abstracts from the workshop held March 21--27, 2021 (hybrid meeting)
- Post-selection inference for \(\ell_1\)-penalized likelihood models
- Degrees of freedom for piecewise Lipschitz estimators
- ROCKET: robust confidence intervals via Kendall's tau for transelliptical graphical models
- Confidence intervals for sparse penalized regression with random designs
- Conditional predictive inference post model selection
- Distribution-free predictive inference for regression
- Derandomizing Knockoffs
- Valid post-selection inference
- Two-directional simultaneous inference for high-dimensional models
- Post-selection point and interval estimation of signal sizes in Gaussian samples
- Testing regression coefficients after model selection through sign restrictions
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- Lasso Inference for High-Dimensional Time Series
- Confidence intervals for high-dimensional Cox models
- Testing the simplifying assumption in high-dimensional vine copulas
- Discussion on: "A Scale-Free Approach for False Discovery Rate Control in Generalized Linear Models" by Dai, Lin, Xing, Liu
- Regression analysis for microbiome compositional data
- Demystifying the bias from selective inference: a revisit to Dawid's treatment selection problem
- AIC for the Lasso in generalized linear models
- Scientific article (zbMATH DE number 7370575; title unavailable)
- Scientific article (zbMATH DE number 7370644; title unavailable)
- Weak signal identification and inference in penalized model selection
- Exact post-selection inference for the generalized Lasso path
- Valid post-selection inference in high-dimensional approximately sparse quantile regression models