Post-selection inference for \(\ell_1\)-penalized likelihood models
Publication: 4960907
DOI: 10.1002/CJS.11313
zbMATH Open: 1466.62372
arXiv: 1602.07358
OpenAlex: W2962877661
Wikidata: Q91049491 (Scholia: Q91049491)
MaRDI QID: Q4960907
FDO: Q4960907
Authors: Jonathan Taylor, Robert Tibshirani
Publication date: 24 April 2020
Published in: The Canadian Journal of Statistics
Abstract: We present a new method for post-selection inference for \(\ell_1\) (lasso)-penalized likelihood models, including generalized regression models. Our approach generalizes the post-selection framework presented in Lee et al. (2014). The method provides p-values and confidence intervals that are asymptotically valid, conditional on the inherent selection done by the lasso. We present applications of this work to (regularized) logistic regression, Cox's proportional hazards model and the graphical lasso.
Full work available at URL: https://arxiv.org/abs/1602.07358
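The abstract's reference to conditioning on "the inherent selection done by the lasso" follows the polyhedral approach of Lee et al. (2014), which this paper generalizes beyond the Gaussian linear model. A minimal sketch of that linear-model special case is given below: the lasso's active set and signs define a polyhedron \(\{y : Ay \le b\}\), and a coefficient's conditional distribution is a truncated normal on an interval computed from that polyhedron. All function names here are illustrative, and the simple coordinate-descent solver and simulated data are assumptions for the sake of a self-contained example; this is not the paper's own generalized-likelihood procedure.

```python
import math
import numpy as np

def norm_cdf(x):
    """Standard normal CDF via the error function (no SciPy dependency)."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def lasso_cd(X, y, lam, n_iter=500):
    """Coordinate descent for (1/2)||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]        # partial residual
            rho = X[:, j] @ r
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return b

def selection_polyhedron(X, y, lam, beta, tol=1e-6):
    """Lee et al. (2014) polyhedron {y : A y <= b} encoding the lasso's
    active set E and sign vector s at penalty lam (Gaussian linear model)."""
    E = np.abs(beta) > tol
    s = np.sign(beta[E])
    XE, XmE = X[:, E], X[:, ~E]
    XEpinv = np.linalg.pinv(XE)                   # (X_E^T X_E)^{-1} X_E^T
    PE = XE @ XEpinv                              # projection onto col(X_E)
    M = np.linalg.inv(XE.T @ XE)
    # inactive KKT: |X_{-E}^T (I - P_E) y / lam + X_{-E}^T (X_E^+)^T s| <= 1
    D = XmE.T @ (np.eye(len(y)) - PE) / lam
    u = XmE.T @ XEpinv.T @ s
    # sign constraints: diag(s) (X_E^+ y - lam * M s) > 0
    A = np.vstack([D, -D, -np.diag(s) @ XEpinv])
    b = np.concatenate([1 - u, 1 + u, -lam * np.diag(s) @ M @ s])
    return A, b, E

def truncated_interval(A, b, eta, y):
    """Truncation limits [vlo, vhi] for eta^T y given A y <= b (Sigma = I)."""
    c = eta / (eta @ eta)
    z = y - c * (eta @ y)                         # part of y orthogonal to eta
    Ac, rhs = A @ c, b - A @ z
    vlo = np.max((rhs / Ac)[Ac < 0]) if np.any(Ac < 0) else -np.inf
    vhi = np.min((rhs / Ac)[Ac > 0]) if np.any(Ac > 0) else np.inf
    return vlo, vhi

def selective_pvalue(X, y, lam, sigma, j_in_E=0):
    """Two-sided truncated-normal p-value for the j-th selected coefficient."""
    beta = lasso_cd(X, y, lam)
    A, b, E = selection_polyhedron(X, y, lam, beta)
    eta = np.linalg.pinv(X[:, E])[j_in_E]         # eta^T y = (X_E^+ y)_j
    vlo, vhi = truncated_interval(A, b, eta, y)
    t, tau = eta @ y, sigma * np.linalg.norm(eta)
    num = norm_cdf(vhi / tau) - norm_cdf(t / tau)
    den = norm_cdf(vhi / tau) - norm_cdf(vlo / tau)
    p1 = num / den                                # one-sided conditional p-value
    return min(1.0, 2 * min(p1, 1 - p1)), (vlo, vhi)

# Demo on simulated data (illustration only; lam and sigma are assumptions).
rng = np.random.default_rng(0)
n, p, sigma = 50, 10, 1.0
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:2] = 5.0
y = X @ beta_true + sigma * rng.standard_normal(n)
pval, (vlo, vhi) = selective_pvalue(X, y, lam=10.0, sigma=sigma)
```

The key point the abstract builds on: because the selection event is exactly a polyhedron in \(y\), the conditional law of \(\eta^\top y\) is a known truncated normal, so inference needs no resampling. The paper extends this idea to non-Gaussian likelihoods (logistic, Cox, graphical lasso), where the selection event is no longer exactly polyhedral and validity holds asymptotically.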
Recommendations
- Exact post-selection inference, with application to the Lasso
- Valid post-selection inference
- Exact post-selection inference for the generalized Lasso path
- Selective inference after likelihood- or test-based model selection in linear models
- Selective inference with unknown variance via the square-root Lasso
MSC classification
- 62N01 Censored data models
- 62J07 Ridge regression; shrinkage estimators (Lasso)
- 62J12 Generalized linear models (logistic models)
Cited In (39)
- Inference for low‐ and high‐dimensional inhomogeneous Gibbs point processes
- A Scale-Free Approach for False Discovery Rate Control in Generalized Linear Models
- Post-selection inference of generalized linear models based on the lasso and the elastic net
- Tuning-free ridge estimators for high-dimensional generalized linear models
- Confidently Comparing Estimates with the c-value
- Uniformly valid confidence intervals post-model-selection
- Exact post-selection inference for adjusted R squared selection
- Exact post-selection inference for the generalized Lasso path
- Simultaneous selection and inference for varying coefficients with zero regions: a soft-thresholding approach
- On the post selection inference constant under restricted isometry properties
- On the length of post-model-selection confidence intervals conditional on polyhedral constraints
- A two-step method for estimating high-dimensional Gaussian graphical models
- Selective inference with unknown variance via the square-root Lasso
- Post-model-selection inference in linear regression models: an integrated review
- Projection-based techniques for high-dimensional optimal transport problems
- On the impact of model selection on predictor identification and parameter inference
- Selective inference after likelihood- or test-based model selection in linear models
- Efficient estimation of the maximal association between multiple predictors and a survival outcome
- Post-selection estimation and testing following aggregate association tests
- Model selection in validation sampling: an asymptotic likelihood-based LASSO approach
- Transformation Models in High Dimensions
- Interpretability of bi-level variable selection methods
- Optimized variable selection via repeated data splitting
- Multicarving for high-dimensional post-selection inference
- Simultaneous spatial smoothing and outlier detection using penalized regression, with application to childhood obesity surveillance from electronic health records
- Controlling False Discovery Rate Using Gaussian Mirrors
- Score Tests With Incomplete Covariates and High-Dimensional Auxiliary Variables
- Improved estimators for semi-supervised high-dimensional regression model
- High-dimensional confounding adjustment using continuous Spike and Slab priors
- Integrative methods for post-selection inference under convex constraints
- Optimal finite sample post-selection confidence distributions in generalized linear models
- Exact post-selection inference, with application to the Lasso
- Modern approaches for evaluating treatment effect heterogeneity from clinical trials and observational data
- Confidence intervals for high-dimensional Cox models
- A Normality Test for High-dimensional Data Based on the Nearest Neighbor Approach
- A (tight) upper bound for the length of confidence intervals with conditional coverage
- Forward stability and model path selection
- Selective inference via marginal screening for high dimensional classification
- Statistical proof? The problem of irreproducibility