Exact post-selection inference for the generalized Lasso path
From MaRDI portal
Publication:1746554
Abstract: We study tools for inference conditioned on model selection events that are defined by the generalized lasso regularization path. The generalized lasso estimate is given by the solution of a penalized least squares regression problem, where the penalty is the \(\ell_1\) norm of a matrix D times the coefficient vector. The generalized lasso path collects these estimates for a range of penalty parameter (\(\lambda\)) values. Leveraging a sequential characterization of this path from Tibshirani & Taylor (2011), and recent advances in post-selection inference from Lee et al. (2016) and Tibshirani et al. (2016), we develop exact hypothesis tests and confidence intervals for linear contrasts of the underlying mean vector, conditioned on any model selection event along the generalized lasso path (assuming Gaussian errors in the observations). By inspecting specific choices of D, we obtain post-selection tests and confidence intervals for specific cases of generalized lasso estimates, such as the fused lasso, trend filtering, and the graph fused lasso. In the fused lasso case, the underlying coordinates of the mean are assigned a linear ordering, and our framework allows us to test selectively chosen breakpoints or changepoints in these mean coordinates. This is an interesting and well-studied problem with broad applications; our framework applied to trend filtering and the graph fused lasso serves several applications as well. Aside from the development of selective inference tools, we describe several practical aspects of our methods, such as valid post-processing of generalized lasso estimates before performing inference in order to improve power, and problem-specific visualization aids that help the data analyst choose linear contrasts to be tested. Many examples, from both simulated and real data sources, are presented to examine the empirical properties of our inference methods.
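The penalized problem described above, in its signal-approximation form (identity design matrix), can be sketched with a generic ADMM solver. This is only an illustrative implementation under assumed defaults (ADMM penalty rho, iteration count, a first-difference D for the fused lasso case), not the path algorithm of Tibshirani & Taylor (2011) that the paper builds on, and none of the function names below come from the paper.

```python
import numpy as np

def soft_threshold(a, kappa):
    """Elementwise soft-thresholding: prox of kappa * |.|_1."""
    return np.sign(a) * np.maximum(np.abs(a) - kappa, 0.0)

def generalized_lasso_signal(y, D, lam, rho=1.0, n_iter=500):
    """Approximately solve min_beta 0.5*||y - beta||^2 + lam*||D beta||_1
    via ADMM (illustrative sketch; not the paper's path algorithm)."""
    n, m = len(y), D.shape[0]
    z = np.zeros(m)
    u = np.zeros(m)
    # The beta-update solves a fixed linear system; factor/invert once.
    A_inv = np.linalg.inv(np.eye(n) + rho * D.T @ D)
    for _ in range(n_iter):
        beta = A_inv @ (y + rho * D.T @ (z - u))
        z = soft_threshold(D @ beta + u, lam / rho)
        u = u + D @ beta - z
    return beta

# Fused lasso special case: D is the first-difference operator, so
# ||D beta||_1 penalizes jumps between adjacent mean coordinates.
n = 20
D = np.diff(np.eye(n), axis=0)          # (n-1) x n first differences
rng = np.random.default_rng(0)
y = np.concatenate([np.zeros(10), np.ones(10)]) + 0.05 * rng.standard_normal(n)

lam = 0.5
beta_hat = generalized_lasso_signal(y, D, lam)
objective = lambda b: 0.5 * np.sum((y - b) ** 2) + lam * np.sum(np.abs(D @ b))
```

For a step-shaped mean as in `y` above, the estimate is roughly piecewise constant, with the surviving jump corresponding to the kind of selected changepoint the paper's framework then tests.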
Recommendations
Cites work
- scientific article; zbMATH DE number 425941
- scientific article; zbMATH DE number 1485432
- $\ell_1$ Trend Filtering
- A significance test for the lasso
- Adaptive piecewise polynomial estimation via trend filtering
- Asymptotics of selective inference
- Can one estimate the unconditional distribution of post-model-selection estimators?
- Can one estimate the conditional distribution of post-model-selection estimators?
- Confidence regions and tests for a change-point in a sequence of exponential family random variables
- Exact post-selection inference, with application to the Lasso
- Extended Bayesian information criteria for model selection with large model spaces
- Extensions of some classical methods in change point analysis
- Inference about the change-point in a sequence of random variables
- Inference for single and multiple change-points in time series
- Likelihood ratio tests for multiple structural changes
- Multiscale change point inference. With discussion and authors' reply
- Nonlinear total variation based noise removal algorithms
- On total variation minimization and surface evolution using parametric maximum flows
- Pathwise coordinate optimization
- Post-selection point and interval estimation of signal sizes in Gaussian samples
- Selecting the number of principal components: estimation of the true rank of a noisy matrix
- Selective inference with a randomized response
- Sequential selection procedures and false discovery rate control
- Sparsity and Smoothness Via the Fused Lasso
- Spatial smoothing and hot spot detection for CGH data using the fused lasso
- Splines in higher order TV regularization
- The finite-sample distribution of post-model-selection estimators and uniform versus nonuniform approximations
- The solution path of the generalized lasso
- Trend filtering on graphs
- Uniform asymptotic inference and the bootstrap after model selection
- Valid post-selection inference
- Wild binary segmentation for multiple change-point detection
Cited in (15)
- Changepoint detection by the quantile Lasso method
- Selective Inference for Hierarchical Clustering
- More Powerful Selective Inference for the Graph Fused Lasso
- Integrative methods for post-selection inference under convex constraints
- Post-selection inference for \(\ell_1\)-penalized likelihood models
- Post-model-selection inference in linear regression models: an integrated review
- Activation discovery with FDR control: application to fMRI data
- Trade-off between predictive performance and FDR control for high-dimensional Gaussian model selection
- An integrated panel data approach to modelling economic growth
- Narrowest Significance Pursuit: Inference for Multiple Change-Points in Linear Models
- Solution paths for the generalized Lasso with applications to spatially varying coefficients regression
- Post‐selection inference for changepoint detection algorithms with application to copy number variation data
- Exact post-selection inference, with application to the Lasso
- Post-selection inference of generalized linear models based on the lasso and the elastic net
- Bounded \(p\) values in parametric programming-based selective inference