On stepwise pattern recovery of the fused Lasso
Abstract: We study the property of the Fused Lasso Signal Approximator (FLSA) for estimating a blocky signal sequence with additive noise. We transform the FLSA into an ordinary Lasso problem. By studying the properties of the design matrix in the transformed Lasso problem, we find that the irrepresentable condition might not hold, in which case we show that the FLSA might not be able to recover the signal pattern. We then apply the newly developed preconditioning method, the Puffer Transformation [Jia and Rohe, 2012], to the transformed Lasso problem. We call the new method the preconditioned fused Lasso and give non-asymptotic results for it. The results show that when the signal jump strength (the signal difference between two neighboring groups) is large and the noise level is small, our preconditioned fused Lasso estimator recovers the correct pattern with high probability. The theoretical results give insight into what controls signal pattern recovery: it is the noise level rather than the length of the sequence. Simulations confirm our theorems and show a significant improvement of the preconditioned fused Lasso estimator over the vanilla FLSA.
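A minimal sketch of the two steps described in the abstract, under illustrative assumptions: the FLSA is rewritten as an ordinary Lasso by expressing the signal through its successive differences (so the design matrix is the lower-triangular matrix of ones), and the Puffer Transformation of Jia and Rohe is applied before solving the Lasso. This is not the authors' implementation; the signal, noise level, and tuning parameter `lam` are hypothetical choices for illustration.

```python
# Sketch only: FLSA -> Lasso transform plus Puffer-transform preconditioning.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Blocky signal of length n with a single jump, observed with additive noise.
n = 100
beta_true = np.concatenate([np.zeros(n // 2), 3.0 * np.ones(n // 2)])
y = beta_true + 0.3 * rng.standard_normal(n)

# FLSA -> Lasso: write beta = X @ theta, where theta holds the successive
# differences (theta_1 is the baseline level) and X is the lower-triangular
# matrix of ones. Note: the baseline theta_1 is also penalized here, a
# simplification relative to the exact FLSA.
X = np.tril(np.ones((n, n)))

# Puffer Transformation: with the SVD X = U D V^T, premultiply X and y by
# F = U D^{-1} U^T, so that the preconditioned design F X = U V^T is orthogonal.
U, d, Vt = np.linalg.svd(X, full_matrices=False)
F = U @ np.diag(1.0 / d) @ U.T
FX, Fy = F @ X, F @ y

# Lasso on the preconditioned problem; nonzero differences mark estimated jumps.
lam = 0.02  # illustrative value
theta_hat = Lasso(alpha=lam, fit_intercept=False, max_iter=10000).fit(FX, Fy).coef_
jumps = np.flatnonzero(np.abs(theta_hat[1:]) > 1e-8) + 1
print("estimated jump locations:", jumps)

beta_hat = X @ theta_hat  # recovered blocky signal
```

Because the preconditioned design is orthogonal, this Lasso reduces to soft-thresholding the successive differences of y, which is consistent with the abstract's point that the noise level, rather than the sequence length, governs pattern recovery.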
Cites work
- scientific article; zbMATH DE number 5957408
- scientific article; zbMATH DE number 845714
- scientific article; zbMATH DE number 7679348
- A Singular Value Thresholding Algorithm for Matrix Completion
- Asymptotics for Lasso-type estimators
- High-dimensional graphs and variable selection with the Lasso
- Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting
- Just relax: convex programming methods for identifying sparse signals in noise
- Multiple Change-Point Estimation With a Total Variation Penalty
- Pathwise coordinate optimization
- Persistence in high-dimensional linear predictor selection and the virtue of overparametrization
- Preconditioning the Lasso for sign consistency
- Properties and refinements of the fused Lasso
- Sparsity and Smoothness Via the Fused Lasso
- Spatial smoothing and hot spot detection for CGH data using the fused lasso
- Stable recovery of sparse overcomplete representations in the presence of noise
- The lasso under Poisson-like heteroscedasticity
- The solution path of the generalized lasso
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
Cited in (15)
- Orthogonal one step greedy procedure for heteroscedastic linear models
- Oracle efficient estimation of structural breaks in cointegrating regressions
- The DFS fused Lasso: linear-time denoising over general graphs
- Multiple change-point detection: a selective overview
- A modified information criterion for tuning parameter selection in 1d fused LASSO for inference on multiple change points
- Modified path algorithm of fused Lasso signal approximator for consistent recovery of change points
- Tuning parameter selection in fused lasso signal approximator with false discovery rate control
- Preconditioning the Lasso for sign consistency
- Exact spike train inference via \(\ell_{0}\) optimization
- Prediction bounds for higher order total variation regularized least squares
- Path algorithms for fused lasso signal approximator with application to COVID-19 spread in Korea
- Optimal covariance change point localization in high dimensions
- Properties and refinements of the fused Lasso
- Empirical priors and posterior concentration in a piecewise polynomial sequence model
- On the total variation regularized estimator over a class of tree graphs