Properties and refinements of the fused Lasso
Publication: 834368
DOI: 10.1214/08-AOS665
zbMATH Open: 1173.62027
arXiv: 0805.0234
MaRDI QID: Q834368
FDO: Q834368
Author: Alessandro Rinaldo
Publication date: 19 August 2009
Published in: The Annals of Statistics
Abstract: We consider estimating an unknown signal, both blocky and sparse, which is corrupted by additive noise. We study three interrelated least squares procedures and their asymptotic properties. The first procedure is the fused lasso, put forward by Friedman et al. [Ann. Appl. Statist. 1 (2007) 302--332], which we modify into a different estimator, called the fused adaptive lasso, with better properties. The other two estimators we discuss solve least squares problems on sieves; one constrains the maximal norm and the maximal total variation seminorm, and the other restricts the number of blocks and the number of nonzero coordinates of the signal. We derive conditions for the recovery of the true block partition and the true sparsity patterns by the fused lasso and the fused adaptive lasso, and we derive convergence rates for the sieve estimators, explicitly in terms of the constraining parameters.
Full work available at URL: https://arxiv.org/abs/0805.0234
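For concreteness, the fused lasso signal approximator studied in this paper solves a least squares problem with an \(\ell_1\) penalty on the coordinates (sparsity) and an \(\ell_1\) penalty on successive differences (blockiness). The following is a minimal sketch using cvxpy; the simulated signal and the tuning parameters lam1 and lam2 are illustrative assumptions, not taken from the paper.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)

# A blocky, sparse true signal corrupted by additive Gaussian noise,
# matching the setting described in the abstract (values are illustrative).
theta = np.concatenate([np.zeros(40), 2.0 * np.ones(20), np.zeros(40)])
y = theta + 0.5 * rng.standard_normal(theta.size)

beta = cp.Variable(y.size)
lam1, lam2 = 0.1, 1.0  # illustrative tuning parameters, not from the paper
objective = cp.Minimize(
    0.5 * cp.sum_squares(y - beta)
    + lam1 * cp.norm1(beta)                  # l1 penalty: drives coordinates to zero
    + lam2 * cp.norm1(beta[1:] - beta[:-1])  # total-variation penalty: fuses neighbors into blocks
)
cp.Problem(objective).solve()
beta_hat = beta.value  # piecewise-constant, sparse estimate of theta
```

The two penalties act jointly: the first sets individual coordinates exactly to zero, while the second forces neighboring coordinates to share a common value, yielding the block structure that the paper's partition-recovery results concern.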
Recommendations
- Some properties of generalized fused Lasso and its applications to high dimensional data
- Sparsity and Smoothness Via the Fused Lasso
- Properties and iterative methods for the lasso and its variants
- On the robustness of the generalized fused Lasso to prior specifications
- Coordinate optimization for generalized fused Lasso
- On stepwise pattern recovery of the fused Lasso
- Fused multiple graphical lasso
- Fused Lasso penalized least absolute deviation estimator for high dimensional linear regression
- Properties and iterative methods for the \(Q\)-lasso
- Fused Lasso with the adaptation of parameter ordering in combining multiple studies with repeated measurements
Mathematics Subject Classification
- Nonparametric regression and quantile regression (62G08)
- Asymptotic properties of nonparametric inference (62G20)
Cites Work
- Local extremes, runs, strings and multiresolution. (With discussion)
- Nonlinear total variation based noise removal algorithms
- Weak convergence and empirical processes. With applications to statistics
- Pathwise coordinate optimization
- Title not available
- Consistencies and rates of convergence of jump-penalized least squares estimators
- Locally adaptive regression splines
- Title not available
- Title not available
- Minimax risk over \(l_p\)-balls for \(l_q\)-error
- Title not available
- Concentration inequalities and model selection. École d'Été de Probabilités de Saint-Flour XXXIII -- 2003.
- The Discontinuity Set of Solutions of the TV Denoising Problem and Some Extensions
- Enumerative combinatorics. Volume 2.
- Best subset selection, persistence in high-dimensional statistical learning and optimization under \(l_1\) constraint
- Title not available
- Convergence of an Iterative Method for Total Variation Denoising
- Adaptive estimation with soft thresholding penalties
Cited In (54)
- A Unified Framework for Change Point Detection in High-Dimensional Linear Models
- An extended linearized alternating direction method of multipliers for fused-Lasso penalized linear regression
- Pairwise fusion approach incorporating prior constraint information
- Estimating time-varying networks
- Nonuniqueness of solutions of a class of \(\ell_0\)-minimization problems
- On stepwise pattern recovery of the fused Lasso
- The DFS fused Lasso: linear-time denoising over general graphs
- Multiple change-point detection: a selective overview
- The smooth-Lasso and other \(\ell _{1}+\ell _{2}\)-penalized methods
- An algorithm for iterative selection of blocks of features
- Fused Lasso nearly-isotonic signal approximation in general dimensions
- Group sparse structural smoothing recovery: model, statistical properties and algorithm
- Univariate mean change point detection: penalization, CUSUM and optimality
- Fused Lasso algorithm for Cox' proportional hazards and binomial logit models with application to copy number profiles
- Fused Lasso penalized least absolute deviation estimator for high dimensional linear regression
- Shrinkage estimation of regression models with multiple structural changes
- Sparse regression with multi-type regularized feature modeling
- Modified path algorithm of fused Lasso signal approximator for consistent recovery of change points
- Simultaneous Grouping Pursuit and Feature Selection Over an Undirected Graph
- A general framework for tensor screening through smoothing
- Penalized estimation of threshold auto-regressive models with many components and thresholds
- Title not available
- Outlier detection in time series via mixed-integer conic quadratic optimization
- Modular proximal optimization for multidimensional total-variation regularization
- Tuning parameter selection in fused lasso signal approximator with false discovery rate control
- Properties and iterative methods for the \(Q\)-lasso
- Simultaneous feature selection and clustering based on square root optimization
- Penalized differential pathway analysis of integrative oncogenomics studies
- Adaptive nonparametric regression with the \(K\)-nearest neighbour fused Lasso
- Solution uniqueness of convex piecewise affine functions based optimization with applications to constrained \(\ell_1\) minimization
- Horseshoe shrinkage methods for Bayesian fusion estimation
- Bayesian fusion estimation via \(t\) shrinkage
- Shrinkage estimation of common breaks in panel data models via adaptive group fused Lasso
- Mixed-effect time-varying network model and application in brain connectivity analysis
- Exact spike train inference via \(\ell_{0}\) optimization
- Dual-density-based reweighted \(\ell_1\)-algorithms for a class of \(\ell_0\)-minimization problems
- Wild binary segmentation for multiple change-point detection
- Path algorithms for fused lasso signal approximator with application to COVID‐19 spread in Korea
- More Powerful Selective Inference for the Graph Fused Lasso
- A new active zero set descent algorithm for least absolute deviation with generalized LASSO penalty
- Detecting possibly frequent change-points: wild binary segmentation 2 and steepest-drop model selection
- Partially Observed Dynamic Tensor Response Regression
- The screening and ranking algorithm to detect DNA copy number variations
- Linearized alternating direction method of multipliers for sparse group and fused Lasso models
- A cluster elastic net for multivariate regression
- Adaptive gPCA: a method for structured dimensionality reduction with applications to microbiome data
- Tail-greedy bottom-up data decompositions and fast multiple change-point detection
- Estimating networks with jumps
- Individualized Multidirectional Variable Selection
- Approximate \(\ell_0\)-penalized estimation of piecewise-constant signals on graphs
- Asymptotic of the number of false change points of the fused lasso signal approximator
- High-dimensional variable selection accounting for heterogeneity in regression coefficients across multiple data sources
- Element-wise estimation error of generalized Fused Lasso
- Empirical priors and posterior concentration in a piecewise polynomial sequence model