Properties and refinements of the fused Lasso
From MaRDI portal
Abstract: We consider estimating an unknown signal, both blocky and sparse, which is corrupted by additive noise. We study three interrelated least squares procedures and their asymptotic properties. The first procedure is the fused lasso, put forward by Friedman et al. [Ann. Appl. Statist. 1 (2007) 302--332], which we modify into a different estimator, called the fused adaptive lasso, with better properties. The other two estimators we discuss solve least squares problems on sieves; one constrains the maximal norm and the maximal total variation seminorm, and the other restricts the number of blocks and the number of nonzero coordinates of the signal. We derive conditions for the recovery of the true block partition and the true sparsity patterns by the fused lasso and the fused adaptive lasso, and we derive convergence rates for the sieve estimators, explicitly in terms of the constraining parameters.
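The fused lasso signal approximator described above penalizes both the \(\ell_1\) norm of the signal and the total variation of its successive differences. A convenient property, shown by Friedman et al. (2007), is that its solution can be computed in two steps: first solve the fusion-only (total-variation denoising) problem, then soft-threshold the result. The sketch below illustrates this decomposition on a toy blocky signal; the projected-gradient dual solver and the function names `tv_denoise_1d` and `fused_lasso_approximator` are illustrative choices for this sketch, not algorithms or names from the paper.

```python
import numpy as np

def tv_denoise_1d(y, lam2, n_iter=5000, tau=0.24):
    """Minimize 0.5*||y - b||^2 + lam2 * sum_i |b[i+1] - b[i]|
    by projected gradient on the dual problem (an illustrative solver,
    not the paper's method).  tau < 1/4 <= 1/lambda_max(D D^T) for the
    1-D difference operator D, so the iteration is stable."""
    y = np.asarray(y, dtype=float)
    n = y.size
    if n < 2 or lam2 == 0.0:
        return y.copy()
    p = np.zeros(n - 1)  # dual variable, one entry per difference
    for _ in range(n_iter):
        # D^T p, where (D b)_i = b[i+1] - b[i]
        Dt_p = np.concatenate(([0.0], p)) - np.concatenate((p, [0.0]))
        grad = np.diff(Dt_p - y)                   # gradient of 0.5*||D^T p - y||^2
        p = np.clip(p - tau * grad, -lam2, lam2)   # project onto the box |p| <= lam2
    Dt_p = np.concatenate(([0.0], p)) - np.concatenate((p, [0.0]))
    return y - Dt_p                                # recover the primal solution

def fused_lasso_approximator(y, lam1, lam2):
    """Fused lasso signal approximator via the two-step decomposition
    of Friedman et al. (2007): TV-denoise, then soft-threshold by lam1."""
    b = tv_denoise_1d(y, lam2)
    return np.sign(b) * np.maximum(np.abs(b) - lam1, 0.0)

# Toy blocky signal with two blocks of values 0 and 1.
y = np.array([0.0, 0.0, 1.0, 1.0])
print(tv_denoise_1d(y, 0.2))                   # the two blocks shrink toward each other
print(fused_lasso_approximator(y, 0.05, 0.2))  # fusion first, then sparsification
```

For this four-point example the fusion step shrinks each block mean by `lam2` divided by the block length, and the soft-thresholding step then subtracts `lam1` from the magnitude of every surviving coordinate, which is how the estimator produces solutions that are simultaneously blocky and sparse.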
Recommendations
- Some properties of generalized fused Lasso and its applications to high dimensional data
- Sparsity and Smoothness Via the Fused Lasso
- Properties and iterative methods for the lasso and its variants
- On the robustness of the generalized fused Lasso to prior specifications
- Coordinate optimization for generalized fused Lasso
- On stepwise pattern recovery of the fused Lasso
- Fused multiple graphical lasso
- Fused Lasso penalized least absolute deviation estimator for high dimensional linear regression
- Properties and iterative methods for the \(Q\)-lasso
- Fused Lasso with the adaptation of parameter ordering in combining multiple studies with repeated measurements
Cites work
- scientific article; zbMATH DE number 5957408
- scientific article; zbMATH DE number 5654889
- scientific article; zbMATH DE number 49190
- scientific article; zbMATH DE number 1215245
- scientific article; zbMATH DE number 713342
- Adaptive estimation with soft thresholding penalties
- Best subset selection, persistence in high-dimensional statistical learning and optimization under \(l_1\) constraint
- Concentration inequalities and model selection. Ecole d'Eté de Probabilités de Saint-Flour XXXIII -- 2003.
- Consistencies and rates of convergence of jump-penalized least squares estimators
- Convergence of an Iterative Method for Total Variation Denoising
- Enumerative combinatorics. Volume 2.
- Local extremes, runs, strings and multiresolution. (With discussion)
- Locally adaptive regression splines
- Minimax risk over \(l_p\)-balls for \(l_q\)-error
- Nonlinear total variation based noise removal algorithms
- Pathwise coordinate optimization
- The Discontinuity Set of Solutions of the TV Denoising Problem and Some Extensions
- Weak convergence and empirical processes. With applications to statistics
Cited in (54 documents)
- Approximate \(\ell_0\)-penalized estimation of piecewise-constant signals on graphs
- Estimating time-varying networks
- Pairwise fusion approach incorporating prior constraint information
- Individualized Multidirectional Variable Selection
- On stepwise pattern recovery of the fused Lasso
- Nonuniqueness of solutions of a class of \(\ell_0\)-minimization problems
- A Unified Framework for Change Point Detection in High-Dimensional Linear Models
- An extended linearized alternating direction method of multipliers for fused-Lasso penalized linear regression
- The DFS fused Lasso: linear-time denoising over general graphs
- Multiple change-point detection: a selective overview
- The smooth-Lasso and other \(\ell _{1}+\ell _{2}\)-penalized methods
- An algorithm for iterative selection of blocks of features
- Univariate mean change point detection: penalization, CUSUM and optimality
- Fused Lasso nearly-isotonic signal approximation in general dimensions
- Group sparse structural smoothing recovery: model, statistical properties and algorithm
- Fused Lasso algorithm for Cox' proportional hazards and binomial logit models with application to copy number profiles
- Asymptotic of the number of false change points of the fused lasso signal approximator
- Fused Lasso penalized least absolute deviation estimator for high dimensional linear regression
- Shrinkage estimation of regression models with multiple structural changes
- Sparse regression with multi-type regularized feature modeling
- Modified path algorithm of fused Lasso signal approximator for consistent recovery of change points
- Simultaneous Grouping Pursuit and Feature Selection Over an Undirected Graph
- A general framework for tensor screening through smoothing
- Penalized estimation of threshold auto-regressive models with many components and thresholds
- scientific article; zbMATH DE number 7370569
- Outlier detection in time series via mixed-integer conic quadratic optimization
- Modular proximal optimization for multidimensional total-variation regularization
- Tuning parameter selection in fused lasso signal approximator with false discovery rate control
- Properties and iterative methods for the \(Q\)-lasso
- Simultaneous feature selection and clustering based on square root optimization
- Penalized differential pathway analysis of integrative oncogenomics studies
- Adaptive nonparametric regression with the \(K\)-nearest neighbour fused Lasso
- Horseshoe shrinkage methods for Bayesian fusion estimation
- Solution uniqueness of convex piecewise affine functions based optimization with applications to constrained \(\ell_1\) minimization
- Bayesian fusion estimation via \(t\) shrinkage
- Shrinkage estimation of common breaks in panel data models via adaptive group fused Lasso
- Exact spike train inference via \(\ell_{0}\) optimization
- High-dimensional variable selection accounting for heterogeneity in regression coefficients across multiple data sources
- Element-wise estimation error of generalized Fused Lasso
- Mixed-effect time-varying network model and application in brain connectivity analysis
- Wild binary segmentation for multiple change-point detection
- Dual-density-based reweighted \(\ell_1\)-algorithms for a class of \(\ell_0\)-minimization problems
- Path algorithms for fused lasso signal approximator with application to COVID‐19 spread in Korea
- More Powerful Selective Inference for the Graph Fused Lasso
- A new active zero set descent algorithm for least absolute deviation with generalized LASSO penalty
- Empirical priors and posterior concentration in a piecewise polynomial sequence model
- Detecting possibly frequent change-points: wild binary segmentation 2 and steepest-drop model selection
- The screening and ranking algorithm to detect DNA copy number variations
- Linearized alternating direction method of multipliers for sparse group and fused Lasso models
- Partially Observed Dynamic Tensor Response Regression
- A cluster elastic net for multivariate regression
- Tail-greedy bottom-up data decompositions and fast multiple change-point detection
- Adaptive gPCA: a method for structured dimensionality reduction with applications to microbiome data
- Estimating networks with jumps