The benefit of group sparsity
Publication: Q987996
DOI: 10.1214/09-AOS778 · zbMATH Open: 1202.62052 · arXiv: 0901.2962
Authors: Junzhou Huang, Tong Zhang
Publication date: 24 August 2010
Published in: The Annals of Statistics
Abstract: This paper develops a theory for group Lasso using a concept called strong group sparsity. Our result shows that group Lasso is superior to standard Lasso for strongly group-sparse signals. This provides a convincing theoretical justification for using group sparse regularization when the underlying group structure is consistent with the data. Moreover, the theory predicts some limitations of the group Lasso formulation that are confirmed by simulation studies.
Full work available at URL: https://arxiv.org/abs/0901.2962
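The estimator the paper analyzes is the group Lasso, which solves \(\min_\beta \frac{1}{2n}\|y - X\beta\|_2^2 + \lambda \sum_{j=1}^{m} \|\beta_{G_j}\|_2\) over a fixed partition \(G_1, \dots, G_m\) of the coefficients, so that entire groups enter or leave the model together. As an illustration of this formulation only (not the authors' code; the solver choice, function name, step size, and toy data below are assumptions), here is a minimal proximal-gradient sketch in Python/NumPy using the standard block soft-thresholding prox of the group penalty:

```python
import numpy as np

def group_lasso_prox_grad(X, y, groups, lam, n_iter=500):
    """Proximal gradient (ISTA) sketch for the group Lasso objective
        (1/(2n)) * ||y - X b||_2^2 + lam * sum_j ||b_{G_j}||_2,
    where `groups` is a list of index arrays partitioning 0..p-1.
    Illustration only; names and defaults are assumptions, not from the paper.
    """
    n, p = X.shape
    # Constant step 1/L, where L = sigma_max(X)^2 / n is the Lipschitz
    # constant of the gradient of the smooth least-squares part.
    step = n / np.linalg.norm(X, 2) ** 2
    b = np.zeros(p)
    for _ in range(n_iter):
        z = b - step * (X.T @ (X @ b - y) / n)   # gradient step
        for g in groups:                          # block soft-thresholding
            norm_g = np.linalg.norm(z[g])
            shrink = max(0.0, 1.0 - step * lam / norm_g) if norm_g > 0 else 0.0
            b[g] = shrink * z[g]
    return b

# Toy usage: 3 groups of 4 coefficients, only the first group is active.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 12))
beta_true = np.concatenate([rng.standard_normal(4), np.zeros(8)])
y = X @ beta_true + 0.1 * rng.standard_normal(100)
groups = [np.arange(0, 4), np.arange(4, 8), np.arange(8, 12)]
beta_hat = group_lasso_prox_grad(X, y, groups, lam=0.1)
```

On such a toy problem the inactive groups are zeroed out as whole blocks rather than coordinate by coordinate, which is the qualitative behavior the paper's strong-group-sparsity theory concerns.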
Recommendations
- The benefit of group sparsity in group inference with de-biased scaled group Lasso
- A Benchmark for Sparse Coding: When Group Sparsity Meets Rank Minimization
- A Penalty Function Promoting Sparsity Within and Across Groups
- An Iterative Sparse-Group Lasso
- Group sparse RLS algorithms
- Group sparse optimization via \(\ell_{p,q}\) regularization
- Group-Sparse Model Selection: Hardness and Relaxations
- Group Sparse Recovery via the \(\ell^0(\ell^2)\) Penalty: Theory and Algorithm
- Sparse Group Lasso: Optimal Sample Complexity, Convergence Rate, and Statistical Inference
Keywords: Lasso; parameter estimation; variable selection; regression; sparsity; group Lasso; \(L_1\) regularization; group sparsity
MSC classifications: Nonparametric estimation (62G05) · Nonparametric regression and quantile regression (62G08) · Linear regression; mixed models (62J05)
Cites Work
- Simultaneous analysis of Lasso and Dantzig selector
- Model Selection and Estimation in Regression with Grouped Variables
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Decoding by Linear Programming
- Consistency of the group Lasso and multiple kernel learning
- Some sharp performance bounds for least squares regression with \(L_1\) regularization
- Title not available
- On the asymptotic properties of the group lasso estimator for linear models
- Compressed Sensing and Redundant Dictionaries
- On the Reconstruction of Block-Sparse Signals With an Optimal Number of Measurements
- Support union recovery in high-dimensional multivariate regression
- Multitask Compressive Sensing
- An Empirical Bayesian Strategy for Solving the Simultaneous Sparse Approximation Problem
- Sparse recovery problems in high dimensions: statistical inference and learning theory. Abstracts from the mini-workshop held March 15th -- March 21st, 2009.
- Information-theoretic upper and lower bounds for statistical estimation
Cited In (82)
- "Grouping strategies and thresholding for high dimensional linear models": discussion
- Sparsity with sign-coherent groups of variables via the cooperative-Lasso
- A Benchmark for Sparse Coding: When Group Sparsity Meets Rank Minimization
- High-dimensional grouped folded concave penalized estimation via the LLA algorithm
- Error bounds for compressed sensing algorithms with group sparsity: A unified approach
- Compressed sensing of color images
- Group sparse RLS algorithms
- Penalized estimation in additive varying coefficient models using grouped regularization
- Structured variable selection via prior-induced hierarchical penalty functions
- Kernelized elastic net regularization: generalization bounds, and sparse recovery
- Tuning-free heterogeneous inference in massive networks
- Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach
- Discrete optimization methods for group model selection in compressed sensing
- Posterior contraction in group sparse logit models for categorical responses
- Two new approaches to compressed sensing exhibiting both robust sparse recovery and the grouping effect
- Theoretical properties of the overlapping groups Lasso
- Gelfand numbers related to structured sparsity and Besov space embeddings with small mixed smoothness
- Penalized estimation of threshold auto-regressive models with many components and thresholds
- High-order evaluation complexity for convexly-constrained optimization with non-Lipschitzian group sparsity terms
- A partially linear framework for massive heterogeneous data
- Bayesian linear regression for multivariate responses under group sparsity
- Dynamic semi-parametric factor model for functional expectiles
- Structured sparsity: discrete and convex approaches
- Resource-aware MPC for constrained nonlinear systems: a self-triggered control approach
- Tight conditions for consistency of variable selection in the context of high dimensionality
- Trace regression model with simultaneously low rank and row(column) sparse parameter
- Efficient nonconvex sparse group feature selection via continuous and discrete optimization
- Sparse optimization for nonconvex group penalized estimation
- Split Bregman algorithms for sparse group lasso with application to MRI reconstruction
- Split Bregman algorithms for multiple measurement vector problem
- On group-wise \(\ell_p\) regularization: theory and efficient algorithms
- Robust face recognition via block sparse Bayesian learning
- Recovery of block sparse signals under the conditions on block RIC and ROC by BOMP and BOMMP
- A sharp recovery condition for block sparse signals by block orthogonal multi-matching pursuit
- Robust grouped variable selection using distributionally robust optimization
- Lasso in infinite dimension: application to variable selection in functional multivariate linear regression
- Sparsity constrained estimation in image processing and computer vision
- A doubly sparse approach for group variable selection
- The benefit of group sparsity in group inference with de-biased scaled group Lasso
- Testable uniqueness conditions for empirical assessment of undersampling levels in total variation-regularized X-ray CT
- Fast global convergence of gradient methods for high-dimensional statistical recovery
- On the null space property of \(l_q\)-minimization for \(0 < q \leq 1\) in compressed sensing
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Accuracy guaranties for \(\ell_{1}\) recovery of block-sparse signals
- Robust inference on average treatment effects with possibly more covariates than observations
- A simple Gaussian measurement bound for exact recovery of block-sparse signals
- A note on coding and standardization of categorical variables in (sparse) group Lasso regression
- Quantile regression feature selection and estimation with grouped variables using Huber approximation
- Structured sparsity through convex optimization
- Group Sparse Optimization for Images Recovery Using Capped Folded Concave Functions
- Support union recovery in high-dimensional multivariate regression
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- The Convex Mixture Distribution: Granger Causality for Categorical Time Series
- A perturbation analysis of nonconvex block-sparse compressed sensing
- A selective review of group selection in high-dimensional models
- Sparse estimation by exponential weighting
- Oracle inequalities and optimal inference under group sparsity
- Quantile regression with group Lasso for classification
- High-dimensional regression with unknown variance
- Group sparse recovery via group square-root elastic net and the iterative multivariate thresholding-based algorithm
- Continuous exact relaxation and alternating proximal gradient algorithm for partial sparse and partial group sparse optimization problems
- An Interactive Greedy Approach to Group Sparsity in High Dimensions
- Title not available
- Joint sparse optimization: lower-order regularization method and application in cell fate conversion
- Cardinality minimization, constraints, and regularization: a survey
- Structured, sparse aggregation
- Grouped variable selection with discrete optimization: computational and statistical perspectives
- Local optimality for stationary points of group zero-norm regularized problems and equivalent surrogates
- Overlapping group lasso for high-dimensional generalized linear models
- Salt and pepper noise removal with multi-class dictionary learning and \(L_0\) norm regularizations
- Multivariate log-contrast regression with sub-compositional predictors: testing the association between preterm infants' gut microbiome and neurobehavioral outcomes
- Simultaneous off-the-grid learning of mixtures issued from a continuous dictionary
- A Unified Approach to Sparse Tweedie Modeling of Multisource Insurance Claim Data
- Constrained mix sparse optimization via hard thresholding pursuit
- fg-ORKA: fast and gridless reconstruction of moving and deforming objects in multidimensional data
- Title not available
- Computation of second-order directional stationary points for group sparse optimization
- Solving constrained nonsmooth group sparse optimization via group Capped-\(\ell_1\) relaxation and group smoothing proximal gradient algorithm
- Individual Data Protected Integrative Regression Analysis of High-Dimensional Heterogeneous Data
- AIC for the group Lasso in generalized linear models
- Moving force identification based on group Lasso and compressed sensing
- Stable recovery of approximately block \(k\)-sparse signals with partial block support information via weighted \(\ell_2/\ell_p\) (\(0 < p \leq 1\)) minimization