Abstract: This paper develops a theory for the group Lasso based on a concept called strong group sparsity. The results show that the group Lasso is superior to the standard Lasso for strongly group-sparse signals, which provides a convincing theoretical justification for group-sparse regularization when the underlying group structure is consistent with the data. Moreover, the theory predicts some limitations of the group Lasso formulation that are confirmed by simulation studies.
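The group-Lasso estimator discussed in the abstract penalizes the Euclidean norm of each coefficient group, so whole groups are selected or zeroed out together. A minimal proximal-gradient (ISTA) sketch is below; the solver, step size, penalty level, and synthetic data are illustrative assumptions, not the paper's own algorithm or experiments:

```python
import numpy as np

def group_soft_threshold(w, groups, threshold):
    """Block soft-thresholding: proximal operator of
    threshold * sum_g ||w_g||_2 (unweighted groups for simplicity)."""
    out = w.copy()
    for g in groups:
        norm = np.linalg.norm(w[g])
        out[g] = 0.0 if norm <= threshold else (1.0 - threshold / norm) * w[g]
    return out

def group_lasso(X, y, groups, lam, n_iter=500):
    """ISTA for (1/2n)||y - Xw||^2 + lam * sum_g ||w_g||_2."""
    n, p = X.shape
    w = np.zeros(p)
    # Step size = 1 / Lipschitz constant of the smooth part's gradient.
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 / n)
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n
        w = group_soft_threshold(w - step * grad, groups, step * lam)
    return w

# Strongly group-sparse toy signal: 2 active groups out of 5 (4 features each).
rng = np.random.default_rng(0)
groups = [list(range(4 * g, 4 * g + 4)) for g in range(5)]
w_true = np.zeros(20)
w_true[0:4] = 1.0    # group 0 active
w_true[8:12] = -1.0  # group 2 active
X = rng.standard_normal((100, 20))
y = X @ w_true + 0.1 * rng.standard_normal(100)

w_hat = group_lasso(X, y, groups, lam=0.1)
active = [g for g, idx in enumerate(groups)
          if np.linalg.norm(w_hat[idx]) > 1e-8]
```

Because the prox step zeroes entire groups at once, `active` recovers the group-level support, which is the behavior the paper's strong-group-sparsity theory analyzes.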
Recommendations
- The benefit of group sparsity in group inference with de-biased scaled group Lasso
- A Benchmark for Sparse Coding: When Group Sparsity Meets Rank Minimization
- A Penalty Function Promoting Sparsity Within and Across Groups
- An Iterative Sparse-Group Lasso
- Group sparse RLS algorithms
- Group sparse optimization via \(\ell_{p,q}\) regularization
- Group-Sparse Model Selection: Hardness and Relaxations
- Group Sparse Recovery via the \(\ell^0(\ell^2)\) Penalty: Theory and Algorithm
- Sparse Group Lasso: Optimal Sample Complexity, Convergence Rate, and Statistical Inference
Cites work
- scientific article; zbMATH DE number 194093 (title unavailable)
- An Empirical Bayesian Strategy for Solving the Simultaneous Sparse Approximation Problem
- Compressed Sensing and Redundant Dictionaries
- Consistency of the group Lasso and multiple kernel learning
- Decoding by Linear Programming
- Information-theoretic upper and lower bounds for statistical estimation
- Model Selection and Estimation in Regression with Grouped Variables
- Multitask Compressive Sensing
- On the Reconstruction of Block-Sparse Signals With an Optimal Number of Measurements
- On the asymptotic properties of the group lasso estimator for linear models
- Simultaneous analysis of Lasso and Dantzig selector
- Some sharp performance bounds for least squares regression with L₁ regularization
- Sparse recovery problems in high dimensions: statistical inference and learning theory. Abstracts from the mini-workshop held March 15th -- March 21st, 2009.
- Support union recovery in high-dimensional multivariate regression
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
Cited in (82 documents)
- Grouping strategies and thresholding for high dimensional linear models: discussion
- Sparsity with sign-coherent groups of variables via the cooperative-Lasso
- High-dimensional regression with unknown variance
- High-dimensional grouped folded concave penalized estimation via the LLA algorithm
- Compressed sensing of color images
- Error bounds for compressed sensing algorithms with group sparsity: A unified approach
- A Benchmark for Sparse Coding: When Group Sparsity Meets Rank Minimization
- Group sparse recovery via group square-root elastic net and the iterative multivariate thresholding-based algorithm
- Continuous exact relaxation and alternating proximal gradient algorithm for partial sparse and partial group sparse optimization problems
- An Interactive Greedy Approach to Group Sparsity in High Dimensions
- Group sparse RLS algorithms
- scientific article; zbMATH DE number 7370643 (title unavailable)
- Penalized estimation in additive varying coefficient models using grouped regularization
- Structured variable selection via prior-induced hierarchical penalty functions
- Tuning-free heterogeneous inference in massive networks
- Kernelized elastic net regularization: generalization bounds, and sparse recovery
- Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach
- Posterior contraction in group sparse logit models for categorical responses
- Discrete optimization methods for group model selection in compressed sensing
- Joint sparse optimization: lower-order regularization method and application in cell fate conversion
- Structured, sparse aggregation
- Cardinality minimization, constraints, and regularization: a survey
- Two new approaches to compressed sensing exhibiting both robust sparse recovery and the grouping effect
- Theoretical properties of the overlapping groups Lasso
- Gelfand numbers related to structured sparsity and Besov space embeddings with small mixed smoothness
- Penalized estimation of threshold auto-regressive models with many components and thresholds
- High-order evaluation complexity for convexly-constrained optimization with non-Lipschitzian group sparsity terms
- Grouped variable selection with discrete optimization: computational and statistical perspectives
- Local optimality for stationary points of group zero-norm regularized problems and equivalent surrogates
- Salt and pepper noise removal with multi-class dictionary learning and \(L_0\) norm regularizations
- Overlapping group lasso for high-dimensional generalized linear models
- A partially linear framework for massive heterogeneous data
- Bayesian linear regression for multivariate responses under group sparsity
- Dynamic semi-parametric factor model for functional expectiles
- Resource-aware MPC for constrained nonlinear systems: a self-triggered control approach
- Tight conditions for consistency of variable selection in the context of high dimensionality
- Structured sparsity: discrete and convex approaches
- Trace regression model with simultaneously low rank and row(column) sparse parameter
- Efficient nonconvex sparse group feature selection via continuous and discrete optimization
- Multivariate log-contrast regression with sub-compositional predictors: testing the association between preterm infants' gut microbiome and neurobehavioral outcomes
- Split Bregman algorithms for sparse group lasso with application to MRI reconstruction
- Split Bregman algorithms for multiple measurement vector problem
- On group-wise \(\ell_p\) regularization: theory and efficient algorithms
- Simultaneous off-the-grid learning of mixtures issued from a continuous dictionary
- Sparse optimization for nonconvex group penalized estimation
- Robust face recognition via block sparse Bayesian learning
- A Unified Approach to Sparse Tweedie Modeling of Multisource Insurance Claim Data
- Recovery of block sparse signals under the conditions on block RIC and ROC by BOMP and BOMMP
- A sharp recovery condition for block sparse signals by block orthogonal multi-matching pursuit
- Robust grouped variable selection using distributionally robust optimization
- Constrained mix sparse optimization via hard thresholding pursuit
- fg-ORKA: fast and gridless reconstruction of moving and deforming objects in multidimensional data
- The benefit of group sparsity in group inference with de-biased scaled group Lasso
- Lasso in Infinite dimension: application to variable selection in functional multivariate linear regression
- A doubly sparse approach for group variable selection
- scientific article; zbMATH DE number 7594592 (title unavailable)
- Sparsity constrained estimation in image processing and computer vision
- Computation of second-order directional stationary points for group sparse optimization
- Fast global convergence of gradient methods for high-dimensional statistical recovery
- Testable uniqueness conditions for empirical assessment of undersampling levels in total variation-regularized X-ray CT
- On the null space property of \(l_q\)-minimization for \(0 < q \leq 1\) in compressed sensing
- Solving constrained nonsmooth group sparse optimization via group Capped-\(\ell_1\) relaxation and group smoothing proximal gradient algorithm
- Accuracy guaranties for \(\ell_{1}\) recovery of block-sparse signals
- Robust inference on average treatment effects with possibly more covariates than observations
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- A note on coding and standardization of categorical variables in (sparse) group Lasso regression
- AIC for the group Lasso in generalized linear models
- Individual Data Protected Integrative Regression Analysis of High-Dimensional Heterogeneous Data
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Quantile regression feature selection and estimation with grouped variables using Huber approximation
- A simple Gaussian measurement bound for exact recovery of block-sparse signals
- Support union recovery in high-dimensional multivariate regression
- Group Sparse Optimization for Images Recovery Using Capped Folded Concave Functions
- Structured sparsity through convex optimization
- A perturbation analysis of nonconvex block-sparse compressed sensing
- The Convex Mixture Distribution: Granger Causality for Categorical Time Series
- Moving force identification based on group Lasso and compressed sensing
- Oracle inequalities and optimal inference under group sparsity
- A selective review of group selection in high-dimensional models
- Sparse estimation by exponential weighting
- Quantile regression with group Lasso for classification
- Stable recovery of approximately block \(k\)-sparse signals with partial block support information via weighted \(\ell_2/\ell_p\) (\(0 < p \leq 1\)) minimization
This page was built for publication: The benefit of group sparsity