The benefit of group sparsity

From MaRDI portal

Publication:987996


DOI: 10.1214/09-AOS778 · zbMath: 1202.62052 · arXiv: 0901.2962 · MaRDI QID: Q987996

Tong Zhang, Junzhou Huang

Publication date: 24 August 2010

Published in: The Annals of Statistics

Full work available at URL: https://arxiv.org/abs/0901.2962



Related Items

Posterior contraction in group sparse logit models for categorical responses
Penalized estimation of threshold auto-regressive models with many components and thresholds
The benefit of group sparsity in group inference with de-biased scaled group Lasso
A partially linear framework for massive heterogeneous data
Salt and pepper noise removal with multi-class dictionary learning and L\(_0\) norm regularizations
Testable uniqueness conditions for empirical assessment of undersampling levels in total variation-regularized X-ray CT
Structured Sparsity: Discrete and Convex Approaches
Split Bregman algorithms for sparse group lasso with application to MRI reconstruction
Split Bregman algorithms for multiple measurement vector problem
Sparsity Constrained Estimation in Image Processing and Computer Vision
Robust grouped variable selection using distributionally robust optimization
Trace regression model with simultaneously low rank and row(column) sparse parameter
Structured variable selection via prior-induced hierarchical penalty functions
Overlapping group lasso for high-dimensional generalized linear models
Error bounds for compressed sensing algorithms with group sparsity: A unified approach
On group-wise \(\ell_p\) regularization: theory and efficient algorithms
Bayesian linear regression for multivariate responses under group sparsity
Recovery of block sparse signals under the conditions on block RIC and ROC by BOMP and BOMMP
A doubly sparse approach for group variable selection
"Grouping strategies and thresholding for high dimensional linear models": discussion
Dynamic semi-parametric factor model for functional expectiles
Quantile regression with group Lasso for classification
Solving constrained nonsmooth group sparse optimization via group Capped-\(\ell_1\) relaxation and group smoothing proximal gradient algorithm
Grouped variable selection with discrete optimization: computational and statistical perspectives
Local optimality for stationary points of group zero-norm regularized problems and equivalent surrogates
Efficient nonconvex sparse group feature selection via continuous and discrete optimization
Group sparse recovery via group square-root elastic net and the iterative multivariate thresholding-based algorithm
Individual Data Protected Integrative Regression Analysis of High-Dimensional Heterogeneous Data
Lasso in infinite dimension: application to variable selection in functional multivariate linear regression
A perturbation analysis of nonconvex block-sparse compressed sensing
Support union recovery in high-dimensional multivariate regression
A sharp recovery condition for block sparse signals by block orthogonal multi-matching pursuit
Theoretical properties of the overlapping groups Lasso
Sparsity with sign-coherent groups of variables via the cooperative-Lasso
Compressed sensing of color images
Kernelized Elastic Net Regularization: Generalization Bounds, and Sparse Recovery
Oracle inequalities and optimal inference under group sparsity
High-dimensional grouped folded concave penalized estimation via the LLA algorithm
Robust face recognition via block sparse Bayesian learning
Resource-aware MPC for constrained nonlinear systems: a self-triggered control approach
On the null space property of \(l_q\)-minimization for \(0 < q \leq 1\) in compressed sensing
Discrete optimization methods for group model selection in compressed sensing
Robust inference on average treatment effects with possibly more covariates than observations
Structured, Sparse Aggregation
Estimation of (near) low-rank matrices with noise and high-dimensional scaling
High-order evaluation complexity for convexly-constrained optimization with non-Lipschitzian group sparsity terms
Gelfand numbers related to structured sparsity and Besov space embeddings with small mixed smoothness
Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach
Structured sparsity through convex optimization
A selective review of group selection in high-dimensional models
High-dimensional regression with unknown variance
A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
Sparse estimation by exponential weighting
A Note on Coding and Standardization of Categorical Variables in (Sparse) Group Lasso Regression
Computation of second-order directional stationary points for group sparse optimization
Tuning-Free Heterogeneity Pursuit in Massive Networks
Fast global convergence of gradient methods for high-dimensional statistical recovery
Tight conditions for consistency of variable selection in the context of high dimensionality
Accuracy guaranties for \(\ell_{1}\) recovery of block-sparse signals
Penalized estimation in additive varying coefficient models using grouped regularization
AIC for the group Lasso in generalized linear models
Sparse optimization for nonconvex group penalized estimation
A simple Gaussian measurement bound for exact recovery of block-sparse signals
Quantile regression feature selection and estimation with grouped variables using Huber approximation
The Convex Mixture Distribution: Granger Causality for Categorical Time Series
Group Sparse Optimization for Images Recovery Using Capped Folded Concave Functions



Cites Work