The Group Square-Root Lasso: Theoretical Properties and Fast Algorithms


DOI: 10.1109/TIT.2013.2290040 · zbMath: 1364.94113 · arXiv: 1302.0261 · OpenAlex: W2963825166 · MaRDI QID: Q5346130

Florentina Bunea, Johannes Lederer, Yiyuan She

Publication date: 8 June 2017

Published in: IEEE Transactions on Information Theory

Full work available at URL: https://arxiv.org/abs/1302.0261
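For context, the display below sketches the estimator named in the title. The scaling of the loss by 1/√n and the group weights √|G_k| are common conventions and are assumed here rather than copied from the paper; see the arXiv link above for the authors' exact formulation.

% Sketch of the group square-root lasso objective (notation assumed, not quoted
% from the paper): response Y in R^n, design X in R^{n x p}, and a partition of
% {1,...,p} into groups G_1,...,G_K with coefficient blocks beta_{G_k}.
\[
  \widehat{\beta} \;\in\; \arg\min_{\beta \in \mathbb{R}^{p}}
  \left\{ \frac{\lVert Y - X\beta \rVert_{2}}{\sqrt{n}}
  \;+\; \lambda \sum_{k=1}^{K} \sqrt{\lvert G_{k} \rvert}\,
  \lVert \beta_{G_{k}} \rVert_{2} \right\}.
\]
% Because the loss enters through its (square-root) Euclidean norm rather than
% its square, a suitable lambda can be chosen without estimating the noise
% level, which is the estimator's main practical appeal over the ordinary
% group Lasso.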



Related Items

Adaptive estimation in multivariate response regression with hidden variables
The benefit of group sparsity in group inference with de-biased scaled group Lasso
Thresholding tests based on affine Lasso to achieve non-asymptotic nominal level and high power under sparse and dense alternatives in high dimension
An Efficient Algorithm for Minimizing Multi Non-Smooth Component Functions
Robust grouped variable selection using distributionally robust optimization
Group sparse recovery via group square-root elastic net and the iterative multivariate thresholding-based algorithm
A dual semismooth Newton based augmented Lagrangian method for large-scale linearly constrained sparse group square-root Lasso problems
Correcting for unknown errors in sparse high-dimensional function approximation
Optimal learning
Sparse additive models in high dimensions with wavelets
Sign-constrained least squares estimation for high-dimensional regression
Oracle inequalities for high-dimensional prediction
Prediction error bounds for linear regression with the TREX
Simultaneous feature selection and clustering based on square root optimization
Selective linearization for multi-block statistical learning
Flexible, boundary adapted, nonparametric methods for the estimation of univariate piecewise-smooth functions
Tuning-Free Heterogeneity Pursuit in Massive Networks
Sharp Oracle Inequalities for Square Root Regularization
Multidimensional linear functional estimation in sparse Gaussian models and robust estimation of the mean