On the asymptotic properties of the group lasso estimator for linear models
Publication: 1951765
DOI: 10.1214/08-EJS200
zbMath: 1320.62167
OpenAlex: W2094311431
MaRDI QID: Q1951765
Yuval Nardi, Alessandro Rinaldo
Publication date: 24 May 2013
Published in: Electronic Journal of Statistics
Full work available at URL: https://projecteuclid.org/euclid.ejs/1217450797
Related Items
- Posterior contraction in group sparse logit models for categorical responses
- Bayesian variable selection and estimation for group Lasso
- Group variable selection in cardiopulmonary cerebral resuscitation data for veterinary patients
- The benefit of group sparsity in group inference with de-biased scaled group Lasso
- A partially linear framework for massive heterogeneous data
- Adaptive group Lasso selection in quantile models
- Modeling gene-covariate interactions in sparse regression with group structure for genome-wide association studies
- Group variable selection via convex log-exp-sum penalty with application to a breast cancer survivor study
- Trace regression model with simultaneously low rank and row(column) sparse parameter
- Bayesian linear regression for multivariate responses under group sparsity
- Variable selection based on squared derivative averages
- Globally sparse and locally dense signal recovery for compressed sensing
- Variable selection and regression analysis for graph-structured covariates with an application to genomics
- Grouped variable selection with discrete optimization: computational and statistical perspectives
- Joint association and classification analysis of multi-view data
- Non-asymptotic oracle inequalities for the Lasso and Group Lasso in high dimensional logistic model
- A penalized two-pass regression to predict stock returns with time-varying risk premia
- Individual Data Protected Integrative Regression Analysis of High-Dimensional Heterogeneous Data
- A sparse additive model for high-dimensional interactions with an exposure variable
- Tuning parameter selection for penalized estimation via \(R^2\)
- Lasso in Infinite dimension: application to variable selection in functional multivariate linear regression
- Multi-Task Learning with High-Dimensional Noisy Images
- Sharp support recovery from noisy random measurements by \(\ell_1\)-minimization
- Envelope-based sparse reduced-rank regression for multivariate linear model
- Theoretical properties of the overlapping groups Lasso
- Sparsity with sign-coherent groups of variables via the cooperative-Lasso
- The log-linear group-lasso estimator and its asymptotic properties
- Oracle inequalities and optimal inference under group sparsity
- Consistent group selection with Bayesian high dimensional modeling
- Bridge regression: adaptivity and group selection
- The benefit of group sparsity
- A Compressive Sensing Based Analysis of Anomalies in Generalized Linear Models
- A selective review of group selection in high-dimensional models
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Accuracy guaranties for \(\ell_{1}\) recovery of block-sparse signals
- Adaptive sparse group LASSO in quantile regression
- AIC for the group Lasso in generalized linear models
- Asymptotic theory of the adaptive sparse group Lasso
- Broken adaptive ridge regression for right-censored survival data
- High-dimensional generalized linear models incorporating graphical structure among predictors
- Optimization problems involving group sparsity terms
- Lasso regression and its application in forecasting macro economic indicators: a study on Vietnam's exports
- A group VISA algorithm for variable selection
- On the oracle property of adaptive group Lasso in high-dimensional linear models
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Group variable selection via a hierarchical lasso and its oracle property
- The log-linear group-lasso estimator and its asymptotic properties
- Sparsity in penalized empirical risk minimization
- Best subset selection, persistence in high-dimensional statistical learning and optimization under \(l_1\) constraint
- Concentration inequalities and model selection. École d'Été de Probabilités de Saint-Flour XXXIII -- 2003.
- Lasso-type recovery of sparse representations for high-dimensional data
- Algorithms for simultaneous sparse approximation. I: Greedy pursuit
- Asymptotic behavior of likelihood methods for exponential families when the number of parameters tends to infinity
- Persistence in high-dimensional linear predictor selection and the virtue of overparametrization
- Asymptotics for Lasso-type estimators.
- Oracle inequalities for inverse problems
- Nonconcave penalized likelihood with a diverging number of parameters.
- Least angle regression. (With discussion)
- On the asymptotics of constrained \(M\)-estimation
- Weak convergence and empirical processes. With applications to statistics
- Simultaneous analysis of Lasso and Dantzig selector
- Sparsity oracle inequalities for the Lasso
- Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators
- Aggregation for Gaussian regression
- High-dimensional graphs and variable selection with the Lasso
- The Group Lasso for Logistic Regression
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting
- Compressed and Privacy-Sensitive Sparse Regression
- Regularization and Variable Selection Via the Elastic Net
- On the Non-Negative Garrotte Estimator
- Model Selection and Estimation in Regression with Grouped Variables