Oracle inequalities and optimal inference under group sparsity
From MaRDI portal
Abstract: We consider the problem of estimating a sparse linear regression vector \(\beta^*\) under a Gaussian noise model, for the purpose of both prediction and model selection. We assume that prior knowledge is available on the sparsity pattern: the set of variables is partitioned into prescribed groups, only few of which are relevant in the estimation process. This group sparsity assumption suggests considering the Group Lasso method as a means to estimate \(\beta^*\). We establish oracle inequalities for the prediction and \(\ell_2\) estimation errors of this estimator. These bounds hold under a restricted eigenvalue condition on the design matrix. Under a stronger coherence condition, we derive bounds for the estimation error in mixed \((2,p)\)-norms with \(1\le p\le\infty\). When \(p=\infty\), this result implies that a thresholded version of the Group Lasso estimator selects the sparsity pattern of \(\beta^*\) with high probability. Next, we prove that the rate of convergence of our upper bounds is optimal in a minimax sense, up to a logarithmic factor, for all estimators over a class of group sparse vectors. Furthermore, we establish lower bounds for the prediction and \(\ell_2\) estimation errors of the usual Lasso estimator. Using this result, we demonstrate that the Group Lasso can achieve an improvement in the prediction and estimation properties as compared to the Lasso.
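The Group Lasso estimator discussed in the abstract penalizes the sum of the \(\ell_2\) norms of the coefficient groups, which zeroes out entire irrelevant groups at once. The following is a minimal illustrative sketch, not the paper's procedure: it solves the Group Lasso objective with a plain proximal-gradient (ISTA) loop, where the proximal step is block soft-thresholding; the function names, step sizes, and the synthetic data are all assumptions made for the demo.

```python
import numpy as np

def group_soft_threshold(v, t):
    """Block soft-thresholding: shrink the whole group v toward zero,
    setting it exactly to zero when its norm is below the threshold t."""
    norm = np.linalg.norm(v)
    if norm <= t:
        return np.zeros_like(v)
    return (1.0 - t / norm) * v

def group_lasso(X, y, groups, lam, n_iter=500):
    """Proximal-gradient (ISTA) solver for
        min_b  (1/2n) ||y - X b||_2^2 + lam * sum_g ||b_g||_2 ,
    where `groups` is a list of index arrays partitioning the coordinates."""
    n, p = X.shape
    b = np.zeros(p)
    # Lipschitz constant of the gradient of the least-squares term.
    L = np.linalg.norm(X, 2) ** 2 / n
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n
        z = b - grad / L          # gradient step
        for g in groups:
            b[g] = group_soft_threshold(z[g], lam / L)  # proximal step per group
    return b

# Demo on synthetic data: only the first of two groups is truly active.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 6))
beta_true = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 0.0])
y = X @ beta_true + 0.1 * rng.standard_normal(100)
groups = [np.arange(0, 3), np.arange(3, 6)]
b_hat = group_lasso(X, y, groups, lam=0.3)
```

With a penalty level of this order, the inactive group is set exactly to zero while the active group survives (with some shrinkage bias), which is the support-recovery behavior the abstract's thresholded variant sharpens into a high-probability guarantee.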
Recommendations
- Some theoretical results on the grouped variables Lasso
- On the asymptotic properties of the group lasso estimator for linear models
- Non-asymptotic oracle inequalities for the Lasso and group Lasso in high dimensional logistic model
- Sparsity oracle inequalities for the Lasso
- The benefit of group sparsity
Cites work
- scientific article; zbMATH DE number 3984433
- scientific article; zbMATH DE number 193897
- scientific article; zbMATH DE number 741240
- scientific article; zbMATH DE number 1522808
- scientific article; zbMATH DE number 1865743
- Aggregation for Gaussian regression
- An Efficient Method of Estimating Seemingly Unrelated Regressions and Tests for Aggregation Bias
- Bounds for linear multi-task learning
- Consistency of the group Lasso and multiple kernel learning
- Convex analysis and nonlinear optimization. Theory and examples.
- Econometric analysis of cross section and panel data.
- Exponential screening and optimal rates of sparse estimation
- High-dimensional additive modeling
- High-dimensional generalized linear models and the lasso
- Introduction to nonparametric estimation
- Model Selection and Estimation in Regression with Grouped Variables
- Moment inequalities for sums of dependent random variables under projective conditions
- Nemirovski's inequalities revisited
- On the asymptotic properties of the group lasso estimator for linear models
- Oracle inequalities for inverse problems
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008.
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Simultaneous analysis of Lasso and Dantzig selector
- Some theoretical results on the grouped variables Lasso
- Sparsity oracle inequalities for the Lasso
- Stable recovery of sparse overcomplete representations in the presence of noise
- Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators
- Support union recovery in high-dimensional multivariate regression
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- The Group Lasso for Logistic Regression
- The benefit of group sparsity
- The composite absolute penalties family for grouped and hierarchical variable selection
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Variable selection in nonparametric additive models
- Weak convergence and empirical processes. With applications to statistics
Cited in
(only showing first 100 items)
- Structured sparsity through convex optimization
- Joint variable and rank selection for parsimonious estimation of high-dimensional matrices
- Asymptotic properties of adaptive group Lasso for sparse reduced rank regression
- Scalable efficient reproducible multi-task learning via data splitting
- Decomposable norm minimization with proximal-gradient homotopy algorithm
- Group regularized estimation under structural hierarchy
- A selective review of group selection in high-dimensional models
- Sparse estimation by exponential weighting
- Quantile regression with group Lasso for classification
- The de-biased group Lasso estimation for varying coefficient models
- On the robustness of minimum norm interpolators and regularized empirical risk minimizers
- Covariate-adaptive randomization with variable selection in clinical trials
- Trend filtering for functional data
- Robust regression with compositional covariates
- Consistent group selection with Bayesian high dimensional modeling
- "Grouping strategies and thresholding for high dimensional linear models": discussion
- "Grouping strategies and thresholding for high dimensional linear models": rejoinder
- Joint rank and variable selection for parsimonious estimation in a high-dimensional finite mixture regression model
- High-dimensional regression with unknown variance
- High-Dimensional Gaussian Graphical Regression Models with Covariates
- Bayesian Multi-Task Variable Selection with an Application to Differential DAG Analysis
- Grouping strategies and thresholding for high dimensional linear models
- Robust regression via multivariate regression depth
- Error bounds for compressed sensing algorithms with group sparsity: A unified approach
- Solution of linear ill-posed problems using overcomplete dictionaries
- Group sparse recovery via group square-root elastic net and the iterative multivariate thresholding-based algorithm
- Localized Gaussian width of \(M\)-convex hulls with applications to Lasso and convex aggregation
- Least squares after model selection in high-dimensional sparse models
- Noise covariance estimation in multi-task high-dimensional linear models
- Low-rank matrix recovery under heavy-tailed errors
- An Interactive Greedy Approach to Group Sparsity in High Dimensions
- scientific article; zbMATH DE number 7306926
- ISLET: fast and optimal low-rank tensor regression via importance sketching
- Non-asymptotic oracle inequalities for the Lasso and group Lasso in high dimensional logistic model
- Some theoretical results on the grouped variables Lasso
- scientific article; zbMATH DE number 7370643
- Group sparse structural smoothing recovery: model, statistical properties and algorithm
- Structured variable selection via prior-induced hierarchical penalty functions
- Statistical and computational limits for sparse matrix detection
- Multi-Task Learning with High-Dimensional Noisy Images
- Oracle inequalities for weighted group Lasso in high-dimensional misspecified Cox models
- Tuning-free heterogeneous inference in massive networks
- Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach
- A no-free-lunch theorem for multitask learning
- Posterior contraction in group sparse logit models for categorical responses
- Discrete optimization methods for group model selection in compressed sensing
- Variable selection for generalized linear model with highly correlated covariates
- scientific article; zbMATH DE number 6982912
- Joint association and classification analysis of multi‐view data
- Greedy variance estimation for the LASSO
- Slope meets Lasso: improved oracle bounds and optimality
- Model selection in regression under structural constraints
- Randomized maximum-contrast selection: subagging for large-scale regression
- Adaptive estimation in multivariate response regression with hidden variables
- scientific article; zbMATH DE number 7306893
- Generalized linear models with structured sparsity estimators
- Multivariate sparse group Lasso for the multivariate multiple linear regression with an arbitrary group structure
- Theoretical properties of the overlapping groups Lasso
- Estimation and variable selection with exponential weights
- Gelfand numbers related to structured sparsity and Besov space embeddings with small mixed smoothness
- Sharp oracle inequalities for low-complexity priors
- Prediction and estimation consistency of sparse multi-class penalized optimal scoring
- Adaptive estimation of the rank of the coefficient matrix in high-dimensional multivariate response regression models
- High-dimensional generalized linear models incorporating graphical structure among predictors
- Bayesian MIDAS penalized regressions: estimation, selection, and prediction
- Oracle inequalities for weighted group Lasso in high-dimensional Poisson regression model
- Optimal estimation and rank detection for sparse spiked covariance matrices
- Grouped variable selection with discrete optimization: computational and statistical perspectives
- Sparse PCA: optimal rates and adaptive estimation
- Bayesian linear regression for multivariate responses under group sparsity
- Minimax estimation in multi-task regression under low-rank structures
- Variable selection in heterogeneous panel data models with cross‐sectional dependence
- Variable selection, monotone likelihood ratio and group sparsity
- Tight conditions for consistency of variable selection in the context of high dimensionality
- Simultaneous feature selection and clustering based on square root optimization
- Penalized least square in sparse setting with convex penalty and non Gaussian errors
- Adaptive and robust multi-task learning
- A general framework for Bayes structured linear models
- Trace regression model with simultaneously low rank and row (column) sparse parameter
- On the finite-sample analysis of \(\Theta\)-estimators
- Regularizers for structured sparsity
- Sparse regression and support recovery with \(\mathbb{L}_2\)-boosting algorithms
- Expectile trace regression via low-rank and group sparsity regularization
- Improved linear regression prediction by transfer learning
- Simultaneous off-the-grid learning of mixtures issued from a continuous dictionary
- Structured estimation for the nonparametric Cox model
- A Unified Approach to Sparse Tweedie Modeling of Multisource Insurance Claim Data
- Sparse high-dimensional varying coefficient model: nonasymptotic minimax study
- A varying-coefficient panel data model with fixed effects: theory and an application to US commercial banks
- Robust grouped variable selection using distributionally robust optimization
- The statistical rate for support matrix machines under low rankness and row (column) sparsity
- Adaptive bi-level variable selection for multivariate failure time model with a diverging number of covariates
- Variable selection and structure identification for varying coefficient Cox models
- The benefit of group sparsity in group inference with de-biased scaled group Lasso
- Lasso in Infinite dimension: application to variable selection in functional multivariate linear regression
- Multidimensional linear functional estimation in sparse Gaussian models and robust estimation of the mean
- A penalized two-pass regression to predict stock returns with time-varying risk premia
- scientific article; zbMATH DE number 7626731
- A group VISA algorithm for variable selection
This page was built for publication: Oracle inequalities and optimal inference under group sparsity