Joint variable and rank selection for parsimonious estimation of high-dimensional matrices
From MaRDI portal
Publication: 741790
DOI: 10.1214/12-AOS1039 · zbMath: 1373.62246 · arXiv: 1110.3556 · MaRDI QID: Q741790
Yiyuan She, Florentina Bunea, Marten H. Wegkamp
Publication date: 15 September 2014
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1110.3556
Keywords: dimension reduction; adaptive estimation; group Lasso; oracle inequalities; multivariate response regression; rank constrained minimization; reduced rank estimators; row and rank sparse models
Related Items (38)
High-Dimensional Vector Autoregressive Time Series Modeling via Tensor Decomposition ⋮ Optimal large-scale quantum state tomography with Pauli measurements ⋮ Generalized co-sparse factor regression ⋮ Adaptive estimation in multivariate response regression with hidden variables ⋮ High-dimensional consistency of rank estimation criteria in multivariate linear model ⋮ Sparse reduced-rank regression with covariance estimation ⋮ Improved Estimation of High-dimensional Additive Models Using Subspace Learning ⋮ Sparse Single Index Models for Multivariate Responses ⋮ Unnamed Item ⋮ Robust reduced-rank modeling via rank regression ⋮ Reduced rank regression with possibly non-smooth criterion functions: an empirical likelihood approach ⋮ Joint rank and variable selection for parsimonious estimation in a high-dimensional finite mixture regression model ⋮ Bayesian sparse reduced rank multivariate regression ⋮ High-dimensional multivariate posterior consistency under global-local shrinkage priors ⋮ A principal varying-coefficient model for quantile regression: joint variable selection and dimension reduction ⋮ Exponential weights in multivariate regression and a low-rankness favoring prior ⋮ Model-based regression clustering for high-dimensional data: application to functional data ⋮ Sparse and Low-Rank Matrix Quantile Estimation With Application to Quadratic Regression ⋮ Integrative sparse reduced-rank regression via orthogonal rotation for analysis of high-dimensional multi-source data ⋮ Pairwise directions estimation for multivariate response regression data ⋮ Sparse Reduced Rank Huber Regression in High Dimensions ⋮ Scalable interpretable learning for multi-response error-in-variables regression ⋮ Low-Rank Regression Models for Multiple Binary Responses and their Applications to Cancer Cell-Line Encyclopedia Data ⋮ Nonconvex penalized reduced rank regression and its oracle properties in high dimensions ⋮ Analysis of Double Single Index Models ⋮ Estimating a sparse reduction 
for general regression in high dimensions ⋮ Nonconvex tensor rank minimization and its applications to tensor recovery ⋮ Parallel integrative learning for large-scale multi-response regression with incomplete outcomes ⋮ Sparse inference of the drift of a high-dimensional Ornstein-Uhlenbeck process ⋮ Parametric and semiparametric reduced-rank regression with flexible sparsity ⋮ Leveraging mixed and incomplete outcomes via reduced-rank modeling ⋮ Adaptive estimation of the rank of the coefficient matrix in high-dimensional multivariate response regression models ⋮ Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach ⋮ Dimensionality Reduction and Variable Selection in Multivariate Varying-Coefficient Models With a Large Number of Covariates ⋮ Signal extraction approach for sparse multivariate response regression ⋮ A note on rank reduction in sparse multivariate regression ⋮ On Cross-Validation for Sparse Reduced Rank Regression ⋮ Unnamed Item
Cites Work
- Statistics for high-dimensional data. Methods, theory and applications.
- Estimation of high-dimensional low-rank matrices
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Optimal selection of reduced rank estimators of high-dimensional matrices
- Consistent group selection in high-dimensional linear regression
- Oracle inequalities and optimal inference under group sparsity
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Linear and nonlinear programming.
- On the convergence properties of the EM algorithm
- Minimizing a differentiable function over a differential manifold
- Multivariate reduced-rank regression
- Moderate projection pursuit regression for multivariate response data
- Least angle regression. (With discussion)
- An iterative algorithm for fitting nonconvex penalized generalized linear models with grouped predictors
- Honest variable selection in linear and logistic regression models via \(\ell _{1}\) and \(\ell _{1}+\ell _{2}\) penalization
- Low rank multivariate regression
- Simultaneous analysis of Lasso and Dantzig selector
- Modern Multivariate Statistical Techniques
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- Model Selection and Estimation in Regression with Grouped Variables
- Estimating Linear Restrictions on Regression Coefficients for Multivariate Normal Distributions
- Convergence of a block coordinate descent method for nondifferentiable minimization