Optimal selection of reduced rank estimators of high-dimensional matrices
From MaRDI portal
Publication: 548562
DOI: 10.1214/11-AOS876
zbMATH Open: 1216.62086
arXiv: 1004.2995
MaRDI QID: Q548562
FDO: Q548562
Authors: Florentina Bunea, Yiyuan She, Marten H. Wegkamp
Publication date: 29 June 2011
Published in: The Annals of Statistics
Abstract: We introduce a new criterion, the Rank Selection Criterion (RSC), for selecting the optimal reduced rank estimator of the coefficient matrix in multivariate response regression models. The corresponding RSC estimator minimizes the Frobenius norm of the fit plus a regularization term proportional to the number of parameters in the reduced rank model. The rank of the RSC estimator provides a consistent estimator of the rank of the coefficient matrix; in general, the rank of our estimator is a consistent estimate of the effective rank, which we define to be the number of singular values of the target matrix that are appropriately large. The consistency results are valid not only in the classic asymptotic regime, when n, the number of responses, and p, the number of predictors, stay bounded, and m, the number of observations, grows, but also when either, or both, n and p grow, possibly much faster than m. We establish minimax optimal bounds on the mean squared errors of our estimators. Our finite sample performance bounds for the RSC estimator show that it achieves the optimal balance between the approximation error and the penalty term. Furthermore, our procedure has very low computational complexity, linear in the number of candidate models, making it particularly appealing for large scale problems. We contrast our estimator with the nuclear norm penalized least squares (NNP) estimator, which has an inherently higher computational complexity than RSC, for multivariate regression models. We show that NNP has estimation properties similar to those of RSC, albeit under stronger conditions. However, it is not as parsimonious as RSC. We offer a simple correction of the NNP estimator which leads to consistent rank estimation.
Full work available at URL: https://arxiv.org/abs/1004.2995
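The abstract's criterion (a Frobenius-norm fit plus a penalty that grows with the rank) has a closed-form solution: the rank-k reduced-rank fit truncates the SVD of the response projected onto the column space of the design, so selecting the rank reduces to hard-thresholding singular values. Below is a minimal NumPy sketch under simplifying assumptions: the function name rsc_estimator is illustrative, and the penalty is taken as mu times the rank, a simplified stand-in for the paper's penalty proportional to the parameter count.

```python
import numpy as np

def rsc_estimator(X, Y, mu):
    """Sketch of a rank-selection estimator: minimize
    ||Y - X B||_F^2 + mu * rank(B) over reduced-rank least-squares
    fits. Since the rank-k fit truncates the SVD of the projected
    response P Y, the minimizing rank keeps exactly the singular
    values d_j of P Y with d_j**2 > mu."""
    # Least-squares coefficients; X @ B_ols is the projection P Y
    B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
    PY = X @ B_ols
    U, d, Vt = np.linalg.svd(PY, full_matrices=False)
    k = int(np.sum(d**2 > mu))  # selected rank
    # Rank-k estimate: project B_ols onto the top-k right singular
    # directions of P Y (standard reduced-rank regression form)
    V_k = Vt[:k].T
    B_hat = B_ols @ V_k @ V_k.T
    return B_hat, k
```

The thresholding step is what makes the procedure's complexity linear in the number of candidate ranks: one SVD serves every candidate model at once.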
Recommendations
- Adaptive estimation of the rank of the coefficient matrix in high-dimensional multivariate response regression models
- Joint variable and rank selection for parsimonious estimation of high-dimensional matrices
- Reduced rank regression via adaptive nuclear norm penalization
- On estimation in the reduced-rank regression with a large number of responses and predictors
- Reduced rank regression with matrix projections for high-dimensional multivariate linear regression model
Keywords: dimension reduction; nuclear norm; adaptive estimation; oracle inequalities; low rank matrix approximation; rank selection; multivariate response regression
Cites Work
- SDPT3 — A Matlab software package for semidefinite programming, Version 1.3
- Weak convergence and empirical processes. With applications to statistics
- Matrix Analysis
- Modern Multivariate Statistical Techniques
- Reduced-rank regression for the multivariate linear model
- Multivariate reduced-rank regression
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Estimating Linear Restrictions on Regression Coefficients for Multivariate Normal Distributions
- Convex optimization methods for dimension reduction and coefficient estimation in multivariate linear regression
- Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- Estimation of high-dimensional low-rank matrices
- Specification and misspecification in reduced rank regression
- Fixed point and Bregman iterative methods for matrix rank minimization
- Constrained principal component analysis: A comprehensive theory
- Oracle inequalities for inverse problems
- Non-asymptotic theory of random matrices: extreme singular values
- Asymptotic distribution of the reduced rank regression estimator under general conditions
- Generalized canonical analysis for time series
- Regularized linear and kernel redundancy analysis
- Identification, Estimation and Large-Sample Theory for Regressions Containing Unobservable Variables
Cited In (83)
- D4R: doubly robust reduced rank regression in high dimension
- FACTORISABLE MULTITASK QUANTILE REGRESSION
- High-dimensional dynamic systems identification with additional constraints
- Fast Estimation of Approximate Matrix Ranks Using Spectral Densities
- Negative binomial factor regression with application to microbiome data analysis
- On the convergence of rank-one multi-target linear regression
- Reduced rank regression with matrix projections for high-dimensional multivariate linear regression model
- Sparse and Low-Rank Matrix Quantile Estimation With Application to Quadratic Regression
- Controlling the false discovery rate for latent factors via unit-rank deflation
- Minimax estimation in multi-task regression under low-rank structures
- Sparse vector error correction models with application to cointegration‐based trading
- Penalisation methods in fitting high-dimensional cointegrated vector autoregressive models: a review
- Stability Approach to Regularization Selection for Reduced-Rank Regression
- Sequential Scaled Sparse Factor Regression
- A reduced-rank approach to predicting multiple binary responses through machine learning
- Integrative sparse reduced-rank regression via orthogonal rotation for analysis of high-dimensional multi-source data
- Low-Rank Regression Models for Multiple Binary Responses and their Applications to Cancer Cell-Line Encyclopedia Data
- Sparse reduced-rank regression for simultaneous rank and variable selection via manifold optimization
- A fully Bayesian approach to sparse reduced-rank multivariate regression
- Inferring Influence Networks from Longitudinal Bipartite Relational Data
- On Cross-Validation for Sparse Reduced Rank Regression
- Tensor factorization via transformed tensor-tensor product for image alignment
- Multivariate Functional Regression Via Nested Reduced-Rank Regularization
- Sparse Reduced Rank Huber Regression in High Dimensions
- Adjusting for high-dimensional covariates in sparse precision matrix estimation by \(\ell_1\)-penalization
- Capturing between-tasks covariance and similarities using multivariate linear mixed models
- Robust regression via multivariate regression depth
- Low rank multivariate regression
- Inter-class sparsity based discriminative least square regression
- Generalized co-sparse factor regression
- Parallel integrative learning for large-scale multi-response regression with incomplete outcomes
- On principal components regression, random projections, and column subsampling
- Reconstruction of a low-rank matrix in the presence of Gaussian noise
- Structured matrix estimation and completion
- Principal single-index varying-coefficient models for dimension reduction in quantile regression
- Optimal large-scale quantum state tomography with Pauli measurements
- Parametric and semiparametric reduced-rank regression with flexible sparsity
- Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach
- Adaptive estimation in structured factor models with applications to overlapping clustering
- Exponential weights in multivariate regression and a low-rankness favoring prior
- Adaptive estimation in multivariate response regression with hidden variables
- Sparse inference of the drift of a high-dimensional Ornstein-Uhlenbeck process
- High-dimensional consistency of rank estimation criteria in multivariate linear model
- Noisy low-rank matrix completion with general sampling distribution
- Robust reduced-rank modeling via rank regression
- Adaptive estimation of the rank of the coefficient matrix in high-dimensional multivariate response regression models
- Inference in latent factor regression with clusterable features
- Distributed estimation in heterogeneous reduced rank regression: with application to order determination in sufficient dimension reduction
- Robust reduced rank regression in a distributed setting
- High-dimensional VAR with low-rank transition
- Reduced-rank estimation for ill-conditioned stochastic linear model with high signal-to-noise ratio
- Estimation of Low Rank High-Dimensional Multivariate Linear Models for Multi-Response Data
- Trace regression model with simultaneously low rank and row(column) sparse parameter
- Reduced rank regression with possibly non-smooth criterion functions: an empirical likelihood approach
- A principal varying-coefficient model for quantile regression: joint variable selection and dimension reduction
- On the sample covariance matrix estimator of reduced effective rank population matrices, with applications to fPCA
- Rank penalized estimators for high-dimensional matrices
- A generalized information criterion for high-dimensional PCA rank selection
- Nonconvex penalized reduced rank regression and its oracle properties in high dimensions
- Multivariate sparse group lasso for the multivariate multiple linear regression with an arbitrary group structure
- Leveraging mixed and incomplete outcomes via reduced-rank modeling
- On the exponentially weighted aggregate with the Laplace prior
- Asymptotic equivalence of quantum state tomography and noisy matrix completion
- Bayesian sparse reduced rank multivariate regression
- Computing the degrees of freedom of rank-regularized estimators and cousins
- On estimation in the reduced-rank regression with a large number of responses and predictors
- Spectral thresholding quantum tomography for low rank states
- Spectral thresholding for the estimation of Markov chain transition operators
- Adaptive estimation of the copula correlation matrix for semiparametric elliptical copulas
- High-dimensional covariance matrix estimation with missing observations
- Sparse Single Index Models for Multivariate Responses
- Estimation of high-dimensional low-rank matrices
- Scalable interpretable learning for multi-response error-in-variables regression
- Convex optimization methods for dimension reduction and coefficient estimation in multivariate linear regression
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Pairwise directions estimation for multivariate response regression data
- Joint variable and rank selection for parsimonious estimation of high-dimensional matrices
- Sparse reduced-rank regression with covariance estimation
- Signal extraction approach for sparse multivariate response regression
- A note on rank reduction in sparse multivariate regression
- High-dimensional regression with unknown variance
- Concentration inequalities for matrix martingales in continuous time