Optimal selection of reduced rank estimators of high-dimensional matrices
From MaRDI portal
Publication:548562
Abstract: We introduce a new criterion, the Rank Selection Criterion (RSC), for selecting the optimal reduced rank estimator of the coefficient matrix in multivariate response regression models. The corresponding RSC estimator minimizes the Frobenius norm of the fit plus a regularization term proportional to the number of parameters in the reduced rank model. The rank of the RSC estimator provides a consistent estimator of the rank of the coefficient matrix; in general, the rank of our estimator is a consistent estimate of the effective rank, which we define to be the number of singular values of the target matrix that are appropriately large. The consistency results are valid not only in the classic asymptotic regime, when n, the number of responses, and p, the number of predictors, stay bounded and m, the number of observations, grows, but also when either or both n and p grow, possibly much faster than m. We establish minimax optimal bounds on the mean squared errors of our estimators. Our finite sample performance bounds for the RSC estimator show that it achieves the optimal balance between the approximation error and the penalty term. Furthermore, our procedure has very low computational complexity, linear in the number of candidate models, making it particularly appealing for large scale problems. We contrast our estimator with the nuclear norm penalized least squares (NNP) estimator, which has an inherently higher computational complexity than RSC, for multivariate regression models. We show that NNP has estimation properties similar to those of RSC, albeit under stronger conditions. However, it is not as parsimonious as RSC. We offer a simple correction of the NNP estimator that leads to consistent rank estimation.
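The rank-penalized criterion described in the abstract admits a closed-form solution: for each candidate rank, the best fit is a truncated SVD of the projected response, so minimizing the penalized criterion reduces to hard-thresholding singular values. A minimal numpy sketch of this idea follows; the function name `rsc_estimator` and the scalar tuning constant `mu` are illustrative (the paper calibrates the penalty level to the noise level and the problem dimensions), not the authors' code.

```python
import numpy as np

def rsc_estimator(X, Y, mu):
    """Sketch of a rank-selection estimator in the spirit of RSC.

    Minimizes ||Y - X B||_F^2 + mu * rank(B), which is solved by
    hard-thresholding the singular values of the fitted values P Y,
    where P is the orthogonal projector onto the column space of X.
    """
    # Unrestricted least-squares fit: X B_ols equals P Y.
    B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
    fit = X @ B_ols
    U, d, Vt = np.linalg.svd(fit, full_matrices=False)
    # Retain singular values whose squares exceed the penalty level mu:
    # adding one unit of rank costs mu but reduces the residual by d_j^2.
    k = int(np.sum(d**2 > mu))
    # Rank-k truncation of the fitted values, mapped back to coefficients.
    fit_k = U[:, :k] @ np.diag(d[:k]) @ Vt[:k, :]
    B_hat, *_ = np.linalg.lstsq(X, fit_k, rcond=None)
    return B_hat, k
```

The cost of evaluating all candidate ranks is a single SVD, which is the "linear in the number of candidate models" complexity the abstract highlights.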
Recommendations
- Adaptive estimation of the rank of the coefficient matrix in high-dimensional multivariate response regression models
- Joint variable and rank selection for parsimonious estimation of high-dimensional matrices
- Reduced rank regression via adaptive nuclear norm penalization
- On estimation in the reduced-rank regression with a large number of responses and predictors
- Reduced rank regression with matrix projections for high-dimensional multivariate linear regression model
Cites work
- scientific article; zbMATH DE number 3690480
- scientific article; zbMATH DE number 3215519
- Asymptotic distribution of the reduced rank regression estimator under general conditions
- Constrained principal component analysis: A comprehensive theory
- Convex optimization methods for dimension reduction and coefficient estimation in multivariate linear regression
- Estimating Linear Restrictions on Regression Coefficients for Multivariate Normal Distributions
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Estimation of high-dimensional low-rank matrices
- Fixed point and Bregman iterative methods for matrix rank minimization
- Generalized canonical analysis for time series
- Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization
- Identification, Estimation and Large-Sample Theory for Regressions Containing Unobservable Variables
- Matrix Analysis
- Modern Multivariate Statistical Techniques
- Multivariate reduced-rank regression
- Non-asymptotic theory of random matrices: extreme singular values
- Oracle inequalities for inverse problems
- Reduced-rank regression for the multivariate linear model
- Regularized linear and kernel redundancy analysis
- SDPT3 — A Matlab software package for semidefinite programming, Version 1.3
- Specification and misspecification in reduced rank regression
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- Weak convergence and empirical processes. With applications to statistics
Cited in (86 documents)
- Minimax estimation in multi-task regression under low-rank structures
- On the convergence of rank-one multi-target linear regression
- Sequential Scaled Sparse Factor Regression
- Reduced rank regression with matrix projections for high-dimensional multivariate linear regression model
- Scalable interpretable multi-response regression via SEED
- Stability Approach to Regularization Selection for Reduced-Rank Regression
- Inferring Influence Networks from Longitudinal Bipartite Relational Data
- Sparse and Low-Rank Matrix Quantile Estimation With Application to Quadratic Regression
- D4R: doubly robust reduced rank regression in high dimension
- Penalisation methods in fitting high-dimensional cointegrated vector autoregressive models: a review
- Capturing between-tasks covariance and similarities using multivariate linear mixed models
- Sparse vector error correction models with application to cointegration‐based trading
- Scalable interpretable learning for multi-response error-in-variables regression
- Controlling the false discovery rate for latent factors via unit-rank deflation
- A reduced-rank approach to predicting multiple binary responses through machine learning
- Integrative sparse reduced-rank regression via orthogonal rotation for analysis of high-dimensional multi-source data
- Negative binomial factor regression with application to microbiome data analysis
- Sparse reduced-rank regression for simultaneous rank and variable selection via manifold optimization
- Multivariate Functional Regression Via Nested Reduced-Rank Regularization
- On Cross-Validation for Sparse Reduced Rank Regression
- Factorisable multitask quantile regression
- Sparse Reduced Rank Huber Regression in High Dimensions
- Tensor factorization via transformed tensor-tensor product for image alignment
- A fully Bayesian approach to sparse reduced-rank multivariate regression
- High-dimensional dynamic systems identification with additional constraints
- Low-Rank Regression Models for Multiple Binary Responses and their Applications to Cancer Cell-Line Encyclopedia Data
- Fast Estimation of Approximate Matrix Ranks Using Spectral Densities
- Reconstruction of a low-rank matrix in the presence of Gaussian noise
- On the exponentially weighted aggregate with the Laplace prior
- Sparse reduced-rank regression with covariance estimation
- Inter-class sparsity based discriminative least square regression
- On estimation in the reduced-rank regression with a large number of responses and predictors
- Leveraging mixed and incomplete outcomes via reduced-rank modeling
- Reduced rank regression via adaptive nuclear norm penalization
- Adaptive estimation of the rank of the coefficient matrix in high-dimensional multivariate response regression models
- Bayesian sparse reduced rank multivariate regression
- Trace regression model with simultaneously low rank and row(column) sparse parameter
- Reduced rank regression with possibly non-smooth criterion functions: an empirical likelihood approach
- Adjusting for high-dimensional covariates in sparse precision matrix estimation by \(\ell_1\)-penalization
- A principal varying-coefficient model for quantile regression: joint variable selection and dimension reduction
- High-dimensional consistency of rank estimation criteria in multivariate linear model
- Principal single-index varying-coefficient models for dimension reduction in quantile regression
- Generalized co-sparse factor regression
- Parallel integrative learning for large-scale multi-response regression with incomplete outcomes
- Structured matrix estimation and completion
- Estimation of Low Rank High-Dimensional Multivariate Linear Models for Multi-Response Data
- Optimal large-scale quantum state tomography with Pauli measurements
- High-dimensional covariance matrix estimation with missing observations
- Robust regression via multivariate regression depth
- Signal extraction approach for sparse multivariate response regression
- Noisy low-rank matrix completion with general sampling distribution
- Adaptive estimation of the copula correlation matrix for semiparametric elliptical copulas
- Estimation of high-dimensional low-rank matrices
- Nonconvex penalized reduced rank regression and its oracle properties in high dimensions
- Computing the degrees of freedom of rank-regularized estimators and cousins
- Sparse Single Index Models for Multivariate Responses
- A note on rank reduction in sparse multivariate regression
- Exponential weights in multivariate regression and a low-rankness favoring prior
- Rank penalized estimators for high-dimensional matrices
- Convex optimization methods for dimension reduction and coefficient estimation in multivariate linear regression
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Inference in latent factor regression with clusterable features
- High-dimensional regression with unknown variance
- Concentration inequalities for matrix martingales in continuous time
- Parametric and semiparametric reduced-rank regression with flexible sparsity
- Distributed estimation in heterogeneous reduced rank regression: with application to order determination in sufficient dimension reduction
- Robust reduced-rank modeling via rank regression
- On principal components regression, random projections, and column subsampling
- On the sample covariance matrix estimator of reduced effective rank population matrices, with applications to fPCA
- Spectral thresholding quantum tomography for low rank states
- Asymptotic equivalence of quantum state tomography and noisy matrix completion
- Robust reduced rank regression in a distributed setting
- Reduced-rank estimation for ill-conditioned stochastic linear model with high signal-to-noise ratio
- Spectral thresholding for the estimation of Markov chain transition operators
- Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach
- Adaptive estimation in two-way sparse reduced-rank regression
- Low rank multivariate regression
- Pairwise directions estimation for multivariate response regression data
- Adaptive estimation in multivariate response regression with hidden variables
- Multivariate sparse group Lasso for the multivariate multiple linear regression with an arbitrary group structure
- On the degrees of freedom of reduced-rank estimators in multivariate regression
- Sparse inference of the drift of a high-dimensional Ornstein-Uhlenbeck process
- High-dimensional VAR with low-rank transition
- Adaptive estimation in structured factor models with applications to overlapping clustering
- A generalized information criterion for high-dimensional PCA rank selection
- Joint variable and rank selection for parsimonious estimation of high-dimensional matrices