Stability Approach to Regularization Selection for Reduced-Rank Regression
From MaRDI portal
Publication: 6180727
DOI: 10.1080/10618600.2022.2119986 · arXiv: 2207.00924 · MaRDI QID: Q6180727
Authors:
Publication date: 22 January 2024
Published in: Journal of Computational and Graphical Statistics
Abstract: The reduced-rank regression model is a popular model for multivariate responses with multiple predictors, and it is widely used in biology, chemometrics, econometrics, engineering, and other fields. In reduced-rank regression modelling, a central objective is to estimate the rank of the coefficient matrix, which represents the number of effective latent factors in predicting the multivariate response. Although theoretical results such as rank estimation consistency have been established for various methods, in practice rank determination still relies on information-criterion-based methods such as AIC and BIC or subsampling-based methods such as cross-validation. Unfortunately, the theoretical properties of these practical methods are largely unknown. In this paper, we present a novel method called StARS-RRR that selects the tuning parameter and then estimates the rank of the coefficient matrix for reduced-rank regression based on the stability approach. We prove that StARS-RRR achieves rank estimation consistency, i.e., the rank estimated with the tuning parameter selected by StARS-RRR is consistent for the true rank. Through a simulation study, we show that StARS-RRR outperforms other tuning-parameter selection methods, including AIC, BIC, and cross-validation, in that it provides the most accurate rank estimates. In addition, when applied to a breast cancer dataset, StARS-RRR discovers a reasonable number of genetic pathways that affect the DNA copy number variations and yields a smaller prediction error than the other methods under a random-splitting evaluation.
Full work available at URL: https://arxiv.org/abs/2207.00924
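The stability idea sketched in the abstract can be illustrated with a minimal, hypothetical example. This is not the authors' StARS-RRR implementation: here the rank is estimated by hard-thresholding the singular values of a ridge-stabilized least-squares fit, and, in the StARS spirit, the smallest penalty whose rank estimate remains stable across random subsamples is selected. All function names, the instability measure (variance of subsample rank estimates), and the default parameters are illustrative assumptions.

```python
import numpy as np

def fit_rank(X, Y, lam, ridge=1e-6):
    """Estimate the coefficient-matrix rank by hard-thresholding the
    singular values of a ridge-stabilized least-squares estimate.
    (Illustrative surrogate for a penalized reduced-rank estimator.)"""
    p = X.shape[1]
    B = np.linalg.solve(X.T @ X + ridge * np.eye(p), X.T @ Y)
    s = np.linalg.svd(B, compute_uv=False)
    return int(np.sum(s > lam))

def stars_rank(X, Y, lams, n_sub=50, frac=0.7, beta=0.05, seed=0):
    """StARS-style selection (sketch): sweep penalties from heavy to
    light; keep the lightest penalty whose rank estimate stays stable
    (low variance) across random subsamples of the data."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    m = int(frac * n)
    best_lam = max(lams)
    for lam in sorted(lams, reverse=True):
        ranks = []
        for _ in range(n_sub):
            idx = rng.choice(n, size=m, replace=False)
            ranks.append(fit_rank(X[idx], Y[idx], lam))
        if np.var(ranks) > beta:   # instability threshold crossed: stop
            break
        best_lam = lam             # still stable; allow a lighter penalty
    return best_lam, fit_rank(X, Y, best_lam)
```

Under this sketch, the selected penalty is the least-regularized one for which the subsampled rank estimates still agree, mirroring the "smallest amount of regularization that is still stable" principle of the stability approach.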
Cites Work
- A penalized matrix decomposition, with applications to sparse principal components and canonical correlation analysis
- Reduced rank regression via adaptive nuclear norm penalization
- Estimating the dimension of a model
- Stability Selection
- A new look at the statistical model identification
- Subsampling
- Generalized Cross-Validation as a Method for Choosing a Good Ridge Parameter
- Modern Multivariate Statistical Techniques
- Reduced rank ridge regression and its kernel extensions
- Stability
- Sparse reduced-rank regression for simultaneous dimension reduction and variable selection
- Dimension Reduction and Coefficient Estimation in Multivariate Linear Regression
- On the degrees of freedom of reduced-rank estimators in multivariate regression
- Optimal selection of reduced rank estimators of high-dimensional matrices
- Reduced Rank Stochastic Regression with a Sparse Singular Value Decomposition
- Estimating Linear Restrictions on Regression Coefficients for Multivariate Normal Distributions
- Consistent selection of tuning parameters via variable selection stability
- Variable Selection with Error Control: Another Look at Stability Selection
- Tuning Parameter Selection in High Dimensional Penalized Likelihood
- Specification and misspecification in reduced rank regression
- Degrees of freedom in low rank matrix estimation
- False Discovery and its Control in Low Rank Estimation
- Reduced rank regression in cointegrated models
- Selective factor extraction in high dimensions
- Bayesian assessment of dimensionality in reduced rank regression
- Model diagnostics in reduced-rank estimation
- A random-perturbation-based rank estimator of the number of factors
- Rank determination for low-rank data completion
- On Cross-Validation for Sparse Reduced Rank Regression