On Cross-Validation for Sparse Reduced Rank Regression
Publication: 3120104
DOI: 10.1111/rssb.12295
zbMath: 1407.62195
arXiv: 1812.11555
OpenAlex: W2898542396
Wikidata: Q129032768 (Scholia: Q129032768)
MaRDI QID: Q3120104
Authors: Yiyuan She, Hoang Tran
Publication date: 1 March 2019
Published in: Journal of the Royal Statistical Society Series B: Statistical Methodology
Full work available at URL: https://arxiv.org/abs/1812.11555
Related Items (5)
- Gaining Outlier Resistance With Progressive Quantiles: Fast Algorithms and Theoretical Studies
- Sparse Reduced Rank Huber Regression in High Dimensions
- Low-Rank Regression Models for Multiple Binary Responses and their Applications to Cancer Cell-Line Encyclopedia Data
- Stability Approach to Regularization Selection for Reduced-Rank Regression
- Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach
Uses Software
Cites Work
- A penalized matrix decomposition, with applications to sparse principal components and canonical correlation analysis
- Sparse principal component analysis and iterative thresholding
- Estimation of high-dimensional low-rank matrices
- Optimal selection of reduced rank estimators of high-dimensional matrices
- The solution path of the generalized lasso
- Joint variable and rank selection for parsimonious estimation of high-dimensional matrices
- The restricted isometry property and its implications for compressed sensing
- Sparse principal component analysis via regularized low rank matrix approximation
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- A survey of cross-validation procedures for model selection
- Estimating the dimension of a model
- Asymptotic distribution of the reduced rank regression estimator under general conditions
- The risk inflation criterion for multiple regression
- An iterative algorithm for fitting nonconvex penalized generalized linear models with grouped predictors
- Sparse regression with exact clustering
- Simultaneous analysis of Lasso and Dantzig selector
- Extended Bayesian information criteria for model selection with large model spaces
- Can the strengths of AIC and BIC be shared? A conflict between model identification and regression estimation
- The Predictive Sample Reuse Method with Applications
- Ideal spatial adaptation by wavelet shrinkage
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Forecasting Using Principal Components From a Large Number of Predictors
- Group Iterative Spectrum Thresholding for Super-Resolution Sparse Spectral Selection
- Sparse Partial Least Squares Regression for Simultaneous Dimension Reduction and Variable Selection
- Sparse Reduced-Rank Regression for Simultaneous Dimension Reduction and Variable Selection
- On Consistency and Sparsity for Principal Components Analysis in High Dimensions
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- Linear Model Selection by Cross-Validation
- Selective factor extraction in high dimensions
- Some Comments on \(C_p\)
- Reduced Rank Stochastic Regression with a Sparse Singular Value Decomposition
- On the finite-sample analysis of \(\Theta\)-estimators
- A new look at the statistical model identification