Matrix Factor Analysis: From Least Squares to Iterative Projection
From MaRDI portal
Publication:6150367
Abstract: In this article, we study large-dimensional matrix factor models and estimate the factor loading matrices and the factor score matrix by minimizing a squared loss function. Interestingly, the resulting estimators coincide with the Projected Estimators (PE) in Yu et al. (2022), which were proposed from the perspective of simultaneously reducing the dimensionality and the magnitude of the idiosyncratic error matrix. In other words, we provide a least-squares interpretation of the PE for the matrix factor model, which parallels the least-squares interpretation of PCA for the vector factor model. We derive the convergence rates of the theoretical minimizers under sub-Gaussian tails. To gain robustness against heavy-tailed idiosyncratic errors, we extend the least-squares approach to minimizing a Huber loss function, which leads to a weighted iterative projection algorithm for computing the estimators. We also derive the convergence rates of the theoretical minimizers of the Huber loss under a bounded moment condition on the idiosyncratic errors. Extensive numerical studies investigate the empirical performance of the proposed Huber estimators relative to state-of-the-art alternatives. The Huber estimators are robust and perform much better than existing ones when the data are heavy-tailed, and can therefore serve as a safe replacement in practice. An application to a Fama-French financial portfolio dataset demonstrates the empirical advantage of the Huber estimator.
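The weighted iterative projection idea described in the abstract can be sketched as follows for a matrix factor model X_t = R F_t C' + E_t. This is a minimal illustrative implementation, not the paper's exact algorithm: the function name, the normalizations, and the specific Huber-style reweighting (down-weighting observations whose Frobenius residual exceeds a threshold) are assumptions for exposition. Setting `huber_c=None` recovers a plain least-squares/projection-style iteration.

```python
import numpy as np

def iterative_projection(X, k, r, n_iter=50, huber_c=None):
    """Alternating (weighted) projection sketch for a matrix factor model
    X_t = R F_t C' + E_t, with X of shape (T, p1, p2), k row factors and
    r column factors.  If huber_c is given, observations with large
    residuals are down-weighted Huber-style (illustrative assumption)."""
    T, p1, p2 = X.shape
    # initialize C from the pooled column covariance
    M_c = np.einsum('tik,tjk->kj', np.zeros((T, 1, 1)), np.zeros((T, 1, 1))) if False else \
          sum(x.T @ x for x in X) / (T * p1)
    _, V = np.linalg.eigh(M_c)
    C = V[:, -r:] * np.sqrt(p2)          # identification: C'C / p2 = I
    w = np.ones(T)                       # observation weights (all 1 = least squares)
    for _ in range(n_iter):
        # update R: eigenvectors of the weighted covariance of projected data X_t C
        Y = np.einsum('tij,jr->tir', X, C) / p2
        M_r = np.einsum('t,tik,tjk->ij', w, Y, Y) / T
        _, U = np.linalg.eigh(M_r)
        R = U[:, -k:] * np.sqrt(p1)
        # update C symmetrically from X_t' R
        Z = np.einsum('tij,ik->tjk', X, R) / p1
        M_c = np.einsum('t,tik,tjk->ij', w, Z, Z) / T
        _, V = np.linalg.eigh(M_c)
        C = V[:, -r:] * np.sqrt(p2)
        # factor scores F_t = R' X_t C / (p1 p2), then optional Huber reweighting
        F = np.einsum('ik,tij,jr->tkr', R, X, C) / (p1 * p2)
        if huber_c is not None:
            resid = np.linalg.norm(X - np.einsum('ik,tkr,jr->tij', R, F, C),
                                   axis=(1, 2))
            w = np.where(resid <= huber_c, 1.0, huber_c / resid)
    return R, C, F
```

With clean, light-tailed errors the unweighted and Huber-weighted iterations behave similarly; the weights only bite when some X_t carry heavy-tailed shocks, which is the regime where the paper reports the Huber estimators outperforming least squares.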
Cites work
- A randomized sequential procedure to determine the number of factors
- Community detection on mixture multilayer networks via regularized tensor decomposition
- Constrained Factor Models for High-Dimensional Matrix-Variate Time Series
- Determining the Number of Factors in Approximate Factor Models
- Eigenvalue ratio test for the number of factors
- Factor Models for High-Dimensional Tensor Time Series
- Factor models for matrix-valued high-dimensional time series
- Forecasting Using Principal Components From a Large Number of Predictors
- High-dimensional statistics. A non-asymptotic viewpoint
- High-frequency factor models and regressions
- Identification and estimation of threshold matrix-variate factor models
- Inferential Theory for Factor Models of Large Dimensions
- Large covariance estimation by thresholding principal orthogonal complements. With discussion and authors' reply
- Large dimensional latent factor modeling with missing observations and applications to causal inference
- Principal components estimation and identification of static factors
- Projected estimation for large-dimensional matrix factor models
- Quantile factor models
- Rank determination in tensor factor model
- Robust Estimation of a Location Parameter
- Robust factor number specification for large-dimensional elliptical factor model
- Self-normalized large deviations
- Separable factor analysis with applications to mortality data
- Testing hypotheses about the number of factors in large factor models
- Towards a universal self-normalized moderate deviation
- Using principal component analysis to estimate a high dimensional factor model with high-frequency data
Cited in (1)