High-dimensional low-rank tensor autoregressive time series modeling
From MaRDI portal
Publication:6152591
Abstract: Modern technological advances have enabled an unprecedented amount of structured data with complex temporal dependence, underscoring the need for new methods to efficiently model and forecast high-dimensional tensor-valued time series. This paper provides the first practical tool to accomplish this task via autoregression (AR). By considering a low-rank Tucker decomposition for the transition tensor, the proposed tensor autoregression can flexibly capture the underlying low-dimensional tensor dynamics, providing both substantial dimension reduction and a meaningful dynamic factor interpretation. For this model, we introduce both a low-dimensional rank-constrained estimator and high-dimensional regularized estimators, and derive their asymptotic and non-asymptotic properties. In particular, by leveraging the special balanced structure of the AR transition tensor, a novel convex regularization approach, based on the sum of nuclear norms of square matricizations, is proposed to efficiently encourage low-rankness of the coefficient tensor. A truncation method is further introduced to consistently select the Tucker ranks. Simulation experiments and real data analysis demonstrate the advantages of the proposed approach over various competing methods.
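To make the model described in the abstract concrete, the following is a minimal sketch (not the authors' implementation) of a matrix-valued AR(1) process whose transition tensor has a low-rank Tucker structure. All dimensions, ranks, and the noise scale below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
p1, p2 = 4, 3          # dimensions of the matrix-valued observation Y_t (assumed)
ranks = (2, 2, 2, 2)   # hypothetical Tucker ranks of the transition tensor

# Low-Tucker-rank transition tensor A in R^{p1 x p2 x p1 x p2}:
# A = G x_1 U1 x_2 U2 x_3 U3 x_4 U4 (multilinear mode products)
G = rng.standard_normal(ranks)
U = [rng.standard_normal((p, r)) for p, r in zip((p1, p2, p1, p2), ranks)]
A = np.einsum('abcd,ia,jb,kc,ld->ijkl', G, *U)

# Rescale so the implied VAR(1) coefficient matrix is stable
# (spectral radius < 1), which keeps the simulated series stationary.
A_mat = A.reshape(p1 * p2, p1 * p2)
A_mat *= 0.9 / max(abs(np.linalg.eigvals(A_mat)))
A = A_mat.reshape(p1, p2, p1, p2)

# Simulate Y_t = <A, Y_{t-1}> + E_t, contracting the last two modes
# of A with the previous observation.
T = 200
Y = np.zeros((T, p1, p2))
for t in range(1, T):
    Y[t] = np.einsum('ijkl,kl->ij', A, Y[t - 1]) \
           + 0.1 * rng.standard_normal((p1, p2))
```

Because A is built from a Tucker decomposition, each mode-k matricization of A has rank at most the corresponding Tucker rank, which is the low-rank structure the paper's rank-constrained and nuclear-norm-regularized estimators exploit.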
Cites work
- A Multilinear Singular Value Decomposition
- A direct estimation of high dimensional stationary vector autoregressions
- An optimal statistical and computational framework for generalized tensor estimation
- Autoregressive models for matrix-valued time series
- Convex regularization for high-dimensional multiresponse tensor regression
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Factor Models for High-Dimensional Tensor Time Series
- Factor modeling for high-dimensional time series: inference for the number of factors
- Finite-time analysis of vector autoregressive models under linear restrictions
- High-Dimensional Vector Autoregressive Time Series Modeling via Tensor Decomposition
- High-dimensional VARs with common factors
- Low Rank and Structured Modeling of High-Dimensional Vector Autoregressions
- Matrix Variate Regressions and Envelope Models
- Network vector autoregression
- Non-convex optimization for machine learning
- Regularized estimation in sparse high-dimensional time series models
- Tensor Decompositions and Applications
- Tensor completion and low-\(n\)-rank tensor recovery via convex optimization