High-dimensional low-rank tensor autoregressive time series modeling
Publication: Q6152591
DOI: 10.1016/J.JECONOM.2023.105544 · arXiv: 2101.04276 · MaRDI QID: Q6152591 · FDO: Q6152591
Authors:
Publication date: 13 February 2024
Published in: Journal of Econometrics
Abstract: Modern technological advances have enabled an unprecedented amount of structured data with complex temporal dependence, creating the need for new methods to efficiently model and forecast high-dimensional tensor-valued time series. This paper provides the first practical tool to accomplish this task via autoregression (AR). By considering a low-rank Tucker decomposition for the transition tensor, the proposed tensor autoregression can flexibly capture the underlying low-dimensional tensor dynamics, providing both substantial dimension reduction and meaningful dynamic factor interpretation. For this model, we introduce both a low-dimensional rank-constrained estimator and high-dimensional regularized estimators, and derive their asymptotic and non-asymptotic properties. In particular, by leveraging the special balanced structure of the AR transition tensor, a novel convex regularization approach, based on the sum of nuclear norms of square matricizations, is proposed to efficiently encourage low-rankness of the coefficient tensor. A truncation method is further introduced to consistently select the Tucker ranks. Simulation experiments and real data analysis demonstrate the advantages of the proposed approach over various competing ones.
Full work available at URL: https://arxiv.org/abs/2101.04276
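The abstract's key structural ideas can be illustrated with a small numerical sketch. This is not the paper's estimator: it merely constructs a 4-way transition tensor with low Tucker rank, forms one of its square matricizations, and shows why the nuclear norm of that matricization is a natural convex surrogate for low-rankness. The dimensions, ranks, and the (1,2)|(3,4) matricization convention are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Sketch: a matrix-valued AR(1), vec(Y_t) = A_mat vec(Y_{t-1}) + vec(E_t),
# where the 4-way transition tensor A has a low-rank Tucker structure.
# d1, d2 (data dimensions) and r1, r2 (Tucker ranks) are illustrative choices.
rng = np.random.default_rng(0)
d1, d2, r1, r2 = 4, 3, 2, 1

# Tucker factors U_k (d_k x r_k) and core tensor G (r1 x r2 x r1 x r2)
U = [rng.standard_normal((d, r)) for d, r in [(d1, r1), (d2, r2), (d1, r1), (d2, r2)]]
G = rng.standard_normal((r1, r2, r1, r2))

# A = G x_1 U1 x_2 U2 x_3 U3 x_4 U4, assembled via einsum
A = np.einsum('abcd,ia,jb,kc,ld->ijkl', G, *U)

# Square matricization grouping modes (1,2) as rows and (3,4) as columns;
# rescale so the AR recursion is stable (spectral radius < 1).
A_mat = A.reshape(d1 * d2, d1 * d2)
A_mat *= 0.9 / np.max(np.abs(np.linalg.eigvals(A_mat)))

# This matricization equals (U1 (x) U2) G_mat (U3 (x) U4)^T, so its rank is
# at most r1*r2; the nuclear norm (sum of singular values) is the convex
# penalty that encourages exactly this kind of low-rank structure.
s = np.linalg.svd(A_mat, compute_uv=False)
print("rank:", int(np.sum(s > 1e-10)), "<= r1*r2 =", r1 * r2)
print("nuclear norm:", s.sum())

# One step of the tensor AR recursion in vectorized form:
y_prev = rng.standard_normal(d1 * d2)
y_next = A_mat @ y_prev + 0.1 * rng.standard_normal(d1 * d2)
```

Because the transition matrix acting on vec(Y_{t-1}) is only d1*d2 x d1*d2 but has rank at most r1*r2, the Tucker structure reduces the effective number of parameters from (d1*d2)^2 to roughly r1*r2*(r1*r2 + d1*d2), which is the dimension-reduction benefit the abstract refers to.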
Keywords: high-dimensional time series; nuclear norm; tensor decomposition; global trade flows; non-convex tensor regression; tensor-valued time series
Statistics (62-XX); Game theory, economics, finance, and other social and behavioral sciences (91-XX)
Cites Work
- Regularized estimation in sparse high-dimensional time series models
- Factor modeling for high-dimensional time series: inference for the number of factors
- Network vector autoregression
- A direct estimation of high dimensional stationary vector autoregressions
- Factor Models for High-Dimensional Tensor Time Series
- Tensor Decompositions and Applications
- Autoregressive models for matrix-valued time series
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Convex regularization for high-dimensional multiresponse tensor regression
- A Multilinear Singular Value Decomposition
- Tensor completion and low-\(n\)-rank tensor recovery via convex optimization
- An optimal statistical and computational framework for generalized tensor estimation
- Non-convex optimization for machine learning
- Low Rank and Structured Modeling of High-Dimensional Vector Autoregressions
- Matrix Variate Regressions and Envelope Models
- High-dimensional VARs with common factors
- Finite-time analysis of vector autoregressive models under linear restrictions
- High-Dimensional Vector Autoregressive Time Series Modeling via Tensor Decomposition