Identification of Directed Influence: Granger Causality, Kullback-Leibler Divergence, and Complexity
From MaRDI portal
Publication: 2919418
DOI: 10.1162/NECO_a_00291 · zbMath: 1311.92115 · Wikidata: Q48608378 · Scholia: Q48608378 · MaRDI QID: Q2919418
Abd-Krim Seghouane, Shun-ichi Amari
Publication date: 2 October 2012
Published in: Neural Computation
Neural biology (92C20) · Biomedical imaging and signal processing (92C55) · Statistical aspects of information-theoretic topics (62B10)
Related Items
- Transfer entropy expressions for a class of non-Gaussian distributions
- Inference of time-varying networks through transfer entropy, the case of a Boolean network model
- Is First-Order Vector Autoregressive Model Optimal for fMRI Data?
- Improving on transfer entropy-based network reconstruction using time-delays: Approach and validation
Cites Work
- Asymptotic bootstrap corrections of AIC for linear regression models
- Time Series: Theory and Methods
- Evaluating causal relations in neural systems: Granger causality, directed transfer function and statistical assessment of significance
- Measurement of Linear Dependence and Feedback Between Multiple Time Series
- Vector Autoregressive Model-Order Selection From Finite Samples Using Kullback's Symmetric Divergence
- Interpretations of Directed Information in Portfolio Theory, Data Compression, and Hypothesis Testing
- Investigating Causal Relations by Econometric Models and Cross-spectral Methods
- A Corrected Akaike Information Criterion for Vector Autoregressive Model Selection
- A Small Sample Model Selection Criterion Based on Kullback's Symmetric Divergence
- On Information and Sufficiency