Detecting conditional independence for modeling non-Gaussian time series
DOI: 10.1007/s42952-019-00030-y · zbMATH: 1485.62120 · OpenAlex: W3034073350 · MaRDI QID: Q2131924
Deemat C. Mathew, G. Hareesh, Sudheesh Kumar Kattumannil
Publication date: 27 April 2022
Published in: Journal of the Korean Statistical Society
Full work available at URL: https://doi.org/10.1007/s42952-019-00030-y
Mathematics Subject Classification:
- Density estimation (62G07)
- Time series, auto-correlation, regression, etc. in statistics (GARCH) (62M10)
- Asymptotic properties of nonparametric inference (62G20)
- Statistical aspects of information-theoretic topics (62B10)
Cites Work
- Some data analyses using mutual information
- Estimation of entropy and other functionals of a multivariate density
- A smoothed bootstrap test for independence based on mutual information
- Akaike's information criterion and Kullback-Leibler loss for histogram density estimation
- On Kullback-Leibler loss and density estimation
- Time series: theory and methods
- On the estimation of entropy
- Time series analysis of categorical data using auto-mutual information
- Auto-association measures for stationary time series of categorical data
- Stable Autoregressive Models and Signal Estimation
- Nonparametric Entropy-Based Tests of Independence Between Stochastic Processes
- A nonparametric estimation of the entropy for absolutely continuous distributions (Corresp.)
- Using the mutual information coefficient to identify lags in nonlinear models
- Inverse Gaussian Autoregressive Models
- Analysis of autoregressive models with symmetric stable innovations
- Asymptotic Distribution Theory for Nonparametric Entropy Measures of Serial Dependence