Parameter identifiability with Kullback-Leibler information divergence criterion
DOI: 10.1002/acs.1078
zbMath: 1298.93337
OpenAlex: W2115737071
MaRDI QID: Q2928567
Badong Chen, Zeng-Qi Sun, Jinchun Hu, Yu Zhu
Publication date: 10 November 2014
Published in: International Journal of Adaptive Control and Signal Processing
Full work available at URL: https://doi.org/10.1002/acs.1078
Keywords: system identification; parameter estimation; consistency in probability; Fisher's information matrix (FIM); Kullback-Leibler information divergence (KLID)
MSC classification: Estimation and detection in stochastic control theory (93E10); Identification in stochastic control theory (93E12)
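The keywords tie identifiability to the Kullback-Leibler information divergence (KLID) and the Fisher information matrix (FIM). As a minimal, hedged sketch of that standard connection (the scalar-Gaussian family and the helper name `kl_gaussian` are illustrative assumptions, not the paper's construction):

```python
import numpy as np

def kl_gaussian(mu0, var0, mu1, var1):
    """Closed-form D_KL( N(mu0, var0) || N(mu1, var1) ) for scalar Gaussians."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

# KLID notion of identifiability: theta0 = (mean, variance) is identifiable
# iff KLID(theta0, theta) = 0 forces theta = theta0.
theta0 = (1.0, 0.5)
for theta in [(1.0, 0.5), (1.2, 0.5), (1.0, 0.7)]:
    print(theta, kl_gaussian(*theta0, *theta))  # zero only when theta == theta0

# Locally, KLID(theta0, theta) ~ 0.5 * (theta - theta0)^T I(theta0) (theta - theta0),
# where I is the FIM; for N(mu, var), I(mu, var) = diag(1/var, 1/(2 * var**2)).
# A nonsingular FIM therefore implies local identifiability in the KLID sense.
```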
Related Items (2)
- A new interpretation on the MMSE as a robust MEE criterion
- A Resampling-Based Stochastic Approximation Method for Analysis of Large Geostatistical Data
Cites Work
- On Kullback-Leibler loss and density estimation
- Modeling by shortest data description
- Approximation problems with the divergence criterion for Gaussian variables and Gaussian processes
- Sequential algorithms for parameter estimation based on the Kullback-Leibler information measure
- Spectral distance measures between Gaussian processes
- Stochastic model simplification
- Distribution estimation consistent in total variation and in two types of information divergence
- Parametrizations of linear dynamical systems: Canonical forms and identifiability
- Identifiability of linear and nonlinear dynamical systems
- An information theoretic approach to dynamical systems modeling and identification
- Asymptotic Statistics
- On-line estimation of dynamic shock-error models based on the Kullback Leibler information measure
- Identification in Parametric Models
- On the identifiability of parameters
- A new look at the statistical model identification