Parameter identifiability with Kullback-Leibler information divergence criterion
DOI: 10.1002/acs.1078 · zbMATH Open: 1298.93337 · OpenAlex: W2115737071 · MaRDI QID: Q2928567 · FDO: Q2928567
Authors: Badong Chen, Jinchun Hu, Yu Zhu, Zengqi Sun
Publication date: 10 November 2014
Published in: International Journal of Adaptive Control and Signal Processing
Full work available at URL: https://doi.org/10.1002/acs.1078
Keywords: parameter estimation; system identification; consistency in probability; Fisher's information matrix (FIM); Kullback-Leibler information divergence (KLID)
MSC classifications: Estimation and detection in stochastic control theory (93E10); Identification in stochastic control theory (93E12)
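The record itself carries no exposition, but the criterion named in the keywords is simple to state: under the KLID criterion, a parameter θ is identifiable at θ₀ when the divergence D_KL(p_{θ₀} ‖ p_θ) vanishes only at θ = θ₀. The following is a minimal illustrative sketch, not code from the paper: the two toy models, the Gaussian noise assumption, and all names (kl_gaussian, theta0, etc.) are hypothetical choices made purely for illustration, using the standard closed-form KL divergence between univariate Gaussians.

```python
import numpy as np

def kl_gaussian(mu1, var1, mu2, var2):
    """Closed-form KL divergence D( N(mu1,var1) || N(mu2,var2) ) in nats."""
    return 0.5 * (np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

# Two hypothetical toy models with unit-variance Gaussian noise e ~ N(0, 1):
#   Model A: y = theta   + e  -> output distribution N(theta,   1)
#   Model B: y = theta^2 + e  -> output distribution N(theta^2, 1)
theta0 = 1.5
thetas = np.linspace(-3, 3, 601)  # grid includes +/- theta0 exactly

klid_A = np.array([kl_gaussian(theta0, 1.0, t, 1.0) for t in thetas])
klid_B = np.array([kl_gaussian(theta0**2, 1.0, t**2, 1.0) for t in thetas])

# Parameters whose KLID from theta0 is (numerically) zero:
print("Model A zero set:", thetas[np.isclose(klid_A, 0.0)])  # {theta0} only
print("Model B zero set:", thetas[np.isclose(klid_B, 0.0)])  # {-theta0, theta0}
```

In this sketch, Model A's KLID vanishes only at θ₀, so θ is identifiable there; Model B's vanishes at ±θ₀, so the sign of θ cannot be recovered from the output distribution, which is the kind of failure a KLID-based identifiability test is meant to detect.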
Cites Work
- Asymptotic Statistics
- A new look at the statistical model identification
- Identification in Parametric Models
- Title not available
- On Kullback-Leibler loss and density estimation
- Parametrizations of linear dynamical systems: Canonical forms and identifiability
- Modeling by shortest data description
- Title not available
- Identifiability of linear and nonlinear dynamical systems
- Title not available
- Title not available
- Distribution estimation consistent in total variation and in two types of information divergence
- An information theoretic approach to dynamical systems modeling and identification
- Sequential algorithms for parameter estimation based on the Kullback-Leibler information measure
- Spectral distance measures between Gaussian processes
- On the identifiability of parameters
- Approximation problems with the divergence criterion for Gaussian variables and Gaussian processes
- On-line estimation of dynamic shock-error models based on the Kullback Leibler information measure
- Stochastic model simplification
Cited In (4)
- A Resampling-Based Stochastic Approximation Method for Analysis of Large Geostatistical Data
- A new interpretation on the MMSE as a robust MEE criterion
- Parameter identification in Choquet Integral by the Kullback-Leibler divergence on continuous densities with application to classification fusion
- Parameter identifiability and its key issues in statistical machine learning