Parameter identifiability with Kullback-Leibler information divergence criterion
From MaRDI portal
Publication:2928567
Cites work
- scientific article, zbMATH DE number 5654889 (title unavailable)
- scientific article, zbMATH DE number 3997615 (title unavailable)
- scientific article, zbMATH DE number 2221907 (title unavailable)
- scientific article, zbMATH DE number 3086018 (title unavailable)
- A new look at the statistical model identification
- An information theoretic approach to dynamical systems modeling and identification
- Approximation problems with the divergence criterion for Gaussian variables and Gaussian processes
- Asymptotic Statistics
- Distribution estimation consistent in total variation and in two types of information divergence
- Identifiability of linear and nonlinear dynamical systems
- Identification in Parametric Models
- Modeling by shortest data description
- On Kullback-Leibler loss and density estimation
- On the identifiability of parameters
- On-line estimation of dynamic shock-error models based on the Kullback-Leibler information measure
- Parametrizations of linear dynamical systems: Canonical forms and identifiability
- Sequential algorithms for parameter estimation based on the Kullback-Leibler information measure
- Spectral distance measures between Gaussian processes
- Stochastic model simplification
Cited in (4)
- A Resampling-Based Stochastic Approximation Method for Analysis of Large Geostatistical Data
- A new interpretation on the MMSE as a robust MEE criterion
- Parameter identification in Choquet Integral by the Kullback-Leibler divergence on continuous densities with application to classification fusion
- Parameter identifiability and its key issues in statistical machine learning