An extended result on the optimal estimation under the minimum error entropy criterion
From MaRDI portal
Publication:296457
Abstract: The minimum error entropy (MEE) criterion has been successfully used in fields such as parameter estimation, system identification, and supervised machine learning. In general there is no explicit expression for the optimal MEE estimate unless some constraints are imposed on the conditional distribution. A recent paper proved that if the conditional density is conditionally symmetric and unimodal (CSUM), then the optimal MEE estimate (with Shannon entropy) equals the conditional median. In this study, we extend this result to generalized MEE estimation, where the optimality criterion is the Rényi entropy or, equivalently, the α-order information potential (IP).
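The result summarized above can be illustrated numerically. The sketch below is not from the paper: it assumes a hypothetical two-point model for X with CSUM (Gaussian) conditional densities Y|X=x_i ~ N(m_i, 1), and uses the α=2 Rényi criterion, under which minimizing the error entropy is equivalent to maximizing the quadratic information potential IP(p) = ∫ p(e)² de of the error density. Fixing the estimate at x₁ to its conditional median, a grid search recovers the conditional median as the IP-optimal estimate at x₂, consistent with the stated result.

```python
# Numerical sketch (illustrative assumptions, not the paper's setup):
# X takes two equally likely values; Y|X=x_i ~ N(m_i, 1), each CSUM.
# The error density for estimates (c1, c2) is an equal mixture of the
# two shifted conditionals. Under the alpha=2 Renyi MEE criterion we
# maximize the quadratic information potential IP(p) = \int p(e)^2 de.
import numpy as np

m1, m2 = 0.0, 3.0                    # conditional medians (hypothetical)
c1 = m1                              # estimate at x1 fixed at its median
e = np.linspace(-12.0, 12.0, 4001)   # error grid for numerical integration

def gauss(x, mu, s=1.0):
    """Gaussian pdf with mean mu and std s."""
    return np.exp(-(x - mu) ** 2 / (2 * s * s)) / (s * np.sqrt(2 * np.pi))

def info_potential(c2):
    """Quadratic IP of the error density for estimate c2 at x2."""
    p = 0.5 * gauss(e, m1 - c1) + 0.5 * gauss(e, m2 - c2)
    return np.trapz(p ** 2, e)

# Grid search over candidate estimates at x2.
cands = np.linspace(0.0, 6.0, 601)
best = cands[np.argmax([info_potential(c) for c in cands])]
print(best)  # maximizer of the IP; equals the conditional median m2 = 3.0
```

Note that the IP (like entropy) is invariant to a common shift of all errors, so the MEE criterion only pins the estimator down up to a constant; aligning each estimate with its conditional median, as above, is one optimal choice.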
Cites work
- Scientific article, zbMATH DE number 3304505 (title unavailable)
- An application of the information theory to estimation problems
- Entropy analysis of estimating systems
- Information theoretic learning. Renyi's entropy and kernel perspectives
- Linear prediction, filtering, and smoothing: An information-theoretic approach
- On optimal estimation with respect to a large family of cost functions
- On the Minimum Entropy of a Mixture of Unimodal and Symmetric Distributions
- Robustness and risk-sensitive filtering
- Some further results on the minimum error entropy estimation
- Stochastic gradient algorithm under \((h, \phi)\)-entropy criterion