Some further results on the minimum error entropy estimation (Q406075)

From MaRDI portal
Property / review text: Summary: The minimum error entropy (MEE) criterion has been receiving increasing attention due to its promising prospects for applications in signal processing and machine learning. In the context of Bayesian estimation, the MEE criterion is concerned with estimating one random variable from another so that the entropy of the estimation error is minimized. Several theoretical results on this topic have been reported. In this work, we present some further results on MEE estimation. The contributions are twofold: (1) we extend a recent result on the minimum entropy of a mixture of unimodal and symmetric distributions to a more general case, and prove that if the conditional distributions are generalized uniformly dominated (GUD), the dominant alignment is the MEE estimator; (2) we show by examples that the MEE estimator (not limited to singular cases) may be non-unique even if the error distribution is restricted to be zero-mean (unbiased).
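A minimal formal sketch of the criterion summarized above, assuming Shannon differential entropy and using notation of our own choosing (the paper's precise definitions of generalized uniform domination and of the dominant alignment are not reproduced here): for random variables \(X\) (the quantity to be estimated) and \(Y\) (the observation), an MEE estimator is any measurable map \(g^{*}\) satisfying
\[
g^{*} \in \arg\min_{g} H\bigl(X - g(Y)\bigr),
\qquad
H(e) = -\int p_{e}(u)\,\log p_{e}(u)\,\mathrm{d}u,
\]
where \(e = X - g(Y)\) is the estimation error and \(p_{e}\) its probability density. Contribution (2) above asserts that this minimizer can fail to be unique even within the unbiased class, i.e. among estimators with \(\mathbb{E}[X - g(Y)] = 0\).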
Property / Mathematics Subject Classification ID: 62B10
Property / zbMATH DE Number: 6341001
Property / zbMATH Keywords: entropy
Property / zbMATH Keywords: estimation
Property / zbMATH Keywords: minimum error entropy estimation
Property / MaRDI profile type: MaRDI publication profile
Property / full work available at URL: https://doi.org/10.3390/e14050966
Property / OpenAlex ID: W2011976556
Property / cites work: Entropy analysis of estimating systems
Property / cites work: An application of the information theory to estimation problems
Property / cites work: Minimum-entropy estimation in semi-parametric models
Property / cites work: An extended orthogonal forward regression algorithm for system identification using entropy
Property / cites work: \(\Delta\)-entropy: definition, properties and applications in system identification with quantized data
Property / cites work: Linear prediction, filtering, and smoothing: An information-theoretic approach
Property / cites work: Optimal state estimation for stochastic systems: an information theoretic approach
Property / cites work: Minimum entropy filtering for multivariate stochastic systems with non-Gaussian noises
Property / cites work: Minimum entropy control of non-Gaussian dynamic stochastic systems
Property / cites work: Minimum entropy control of closed-loop tracking errors for dynamic stochastic systems
Property / cites work: Information Theoretic Learning
Property / cites work: Robustness and risk-sensitive filtering
Property / cites work: Minimum entropy of error principle in estimation
Property / cites work: On the Minimum Entropy of a Mixture of Unimodal and Symmetric Distributions
Property / cites work: On optimal estimations with minimum error entropy criterion
Property / cites work: A new interpretation on the MMSE as a robust MEE criterion

Language: English
Label: Some further results on the minimum error entropy estimation
Description: scientific article

    Statements

    Some further results on the minimum error entropy estimation (English)
    8 September 2014

    Identifiers