Minimum entropy of error principle in estimation (Q1331031)

Property / author: Martin Janžura / rank: Normal rank
Property / reviewed by: Andreas N. Philippou / rank: Normal rank
Property / MaRDI profile type: MaRDI publication profile / rank: Normal rank
Property / full work available at URL: https://doi.org/10.1016/0020-0255(94)90043-4 / rank: Normal rank
Property / OpenAlex ID: W2068708092 / rank: Normal rank
Property / cites work: Exponential entropy as a measure of extent of a distribution / rank: Normal rank
Property / cites work: Characterization of entropy of probability distributions on the real line / rank: Normal rank
Property / cites work: Q4023085 / rank: Normal rank
Property / cites work: Q3918886 / rank: Normal rank
Property / cites work: Information and distortion in reduced-order filter design / rank: Normal rank
Property / cites work: Theoretical limitations on the transmission of data from analog sources / rank: Normal rank
Property / cites work: Q3996472 / rank: Normal rank
Property / cites work: Continuity and characterization of Shannon-Wiener information measure for continuous probability distributions / rank: Normal rank
Property / cites work: Q5750089 / rank: Normal rank
Property / cites work: Some properties of generalized exponential entropies with applications to data compression / rank: Normal rank
Property / cites work: Q3995624 / rank: Normal rank
Property / cites work: An application of the information theory to estimation problems / rank: Normal rank
Property / cites work: Q3834874 / rank: Normal rank
Property / cites work: Entropy analysis of parameter estimation / rank: Normal rank
Property / cites work: Entropy analysis of estimating systems / rank: Normal rank
Property / cites work: Q5665108 / rank: Normal rank


scientific article

Language: English
Label: Minimum entropy of error principle in estimation
Description: scientific article

    Statements

    Minimum entropy of error principle in estimation (English)
    17 August 1994
    The principle of minimum error entropy estimation, as found in the work of \textit{H. L. Weidemann} and \textit{E. B. Stear} [Inf. Control 14, 493-506 (1969; Zbl 0212.233); IEEE Trans. Inf. Theory IT-16, 264-270 (1970; Zbl 0205.209)], is reformulated as the problem of finding optimum locations of the probability densities in a given mixture such that the resulting (differential) entropy of the mixture is minimized. New results concerning the entropy lower bound are derived. Continuity of the entropy and attainment of the minimum entropy are proved in the case where the mixture is finite. Further examples and situations, in particular the case of symmetric unimodal densities, are studied in more detail.
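    The reformulation above can be illustrated numerically. The following Python sketch is only an illustration of the idea under assumed ingredients (Gaussian component densities, fixed weights, a grid-based entropy quadrature, and Nelder-Mead from SciPy), not the construction used in the paper: it treats the locations of the components of a finite mixture as free parameters and minimizes the differential entropy of the mixture over them.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative finite mixture: the component shapes f_i and weights w_i are
# fixed (Gaussian here purely for demonstration); the decision variables are
# the location shifts d_i of the components.
weights = np.array([0.5, 0.3, 0.2])   # mixture weights w_i (assumed values)
sigmas = np.array([1.0, 0.5, 2.0])    # component scales (assumed values)

t = np.linspace(-15.0, 15.0, 4001)    # grid for numerical integration
dt = t[1] - t[0]


def mixture_density(shifts):
    """Error density p_e(t) = sum_i w_i f_i(t - d_i)."""
    comps = [w / (s * np.sqrt(2.0 * np.pi)) * np.exp(-0.5 * ((t - d) / s) ** 2)
             for w, s, d in zip(weights, sigmas, shifts)]
    return np.sum(comps, axis=0)


def differential_entropy(shifts):
    """H(p_e) = -integral of p_e(t) * log p_e(t) dt, via a Riemann sum."""
    p = mixture_density(shifts)
    integrand = np.where(p > 0.0, p * np.log(p), 0.0)
    return -np.sum(integrand) * dt


# Minimum-entropy locations: start from spread-out shifts and minimize H(p_e).
result = minimize(differential_entropy, x0=np.array([-3.0, 0.0, 3.0]),
                  method="Nelder-Mead")
print("optimal locations:", result.x)
print("minimum entropy:  ", result.fun)
```

    Since differential entropy is invariant under a common translation of all components, only the relative locations matter; in this toy setting the minimizer tends to pull the components toward a common center, which concentrates the error density.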
    finite mixtures
    continuity
    minimum error entropy estimation
    entropy lower bound
    symmetric unimodal densities