Estimation of an entropy-based functional (Q653353)

From MaRDI portal
Property / full work available at URL: https://doi.org/10.3390/e12030338
Property / OpenAlex ID: W2105342380


Language: English
Label: Estimation of an entropy-based functional
Description: scientific article

    Statements

    Estimation of an entropy-based functional (English)
    9 January 2012
    Summary: Given a function \(f\) from \([0,1]\) to the real line, we consider the (nonlinear) functional \(h\) obtained by evaluating the continuous entropy of the "density function" of \(f\). Motivated by an application in signal processing, we wish to estimate \(h(f)\). Our main tool is a decomposition of \(h\) into two terms, each of which has favorable scaling properties. We show that if functions \(f\) and \(g\) satisfy a regularity condition, then the smallness of \(\|f-g\|_{\infty}\) and \(\|f'-g'\|_{\infty}\), along with some basic control on the derivatives of \(f\) and \(g\), is sufficient to imply that \(h(f)\) and \(h(g)\) are close.
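    For orientation (a reading consistent with the summary above, not spelled out in this record): with \(U\) uniform on \([0,1]\), the "density function" of \(f\) can be taken to be the probability density \(p_f\) of the value \(f(U)\), and the functional is then the differential (continuous) entropy of that density,
    \[
      h(f) = -\int_{\mathbb{R}} p_f(y)\,\log p_f(y)\,\mathrm{d}y .
    \]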
    entropy
    differential entropy
    Shannon entropy
    entropy estimation
    nonlinear functional
    signal processing
