Entropy of Markovian processes: Application to image entropy in computer vision (Q5959128)

scientific article; zbMATH DE number 1722267

    Statements

    Entropy of Markovian processes: Application to image entropy in computer vision (English)
    28 September 2002
    For image processing, the author introduces about twelve new measures of information, for example the density of conditional entropy of \(f\) as \(\frac{1}{2}\int_0^{2\pi}\ln\bigl[(f'_x\cos\alpha+f'_y\sin\alpha)^2\bigr]\,d\alpha\). Two propositions and five lemmata are offered (the lemmata do not lead to the propositions). The style is conversational, with (triple) exclamation marks; the statement of Lemma 2 is ``As a direct consequence of the definition of discrete Markov processes, by using entropy with respect to a measure, one can obtain the entropy of the trajectory \(x(\cdot)\) on the finite interval \([0,T]\) in the form \(H(x(\cdot);0,T)=H(x_0)+\int_0^T\int_{\mathbb{R}^n} p(x,t)\ln\bigl[(2\pi e)^n|GQG^T|\bigr]^{1/2}\,dx\,dt\), where \(p(x,t)\) is the probability density defined by the Fokker-Planck-Kolmogorov equation \(\frac{\partial p}{\partial t}= -\sum_{i=1}^n\frac{\partial(pf_i)}{\partial x_i}+\frac{1}{2}\sum_{i,j=1}^n \frac{\partial^2[p(GQG^T)_{ij}]}{\partial x_i\partial x_j}\)''.
    measures of information
    entropy
    image entropy
    monkey model
    Markov processes
    computer vision