Entropy of Markovian processes: Application to image entropy in computer vision (Q5959128)

From MaRDI portal

scientific article; zbMATH DE number 1722267

Language: English
Label: Entropy of Markovian processes: Application to image entropy in computer vision

    Statements

    Entropy of Markovian processes: Application to image entropy in computer vision (English)
    28 September 2002
    For image processing, the author introduces about twelve new measures of information, for example, the density of conditional entropy of \(f\), defined as \(\frac{1}{2}\int_0^{2\pi}\ln(f'_x\cos\alpha+f'_y\sin\alpha)^2\,d\alpha\). Two propositions and five lemmata are offered (the lemmata do not lead to the propositions). The style is conversational, with (triple) exclamation marks; the statement of Lemma 2 is: ``As a direct consequence of the definition of discrete Markov processes, by using entropy with respect to a measure, one can obtain the entropy of the trajectory \(x(t)\) on the finite interval \([0,T]\) in the form \(H(x(\cdot);0,T)=H(x_0)+\int_0^T\int_{\mathbb{R}^n} p(x,t)\ln[(2\pi e)^n|GQG^T|]^{1/2}\,dx\,dt\), where \(p(x,t)\) is the probability density defined by the Fokker-Planck-Kolmogorov equation \(\frac{\partial p}{\partial t}=-\sum_{i=1}^n\frac{\partial(pf_i)}{\partial x_i}+\frac{1}{2}\sum_{i,j=1}^n\frac{\partial^2[p(GQG^T)_{ij}]}{\partial x_i\partial x_j}\)''.
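    The quoted density formula invites a quick sanity check. Below is a minimal numerical sketch, not taken from the paper: the function names and the midpoint quadrature are assumptions made here. Over a full period the integral also admits a closed form: writing \(f'_x\cos\alpha+f'_y\sin\alpha=r\cos(\alpha-\varphi)\) with \(r=\sqrt{(f'_x)^2+(f'_y)^2}\) and using \(\int_0^{2\pi}\ln|\cos\alpha|\,d\alpha=-2\pi\ln 2\), the density reduces to \(2\pi\ln(r/2)\); the sketch checks the quadrature against it.

```python
import numpy as np

def entropy_density(fx, fy, n_angles=4096):
    """Midpoint quadrature for (1/2) * int_0^{2pi} ln((fx cos a + fy sin a)^2) da.

    fx, fy are the partial derivatives f'_x, f'_y at one point (hypothetical
    inputs; the paper works with an image f). The integrand has integrable
    log singularities where the directional derivative vanishes; midpoints
    avoid hitting them exactly.
    """
    a = (np.arange(n_angles) + 0.5) * (2.0 * np.pi / n_angles)
    g = fx * np.cos(a) + fy * np.sin(a)
    return 0.5 * np.mean(np.log(g * g)) * 2.0 * np.pi

def entropy_density_closed_form(fx, fy):
    """Closed form 2*pi*ln(r/2) with r = |grad f| (derived above, not from the paper)."""
    return 2.0 * np.pi * np.log(np.hypot(fx, fy) / 2.0)

if __name__ == "__main__":
    fx, fy = 3.0, 4.0  # sample gradient with |grad f| = 5
    print(entropy_density(fx, fy))              # ~ 5.757, agrees to about 1e-3
    print(entropy_density_closed_form(fx, fy))  # 2*pi*ln(2.5) = 5.7573...
```

    A similar collapse happens in the quoted Lemma 2: when \(G\) and \(Q\) are constant, \(\int_{\mathbb{R}^n} p(x,t)\,dx=1\) makes the inner integral independent of \(p\), so the trajectory entropy reduces to \(H(x(\cdot);0,T)=H(x_0)+\frac{T}{2}\ln[(2\pi e)^n|GQG^T|]\).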
    Keywords: measures of information; entropy; image entropy; monkey model; Markov processes; computer vision