A New Family of Bounded Divergence Measures and Application to Signal Detection

From MaRDI portal

Publication: Q6230028
arXiv: 1201.0418
MaRDI QID: Q6230028


Authors: Shivakumar Jolad, Ahmed Roman, Mahesh C. Shastry, Mihir Gadgil, Ayanendranath Basu


Publication date: 1 January 2012

Abstract: We introduce a new one-parameter family of divergence measures, called bounded Bhattacharyya distance (BBD) measures, for quantifying the dissimilarity between probability distributions. These measures are bounded, symmetric, and positive semi-definite, and do not require absolute continuity. In the asymptotic limit, the BBD measure approaches the squared Hellinger distance. A generalized BBD measure for multiple distributions is also introduced. We prove an extension of a theorem of Bradt and Karlin for BBD relating Bayes error probability and divergence ranking. We show that BBD belongs to the class of generalized Csiszár f-divergences and derive some of its properties, such as curvature and the relation to Fisher information. For distributions with vector-valued parameters, the curvature matrix is related to the Fisher-Rao metric. We derive certain inequalities between BBD and well-known measures such as the Hellinger and Jensen-Shannon divergences, and we derive bounds on the Bayesian error probability. We give an application of these measures to the problem of signal detection, where we compare two monochromatic signals buried in white noise and differing in frequency and amplitude.
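The abstract states that the BBD measure approaches the squared Hellinger distance in the asymptotic limit, and both it and the Hellinger distance are built on the overlap between distributions. As a minimal numerical sketch of those two standard ingredients (the abstract does not give the BBD formula itself, so only the Bhattacharyya coefficient and the squared Hellinger distance are computed here; the distributions `p` and `q` are illustrative):

```python
from math import sqrt

def bhattacharyya_coefficient(p, q):
    """Overlap of two discrete distributions: sum_i sqrt(p_i * q_i).

    Equals 1 when p == q and decreases toward 0 as the supports separate.
    """
    return sum(sqrt(pi * qi) for pi, qi in zip(p, q))

def squared_hellinger(p, q):
    """Squared Hellinger distance H^2(p, q) = 1 - BC(p, q).

    This is the quantity the abstract names as the asymptotic limit of BBD.
    The identity H^2 = (1/2) * sum_i (sqrt(p_i) - sqrt(q_i))^2 = 1 - BC
    holds for normalized distributions.
    """
    return 1.0 - bhattacharyya_coefficient(p, q)

# Two illustrative discrete distributions (hypothetical example values).
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(squared_hellinger(p, q))
```

Since both quantities depend on `p` and `q` only through the pointwise products `p_i * q_i` and the square roots, the measures are symmetric in their arguments, matching the symmetry property claimed in the abstract.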

This page was built for publication: A New Family of Bounded Divergence Measures and Application to Signal Detection
