Law invariant risk measures and information divergences

Publication: Q2283649

DOI: 10.1515/DEMO-2018-0014
zbMATH Open: 1430.91134
arXiv: 1510.07030
OpenAlex: W2963593180
Wikidata: Q128907462 (Scholia: Q128907462)
MaRDI QID: Q2283649 (FDO: Q2283649)


Authors: Daniel Lacker


Publication date: 13 January 2020

Published in: Dependence Modeling

Abstract: A one-to-one correspondence is drawn between law invariant risk measures and divergences, which we define as functionals of pairs of probability measures on arbitrary standard Borel spaces satisfying a few natural properties. Divergences include many classical information divergence measures, such as relative entropy and f-divergences. Several properties of divergences and their duality with law invariant risk measures are developed, most notably relating their chain rules or additivity properties with certain notions of time consistency for dynamic law invariant risk measures known as acceptance and rejection consistency. These properties are also linked to a peculiar property of the acceptance sets on the level of distributions, analogous to results of Weber on weak acceptance and rejection consistency. Finally, the examples of shortfall risk measures and optimized certainty equivalents are discussed in some detail, and it is shown that the relative entropy is essentially the only divergence satisfying the chain rule.
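As a concrete illustration of the chain-rule property highlighted in the abstract, the relative entropy (Kullback–Leibler divergence) satisfies the following standard identity; the notation below is generic and not taken from the paper itself:

```latex
% Relative entropy of P with respect to Q:
D(P \,\|\, Q) = \int \log\frac{dP}{dQ}\, dP,
\qquad D(P \,\|\, Q) = +\infty \ \text{if } P \not\ll Q.

% Chain rule over a pair (X, Y): the joint divergence splits into a
% marginal term plus an averaged conditional term,
D\big(P_{X,Y} \,\|\, Q_{X,Y}\big)
  = D\big(P_X \,\|\, Q_X\big)
  + \int D\big(P_{Y \mid X=x} \,\|\, Q_{Y \mid X=x}\big)\, P_X(dx).
```

The paper's final result can be read as a converse: among the divergences it axiomatizes, essentially only the relative entropy admits a decomposition of this additive form.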


Full work available at URL: https://arxiv.org/abs/1510.07030


Cited In (8)



