Stochastic Differential Games and Viscosity Solutions of Hamilton–Jacobi–Bellman–Isaacs Equations

From MaRDI portal
Publication:3614801

DOI: 10.1137/060671954
zbMATH Open: 1157.93040
arXiv: math/0702131
OpenAlex: W3121250812
MaRDI QID: Q3614801


Authors: Rainer Buckdahn, Juan Li


Publication date: 10 March 2009

Published in: SIAM Journal on Control and Optimization

Abstract: In this paper we study zero-sum two-player stochastic differential games using the theory of backward stochastic differential equations (BSDEs). On the one hand, we generalize the results of the pioneering work of Fleming and Souganidis by considering cost functionals defined by controlled BSDEs and by allowing the admissible control processes to depend on events occurring before the beginning of the game (which makes the cost functionals random variables). On the other hand, the application of BSDE methods, in particular the notion of stochastic "backward semigroups" introduced by Peng, allows us to prove a dynamic programming principle for the upper and lower value functions of the game in a straightforward way, without additional approximations. The upper and lower value functions are shown to be the unique viscosity solutions of the upper and lower Hamilton–Jacobi–Bellman–Isaacs equations, respectively. To this end, Peng's BSDE method is translated from the framework of stochastic control theory into that of stochastic differential games.
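The backward induction behind such a dynamic programming principle for upper and lower value functions can be sketched in a discrete-time toy model. Everything below (the random-walk dynamics, the finite control sets, the quadratic terminal cost) is an illustrative assumption, not the paper's continuous-time BSDE setting; the sketch only shows how sup-inf and inf-sup backward recursions produce a lower and an upper value with lower ≤ upper.

```python
# Toy backward dynamic programming for a zero-sum two-player game:
# lower value uses sup_u inf_v at each step, upper value uses inf_v sup_u.
# All model ingredients here are illustrative assumptions.

N = 10                          # time horizon
U = [-1, 0, 1]                  # controls of the maximizing player
V = [-1, 0, 1]                  # controls of the minimizing player
NOISE = [(-1, 0.5), (1, 0.5)]   # symmetric random-walk increments

def g(x):
    """Terminal cost (assumed quadratic for illustration)."""
    return x * x

def continuation(value_next, x, u, v):
    # Expected next-step value after applying controls u and v.
    return sum(p * value_next[x + u + v + w] for w, p in NOISE)

def solve(order):
    # Backward induction from the terminal time to time 0.
    value = {x: g(x) for x in range(-3 * N, 3 * N + 1)}
    for t in range(N):
        span = 3 * (N - t - 1)   # states reachable at the earlier time
        new = {}
        for x in range(-span, span + 1):
            if order == 'lower':
                new[x] = max(min(continuation(value, x, u, v) for v in V) for u in U)
            else:
                new[x] = min(max(continuation(value, x, u, v) for u in U) for v in V)
        value = new
    return value[0]              # value at t = 0, x = 0

lower, upper = solve('lower'), solve('upper')
print(lower, upper)              # the lower value never exceeds the upper value
```

When the Isaacs (minimax) condition holds for the stepwise Hamiltonian, the two recursions coincide and the game has a value; in general only lower ≤ upper is guaranteed, mirroring the relation between the viscosity solutions of the lower and upper Hamilton–Jacobi–Bellman–Isaacs equations.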


Full work available at URL: https://arxiv.org/abs/math/0702131





