Stochastic Differential Games with Random Coefficients and Stochastic Hamilton-Jacobi-Bellman-Isaacs Equations
ABSTRACT
In this paper, we study a class of zero-sum two-player stochastic differential games in which the state dynamics are governed by controlled stochastic differential equations and the payoff/cost functionals are of recursive type. As opposed to the pioneering work by Fleming and Souganidis (Indiana Univ. Math. J., 38 (1989), pp.~293-314) and the seminal work by Buckdahn and Li (SIAM J. Control Optim., 47 (2008), pp.~444-475), the involved coefficients may be random, going beyond the Markovian framework and leading to random upper and lower value functions. We first prove the dynamic programming principle for the game, and then, under standard Lipschitz continuity assumptions on the coefficients, show that the upper and lower value functions are viscosity solutions of the associated upper and lower fully nonlinear stochastic Hamilton-Jacobi-Bellman-Isaacs (HJBI) equations, respectively. A stability property of viscosity solutions is also proved. Under certain additional regularity assumptions on the diffusion coefficient, the uniqueness of the viscosity solution is addressed as well.
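For orientation only (this is not the paper's exact formulation), in the Markovian case with deterministic coefficients the value functions of such recursive games are known to solve deterministic HJBI equations of the following form, where $b$, $\sigma$, $f$, $\Phi$, $U$, $V$ are generic placeholders for the drift, the diffusion coefficient, the generator of the recursive cost, the terminal cost, and the two players' action sets:
\[
\partial_t W(t,x) + H\big(t,x,W(t,x),DW(t,x),D^2W(t,x)\big)=0 \ \ \text{on } [0,T)\times\mathbb{R}^n,
\qquad W(T,x)=\Phi(x),
\]
with Hamiltonian
\[
H(t,x,y,p,A)=\inf_{u\in U}\,\sup_{v\in V}
\Big\{\tfrac12\operatorname{tr}\!\big(\sigma\sigma^{\top}(t,x,u,v)\,A\big)
+ b(t,x,u,v)\cdot p
+ f\big(t,x,y,\,p\,\sigma(t,x,u,v),u,v\big)\Big\},
\]
the companion equation being obtained by interchanging $\inf_{u}$ and $\sup_{v}$. The stochastic HJBI equations studied in this paper generalize equations of this type to random coefficients.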