Optimal control for stochastic partial differential equations and viscosity solutions of Bellman equations
DOI: 10.1017/S0027763000003639
zbMath: 0749.93083
MaRDI QID: Q3988495
Publication date: 28 June 1992
Published in: Nagoya Mathematical Journal
Keywords: Bellman equation; viscosity solution; stochastic partial differential equations; infinite horizon problems
Mathematics Subject Classification: Optimal stochastic control (93E20); Stochastic partial differential equations (aspects of stochastic analysis) (60H15); Viscosity solutions to Hamilton-Jacobi equations in optimal control and differential games (49L25)
Related Items (6)
Differential games for stochastic partial differential equations
Infinite horizon optimal control of stochastic delay evolution equations in Hilbert spaces
On consistent regularities of control and value functions
Remarks on optimal controls of stochastic partial differential equations
Maximum principle for forward-backward doubly stochastic control systems and applications
On the existence of stochastic optimal control of distributed state system