Error estimation and adaptive discretization for the discrete stochastic Hamilton-Jacobi-Bellman equation (Q706233)
scientific article
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Error estimation and adaptive discretization for the discrete stochastic Hamilton-Jacobi-Bellman equation | scientific article | |
Statements
Error estimation and adaptive discretization for the discrete stochastic Hamilton-Jacobi-Bellman equation (English)
8 February 2005
The dynamic programming method is a well-known technique for the numerical solution of optimal control problems. Generalizing the technique and the results from the deterministic case [cf. the author, ibid. 75, 319--337 (1997; Zbl 0880.65045)], the author obtains a posteriori error estimates for the space discretization of the stochastic Hamilton-Jacobi-Bellman equation. The method yields full global information about the optimal value function of the underlying stochastic optimal control problem, so that a feedback optimal control can be obtained. It is also demonstrated that the a posteriori error estimates are efficient and reliable for the numerical approximation of the PDE, and that they allow one to derive a bound for the numerical error in the derivatives. The asymptotic behavior of the error estimates with respect to the size of the grid elements is also investigated. Finally, an adaptive space discretization scheme is developed and numerical examples are presented.
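For readers who want to see the ingredients in action, the sketch below shows, in Python, the two building blocks the review refers to: value iteration for a discrete stochastic Bellman equation on a spatial grid, and an adaptive refinement loop driven by a local error indicator. The 1-D test problem, the two-point noise approximation, the gradient-jump indicator and all parameter values are illustrative assumptions, not the scheme or the a posteriori estimator analysed in the paper.

```python
import numpy as np

# A minimal sketch, assuming a 1-D test problem (quadratic running cost,
# controlled drift, additive noise): value iteration for a discrete stochastic
# Bellman equation on a grid, plus a crude gradient-jump indicator driving
# adaptive refinement.  None of this is the estimator analysed in the paper.

rho, sigma, dt = 0.1, 0.5, 0.05           # discount rate, noise level, time step
beta = np.exp(-rho * dt)                  # one-step discount factor
controls = np.linspace(-2.0, 2.0, 41)     # discretized control set

def value_iteration(grid, tol=1e-6, max_iter=5000):
    """Iterate V(x) = min_u { (x^2 + u^2) dt + beta * E[V(x + u*dt + sigma*sqrt(dt)*xi)] }
    with two-point noise xi = +-1 and linear interpolation on the grid."""
    V = np.zeros_like(grid)
    for _ in range(max_iter):
        best = np.full_like(grid, np.inf)
        for u in controls:
            x_up = grid + u * dt + sigma * np.sqrt(dt)
            x_dn = grid + u * dt - sigma * np.sqrt(dt)
            EV = 0.5 * (np.interp(x_up, grid, V) + np.interp(x_dn, grid, V))
            best = np.minimum(best, (grid**2 + u**2) * dt + beta * EV)
        if np.max(np.abs(best - V)) < tol:
            return best
        V = best
    return V

def refinement_indicator(grid, V):
    """Jump of the discrete gradient across interior nodes -- a heuristic
    stand-in for a rigorous a posteriori estimate."""
    dV = np.diff(V) / np.diff(grid)
    return np.abs(np.diff(dV))            # one value per interior node

# Adaptive loop: solve, estimate, bisect the two cells around the worst nodes.
grid = np.linspace(-2.0, 2.0, 17)
for sweep in range(4):
    V = value_iteration(grid)
    eta = refinement_indicator(grid, V)
    worst = np.argsort(eta)[-4:] + 1      # interior node indices with largest jumps
    left_mid = 0.5 * (grid[worst - 1] + grid[worst])
    right_mid = 0.5 * (grid[worst] + grid[worst + 1])
    print(f"sweep {sweep}: {grid.size} nodes, max indicator {eta.max():.2e}")
    grid = np.unique(np.concatenate([grid, left_mid, right_mid]))
```

The gradient-jump heuristic merely mimics the role that a genuine a posteriori estimator plays in the paper: it marks the regions where the discrete value function varies most strongly, which is where additional grid points are most likely to reduce the error.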
stochastic optimal control
stochastic Hamilton-Jacobi-Bellman equation
a posteriori error estimates
feedback optimal control
numerical examples