Remarks on optimal controls of stochastic partial differential equations (Q1175511)
Language | Label | Description | Also known as |
---|---|---|---|
English | Remarks on optimal controls of stochastic partial differential equations | scientific article | |
Statements
Remarks on optimal controls of stochastic partial differential equations (English)
25 June 1992
Two important approaches to the study of optimal control problems are Pontryagin's maximum principle and Bellman's dynamic programming. The classical result relating the two is that the partial derivative of the value function with respect to the state variable, evaluated along the optimal path, coincides with the adjoint process of the maximum principle. In this paper the author interprets this classical result in the language of the super- and subdifferentials of Crandall and Lions [\textit{M. G. Crandall} and \textit{P. L. Lions}, Trans. Am. Math. Soc. 277, 1-42 (1983; Zbl 0599.35024)] in the setting of optimal control of stochastic partial differential equations. More precisely, if \(\hat q\) is the optimal state corresponding to an optimal control, then for a.e. \(t\in [0,1]\) \[ D^-V(t,\hat q(t))\subset\{\lambda(t)\}\subset D^+V(t,\hat q(t)), \] where \(V\) is the value function of the optimal control problem, \(\lambda\) is the adjoint process, and \(D^+\), \(D^-\) denote the super- and subdifferential, respectively. In the last paragraph a comparison between the super- and subdifferentials and Clarke's generalized gradient [\textit{F. H. Clarke}, Optimization and nonsmooth analysis (1983; Zbl 0582.49001)] is discussed.
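The inclusion says, in particular, that \(\lambda(t)\) always lies in the superdifferential \(D^+V(t,\hat q(t))\), while the subdifferential \(D^-V(t,\hat q(t))\) is either empty or reduces to the singleton \(\{\lambda(t)\}\); when \(V\) is differentiable in the state variable at \(\hat q(t)\), both sets are this singleton and the classical identity is recovered. For orientation, the standard Crandall-Lions definitions (stated here in the state variable only; the paper may use a variant adapted to its infinite-dimensional setting) read \[ D^+V(t,q)=\Bigl\{p:\ \limsup_{q'\to q}\frac{V(t,q')-V(t,q)-\langle p,\,q'-q\rangle}{\|q'-q\|}\le 0\Bigr\}, \] \[ D^-V(t,q)=\Bigl\{p:\ \liminf_{q'\to q}\frac{V(t,q')-V(t,q)-\langle p,\,q'-q\rangle}{\|q'-q\|}\ge 0\Bigr\}. \]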
maximum principle
stochastic partial differential equations
optimal control