Approximation and optimality necessary conditions in relaxed stochastic control problems (Q995846)

scientific article; zbMATH DE number 5189223

      Statements

      Approximation and optimality necessary conditions in relaxed stochastic control problems (English)
      Publication date: 10 September 2007
      Summary: We consider a control problem where the state variable is a solution of a stochastic differential equation (SDE) in which the control enters both the drift and the diffusion coefficient. We study the relaxed problem, for which admissible controls are measure-valued processes and the state variable is governed by an SDE driven by an orthogonal martingale measure. Under mild conditions on the coefficients and a pathwise uniqueness assumption, we prove that every diffusion process associated to a relaxed control is a strong limit of a sequence of diffusion processes associated to strict controls. As a consequence, we show that the strict and the relaxed control problems have the same value function and that an optimal relaxed control exists. Moreover, we derive a maximum principle of Pontryagin type, extending the well-known Peng stochastic maximum principle to the class of measure-valued controls.
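      As a rough illustration (not quoted from the paper; the notation $b$, $\sigma$, $W$, $q$, $M$ and the action space $A$ are assumed), the strict and relaxed state equations in this type of problem are typically written as
      \[ dX_t = b(t, X_t, u_t)\,dt + \sigma(t, X_t, u_t)\,dW_t, \qquad u_t \in A \quad \text{(strict control)}, \]
      \[ dX_t = \int_A b(t, X_t, a)\, q_t(da)\,dt + \int_A \sigma(t, X_t, a)\, M(da\,dt) \quad \text{(relaxed control)}, \]
      where $q$ is a measure-valued (relaxed) control process and $M$ is an orthogonal martingale measure with intensity $q_t(da)\,dt$; a strict control $u$ corresponds to the special case of Dirac measures $q_t = \delta_{u_t}$.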
      Keywords: stochastic differential equation; Peng maximum principle; maximum principle of Pontryagin type
