Existence of optimal controls for partially observed jump processes (Q1862261)
Language | Label | Description | Also known as
---|---|---|---
English | Existence of optimal controls for partially observed jump processes | scientific article |
Statements
Existence of optimal controls for partially observed jump processes (English)
11 March 2003
This paper deals with a partially observable stochastic control problem for an \({\mathbb R}^d\)-valued Markov jump process \((X_t)\), given an observation process \((Y_t)\) which is a point process. In a previous paper [\textit{C. Ceci} and \textit{A. Gerardi}, Stochastic Processes Appl. 78, No.~2, 245-260 (1998; Zbl 0934.60077)], the authors considered the case where \(Y_t\) counts the jumps of \((X_s)\) up to time \(t\). In the present work they address the more general situation where \((X_t,Y_t)\) is a pure jump Markov process whose components may be strongly dependent: no independence assumption is made on the intensities of \((X_t)\) and \((Y_t)\), and the two processes may have common jumps. The authors formulate the usual associated separated control problem, in which the first component is a measure-valued process satisfying a controlled filtering equation. By weakening the measurability assumptions on the control processes, they obtain a relaxed, generalized, totally observable control problem for which an optimal strategy is shown to exist. The equivalence between the partially observable control problem and the relaxed generalized separated control problem is then discussed: using a uniqueness result for the solutions of controlled filtering equations, it is proved that the infimum of the expected cost for the separated problem equals that of the original problem.
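The review does not reproduce the controlled filtering equation itself. Purely as an illustration of the kind of equation meant, and only in the simplest textbook setting (no common jumps between state and observation), the filter \(\pi_t(f) = E[f(X_t)\mid \mathcal F_t^Y]\) for a counting observation process with state-dependent intensity \(\lambda\) typically satisfies a Kushner--Stratonovich-type equation; the symbols \(L^{u}\) (controlled generator) and \(\lambda^{u}\) below are generic notation, not taken from the paper:

```latex
d\pi_t(f) \;=\; \pi_t\bigl(L^{u_t} f\bigr)\,dt
\;+\;\left( \frac{\pi_{t-}(\lambda^{u_t} f)}{\pi_{t-}(\lambda^{u_t})} - \pi_{t-}(f) \right)
\bigl( dY_t - \pi_{t-}(\lambda^{u_t})\,dt \bigr).
```

In the setting of the paper, where \((X_t)\) and \((Y_t)\) may have common jumps and strongly dependent intensities, the innovation term acquires additional correction terms; the equation above is only meant to indicate the measure-valued dynamics that the first component of the separated problem satisfies.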
jump processes
optimal stochastic control
partial observations
filtering
history sets
martingale problems