Examples of optimal control for partially observable systems: comparison, classical, and martingale methods
Publication: 3921124
DOI: 10.1080/17442508108833173
zbMath: 0467.93070
OpenAlex: W2035582493
MaRDI QID: Q3921124
Ioannis Karatzas, Václav E. Beneš
Publication date: 1981
Published in: Stochastics
Full work available at URL: https://doi.org/10.1080/17442508108833173
Keywords: dynamic programming; stochastic control; separation principle; filter; martingale approach; optimal feedback law; optimal feedback; Kalman-Bucy
MSC classification: Filtering in stochastic control theory (93E11); Dynamic programming (90C39); Optimal stochastic control (93E20)
Related Items
- Estimation and control for linear, partially observable systems with non-Gaussian initial distribution
- Separation principle for impulse control with partial information
- Existence of optimal controls for partially observed linear diffusions
Cites Work
- Information states for linear stochastic systems
- On Transforming a Certain Class of Stochastic Processes by Absolutely Continuous Substitution of Measures
- The Separation Principle in Stochastic Control via Girsanov Solutions
- Composition and invariance methods for solving some stochastic control problems
- On “predicted miss” stochastic control problems
- On the Separation Theorem of Stochastic Control
- Some Extensions of the Innovations Theorem*