Optimal Control of Stochastic Integrals and Hamilton–Jacobi–Bellman Equations. II
DOI: 10.1137/0320007
zbMATH Open: 0478.93070
OpenAlex: W4253755470
MaRDI QID: Q3936622
José-Luis Menaldi, P.-L. Lions
Publication date: 1982
Published in: SIAM Journal on Control and Optimization
Full work available at URL: https://doi.org/10.1137/0320007
MSC classification:
- Dynamic programming (90C39)
- Nonlinear elliptic equations (35J60)
- Fixed-point theorems (47H10)
- Semigroups of nonlinear operators (47H20)
- Existence of optimal solutions to problems involving randomness (49J55)
- Dynamic programming in optimal control and differential games (49L20)
- Optimal stochastic control (93E20)
Cited In (10)
- The Dirichlet problem for nonlinear second-order elliptic equations. II. Complex Monge-Ampère, and uniformly elliptic, equations
- On the Hamilton-Jacobi-Bellman equations
- Existence results for Bellman equations and maximum principles in unbounded domains
- Stochastic optimal control problem with infinite horizon driven by G-Brownian motion
- A Counterexample to C^{2,1} Regularity for Parabolic Fully Nonlinear Equations
- Wind time series modeling and stochastic optimal control for a grid-connected permanent magnet wind turbine generator
- Exit times for semimartingales under nonlinear expectation
- Nonlinear potentials for Hamilton-Jacobi-Bellman equations
- Generalized Hamilton-Jacobi-Bellman equations with Dirichlet boundary condition and stochastic exit time optimal control problem
- Optimal control of diffusion processes and Hamilton-Jacobi-Bellman equations part I: the dynamic programming principle and application