Hamilton-Jacobi theory over time scales and applications to linear-quadratic problems
DOI: 10.1016/j.na.2011.09.027 · zbMath: 1230.49023 · MaRDI QID: Q651162
Roman Šimon Hilscher, Vera Zeidan
Publication date: 8 December 2011
Published in: Nonlinear Analysis. Theory, Methods & Applications. Series A: Theory and Methods
Full work available at URL: https://doi.org/10.1016/j.na.2011.09.027
Keywords: Hamilton-Jacobi-Bellman equation; dynamic programming; linear-quadratic regulator problem; Riccati equation; Hamilton-Jacobi theory; value function; feedback controller; Bellman principle of optimality; nonlinear optimal control problem; verification theorem; time scale symplectic system; weak Pontryagin principle
49L20: Dynamic programming in optimal control and differential games
93C10: Nonlinear systems in control theory
49N10: Linear-quadratic optimal control problems
49K15: Optimality conditions for problems involving ordinary differential equations
34N05: Dynamic equations on time scales or measure chains
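As an illustrative sketch (not taken from the paper), the keywords above meet in the discrete-time special case of the time-scale setting: a finite-horizon linear-quadratic regulator solved by the backward Riccati recursion, which is the dynamic-programming (Hamilton-Jacobi-Bellman) construction of the value function and feedback controller. The scalar dynamics, names, and parameters below are assumptions chosen for the example.

```python
# Illustrative sketch (assumed example, not the paper's construction):
# finite-horizon discrete-time LQR via the backward Riccati recursion,
# i.e. the discrete instance of the HJB / dynamic-programming approach.
# Scalar dynamics x_{k+1} = a*x_k + b*u_k, stage cost q*x^2 + r*u^2,
# terminal cost qf*x^2.

def lqr_riccati(a, b, q, r, qf, N):
    """Return Riccati values P[0..N] and feedback gains K[0..N-1]."""
    P = [0.0] * (N + 1)
    K = [0.0] * N
    P[N] = qf                       # terminal value function V_N(x) = qf*x^2
    for k in range(N - 1, -1, -1):  # Bellman backward recursion
        S = r + b * b * P[k + 1]
        K[k] = b * P[k + 1] * a / S                               # feedback gain
        P[k] = q + a * a * P[k + 1] - (a * b * P[k + 1]) ** 2 / S  # Riccati step
    return P, K

def closed_loop_cost(a, b, q, r, qf, K, x0):
    """Simulate the feedback controller u_k = -K[k]*x_k and accumulate cost."""
    x, cost = x0, 0.0
    for k in range(len(K)):
        u = -K[k] * x
        cost += q * x * x + r * u * u
        x = a * x + b * u
    return cost + qf * x * x
```

The achieved closed-loop cost equals P[0] * x0**2, which is exactly the value-function identity V_0(x0) = P_0 x0^2 asserted by the Bellman principle of optimality; on a general time scale this recursion and the continuous Riccati differential equation become two faces of one Riccati equation, which is the unification studied in the paper.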