On the sufficiency of the Hamilton-Jacobi-Bellman equation for optimality of the controls in a linear optimal-time problem
DOI: 10.1016/0167-6911(86)90132-5
zbMATH: 0597.49014
OpenAlex: W2052361005
MaRDI QID: Q1079174
Publication date: 1986
Published in: Systems \& Control Letters
Full work available at URL: https://doi.org/10.1016/0167-6911(86)90132-5
Keywords: generalized gradients; Hamilton-Jacobi-Bellman (HJB) equation; autonomous finite-dimensional linear systems; optimal-time control
MSC classifications: Dynamic programming in optimal control and differential games (49L20); Fréchet and Gateaux differentiability in optimization (49J50); Linear systems in control theory (93C05); Existence theories for optimal control problems involving ordinary differential equations (49J15); Attainable sets, reachability (93B03); Optimality conditions for problems involving ordinary differential equations (49K15); Hamilton-Jacobi theories (49L99); Model systems in control theory (93C99)
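For orientation only, and not quoted from the paper itself: the following is a minimal LaTeX sketch of the standard textbook form of the HJB equation for the minimum-time problem with linear dynamics, the setting named by the keywords above. The symbols $A$, $B$, the control set $U$, the target set $\mathcal{T}$, and the minimum-time function $T$ are standard notation assumed here, not notation taken from this record.

% A standard (textbook) form of the HJB equation for the minimum-time
% problem with linear dynamics; an illustrative sketch, not the paper's statement.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
For the autonomous linear system $\dot x(t) = Ax(t) + Bu(t)$ with controls
$u(t) \in U$ (a compact set) and a target set $\mathcal{T}$ (e.g.\ the origin),
the minimum-time function
\[
  T(x) \;=\; \inf\bigl\{\, t \ge 0 : \exists\, u(\cdot),\ x(t; x_0 = x, u) \in \mathcal{T} \,\bigr\}
\]
satisfies, at points where it is differentiable, the Hamilton--Jacobi--Bellman
equation
\[
  \min_{u \in U}\ \bigl\langle \nabla T(x),\, Ax + Bu \bigr\rangle \;+\; 1 \;=\; 0,
  \qquad T(x) = 0 \ \text{for } x \in \mathcal{T}.
\]
Where $T$ fails to be differentiable, $\nabla T(x)$ is replaced by a
generalized gradient $\partial T(x)$, which is the nonsmooth setting suggested
by the ``generalized gradients'' keyword above.
\end{document}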
Related Items (3)
Cites Work
- On a generalized Bellman equation for the optimal-time problem
- Functional analysis and time optimal control
- Local Optimality Conditions and Lipschitzian Solutions to the Hamilton–Jacobi Equation