Minimal-time functions in problems without local controllability
Publication: 1321363
DOI: 10.1007/BF00940699
zbMath: 0794.93014
MaRDI QID: Q1321363
Publication date: 27 April 1994
Published in: Journal of Optimization Theory and Applications
93B05: Controllability
49J15: Existence theories for optimal control problems involving ordinary differential equations
93C15: Control/observation systems governed by ordinary differential equations
Related Items
- The Bellman equation for time-optimal control of noncontrollable, nonlinear systems
- Lower semicontinuous solutions of the Bellman equation for the minimum time problem
Cites Work
- On a generalized Bellman equation for the optimal-time problem
- On the sufficiency of the Hamilton-Jacobi-Bellman equation for optimality of the controls in a linear optimal-time problem
- Minimal time function and viscosity solutions
- The Bellman equation for time-optimal control of noncontrollable, nonlinear systems
- Functional analysis and time optimal control
- An Approximation Scheme for the Minimum Time Function
- Some Properties of Viscosity Solutions of Hamilton-Jacobi Equations
- A Boundary Value Problem for the Minimum-Time Function
- The Bang-Bang Principle for Linear Control Systems