The Bellman equation for time-optimal control of noncontrollable, nonlinear systems (Q1308768)

From MaRDI portal
scientific article
Language: English

    Statements

    The Bellman equation for time-optimal control of noncontrollable, nonlinear systems (English)
    6 January 1994
    The authors consider time-optimal control for nonlinear systems that are not small-time locally controllable around the target set. In this situation the optimal value function is, in general, not continuous. In order to obtain unique solutions, a transformation is applied that yields a Hamilton-Jacobi equation, which is then analyzed with the methods of viscosity solutions. Applications to verification theorems and to the stability of the minimum time function with respect to perturbations are given.
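    A minimal sketch of the kind of transformation referred to above, assuming the standard Kruzhkov-type rescaling commonly used for discontinuous minimum-time problems; the dynamics \(f\), control set \(A\), target \(\mathcal{T}\), and the specific change of variable are illustrative assumptions and need not coincide with the authors' exact formulation.

    % Minimum time to reach the target set \mathcal{T} under \dot y = f(y,a), y(0)=x:
    \[
      T(x) = \inf_{a(\cdot)}\,\inf\{\, t \ge 0 : y_x(t;a) \in \mathcal{T} \,\}.
    \]
    % T may be discontinuous (and equal to +\infty) without small-time local
    % controllability around \mathcal{T}; the bounded rescaling
    \[
      v(x) = 1 - e^{-T(x)} \in [0,1]
    \]
    % turns the formal Bellman equation for T into a Hamilton--Jacobi equation
    % with a zeroth-order term, interpreted in the viscosity sense:
    \[
      v(x) + \sup_{a \in A}\bigl\{ -f(x,a)\cdot Dv(x) \bigr\} - 1 = 0
      \quad \text{in } \mathbb{R}^n \setminus \mathcal{T},
      \qquad v = 0 \ \text{on } \partial\mathcal{T}.
    \]

    In this sketch the bounded function \(v\) replaces the possibly infinite and discontinuous \(T\) as the unknown, which is what allows uniqueness results in the viscosity framework.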
    viscosity solutions
    minimum time function
    perturbations

    Identifiers