Further results on the Bellman equation for optimal control problems with exit times and nonnegative Lagrangians
DOI: 10.1016/S0167-6911(03)00130-0 · zbMATH Open: 1157.49309 · MaRDI QID: Q2503527
Publication date: 21 September 2006
Published in: Systems & Control Letters
Mathematics Subject Classification:
- Viscosity solutions to Hamilton-Jacobi equations in optimal control and differential games (49L25)
- Control/observation systems governed by ordinary differential equations (93C15)
Cites Work
- Title not available
- Title not available
- Title not available
- On the Bellman equation for some unbounded control problems
- On the Bellman equation for infinite horizon problems with unbounded cost functional
- A generalization of Zubov's method to perturbed systems
- Regular Synthesis and Sufficiency Conditions for Optimality
- Optimal control and viscosity solutions of Hamilton-Jacobi-Bellman equations
- Theory of chattering control with applications to astronautics, robotics, economics, and engineering
- Maximal subsolutions for a class of degenerate Hamilton-Jacobi problems
- Stochastic and differential games. Theory and numerical methods. Dedicated to Prof. A. I. Subbotin
- A General Theorem on Local Controllability
- Optimality principles and representation formulas for viscosity solutions of Hamilton-Jacobi equations. II. Equations of control problems with state constraints
- Hamilton-Jacobi Equations with Singular Boundary Conditions on a Free Boundary and Applications to Differential Games
- Viscosity Solutions of the Bellman Equation for Exit Time Optimal Control Problems with Non-Lipschitz Dynamics
- Viscosity Solutions of the Bellman Equation for Exit Time Optimal Control Problems with Vanishing Lagrangians
- Nonlinear Optimal Control with Infinite Horizon for Distributed Parameter Systems and Stationary Hamilton–Jacobi Equations
- Bounded-from-below solutions of the Hamilton-Jacobi equation for optimal control problems with exit times: Vanishing Lagrangians, eikonal equations, and shape-from-shading
- Pursuit–Evasion Problems and Viscosity Solutions of Isaacs Equations
- Optimality principles and representation formulas for viscosity solutions of Hamilton-Jacobi equations. I: Equations of unbounded and degenerate control problems without uniqueness
- Meagre functions and asymptotic behaviour of dynamical systems
Cited In (5)
- Optimality principles and uniqueness for Bellman equations of unbounded control problems with discontinuous running cost
- The value function of an asymptotic exit-time optimal control problem
- The geometry of the solution set of nonlinear optimal control problems
- Financing policies via stochastic control: a dynamic programming approach
- A strong comparison result for the Bellman equation arising in stochastic exit time control problems and its applications