Qualitative properties of trajectories of control systems: a survey
DOI: 10.1007/BF02254655 · zbMath: 0951.49003 · MaRDI QID: Q1972685
Yuri S. Ledyaev, Peter R. Wolenski, Frank H. Clarke, Ronald J. Stern
Publication date: 13 April 2000
Published in: Journal of Dynamical and Control Systems
optimal control; nonsmooth analysis; Hamilton-Jacobi equation; necessary conditions; equilibria; feedback synthesis; weak invariance; strong invariance; Hamiltonian inclusions; local attainability of a set; Lyapunov stability of invariant sets; monotonicity along trajectories; verification functions
49L20: Dynamic programming in optimal control and differential games
49N35: Optimal feedback synthesis
49J52: Nonsmooth analysis
34A60: Ordinary differential inclusions
93B03: Attainable sets, reachability
49K15: Optimality conditions for problems involving ordinary differential equations
49-02: Research exposition (monographs, survey articles) pertaining to calculus of variations and optimal control
Cites Work
- Hamilton-Jacobi equations: Viscosity solutions and generalized gradients
- Generalization of the main equation of differential game theory
- Characterizations of the values of differential games
- New results on the relationship between dynamic programming and the maximum principle
- Differential inclusions with free time
- The Hamilton-Jacobi equation. A global approach
- Monotone trajectories of differential inclusions and functional differential inclusions with memory
- Construction of optimal feedback controls
- Critical point theory and Hamiltonian systems
- Value function and optimality conditions for semilinear control problems
- The maximum principle, Bellman's equation, and Carathéodory's work
- Uniqueness of solutions to the Hamilton-Jacobi equation: A system theoretic proof
- Sufficiency conditions with minimal regularity assumptions
- Euler characteristic and fixed-point theorems
- Control of systems to sets and their interiors
- Proximal analysis and minimization principles
- Necessary conditions for functional differential inclusions
- A decoupling principle in the calculus of variations
- Stability in general control systems
- The fixed point theory of multi-valued mappings in topological vector spaces
- Local Optimality Conditions and Lipschitzian Solutions to the Hamilton–Jacobi Equation
- Hamilton-Jacobi Equations with Singular Boundary Conditions on a Free Boundary and Applications to Differential Games
- Semicontinuous Viscosity Solutions For Hamilton–Jacobi Equations With Convex Hamiltonians
- The Adjoint Arc in Nonsmooth Optimization
- Perturbed optimal control problems
- Extensions of subgradient calculus with applications to optimization
- Periodic Solutions of Hamilton's Equations and Local Minima of the Dual Action
- Viscosity Solutions of Hamilton-Jacobi Equations
- The Value Function in Optimal Control: Sensitivity, Controllability, and Time-Optimality
- Hamiltonian Analysis of the Generalized Problem of Bolza
- Parameter sensitivity in stochastic optimal control
- The Relationship between the Maximum Principle and Dynamic Programming
- Differential games. Approximation and formal models
- Optimal Control and Semicontinuous Viscosity Solutions
- Second-Order Hamilton–Jacobi Equations in Infinite Dimensions
- The Sensitivity of Optimal Control Problems to Time Delay
- User’s guide to viscosity solutions of second order partial differential equations
- Necessary and sufficient optimality conditions for control of piecewise deterministic Markov processes
- Generalized Gradients and Applications
- Monotone Invariant Solutions to Differential Inclusions
- Optimal Control and the True Hamiltonian
- Subgradient Criteria for Monotonicity, The Lipschitz Condition, and Convexity
- Optimal Control of Unbounded Differential Inclusions
- Lipschitzian stability of constraint systems and generalized equations
- Generalized Solutions of the Hamilton–Jacobi Equation of Stochastic Control
- Mean Value Inequalities in Hilbert Space
- Mean Value Inequalities
- Lyapunov stability theory of nonsmooth systems
- Dualization of subgradient conditions for optimality
- Convex Duality and Nonlinear Optimal Control
- Lower Semicontinuous Solutions of Hamilton–Jacobi–Bellman Equations
- Generalized one-sided estimates for solutions of Hamilton-Jacobi equations and applications
- Optimal Multiprocesses
- Optimal Feedback Controls
- On Hamiltonian Flows and Symplectic Transformations
- Fixed points and equilibria in nonconvex sets
- Proximal Analysis and Approximate Subdifferentials
- Discontinuous viscosity solutions of first-order Hamilton-Jacobi equations: a guided visit
- Viability theory