Optimal control and differential games with measures
From MaRDI portal
Publication:4275616
DOI: 10.1016/0362-546X(93)90019-O
zbMath: 0799.90139
OpenAlex: W2015973428
MaRDI QID: Q4275616
R. Jensen, José Luis Menaldi, Emmanuel Nicholas Barron
Publication date: 17 November 1994
Published in: Nonlinear Analysis: Theory, Methods & Applications
Full work available at URL: https://doi.org/10.1016/0362-546x(93)90019-o
Differential games (aspects of game theory) (91A23)
Perturbations in control/observation systems (93C73)
Related Items (9)
- Approximation of control problems involving ordinary and impulsive controls
- \((L^\infty+\mathrm{Bolza})\) control problems as dynamic differential games
- Continuity of the upper and lower value of slow growth differential games
- On feedback strengthening of the maximum principle for measure differential equations
- Invariant solutions of differential games with measures: a discontinuous time reparameterization approach
- State constrained control problems with neither coercivity nor \(L^1\) bounds on the controls
- Relaxation of minimax optimal control problems with infinite horizon
- Comparison theorems for viscosity solutions of a system of quasivariational inequalities with application to optimal control with switching costs
- Semicontinuous viscosity solutions to mixed boundary value problems with degenerate convex Hamiltonians
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Perron's method for Hamilton-Jacobi equations
- Existence results for first order Hamilton Jacobi equations
- Uniqueness of viscosity solutions of Hamilton-Jacobi equations revisited
- The Bellman equation for minimizing the maximum cost
- On singular stochastic control problems for diffusion with jumps
- Some Properties of Viscosity Solutions of Hamilton-Jacobi Equations
- Differential games with maximum cost
- Viscosity Solutions for the Monotone Control Problem
- Deterministic Impulse Control Problems
- Probabilistic aspects of finite-fuel stochastic control
- Existence Theorems for Optimal Control and Calculus of Variations Problems Where the States Can Jump
- Additive Control of Stochastic Linear Systems with Finite Horizon
- Viscosity Solutions of Hamilton-Jacobi Equations
- Discontinuous solutions of deterministic optimal stopping time problems
- Optimal Switching for Ordinary Differential Equations
- A Maximum Principle for Optimal Processes with Discontinuous Trajectories
- Monotone Control of a Damped Oscillator Under Random Perturbations
- The Pontryagin Maximum Principle From Dynamic Programming and Viscosity Solutions to First-Order Partial Differential Equations
- An Extended Pontryagin Principle for Control Systems whose Control Laws Contain Measures
- Optimal Control Theory for Nonlinear Vector Differential Equations Containing Measures
This page was built for publication: Optimal control and differential games with measures