A stability property in mean field type differential games
From MaRDI portal
Publication:1998626
Abstract: The paper is concerned with the feedback approach to deterministic mean field type differential games. Previously, it was shown that suboptimal strategies in a mean field type differential game can be constructed from functions of time and probability measure satisfying a stability condition. This property realizes the dynamic programming principle for the constant control of one player. We present the infinitesimal form of this condition, which involves analogs of directional derivatives. In particular, we obtain a characterization of the value function of the deterministic mean field type differential game in terms of directional derivatives and the set of directions feasible by virtue of the dynamics of the game.
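To illustrate the shape of such an infinitesimal stability condition, here is a purely schematic rendering; the symbols \(V\), \(\mathcal{F}\), \(\partial^{+}\), and \(\sigma\) are illustrative placeholders and do not reproduce the paper's actual notation or result. For a candidate value function \(V\colon [0,T]\times\mathcal{P}(\mathbb{R}^d)\to\mathbb{R}\), a directional derivative along a feasible direction \(\nu\) can be defined by a lower limit of difference quotients along the dynamics:

\[
\partial^{+}V(t,m;\nu) \;=\; \liminf_{s \downarrow 0} \frac{V(t+s,\,m_s^{\nu}) - V(t,m)}{s},
\]

where \(m_s^{\nu}\) denotes the flow of measures generated by the dynamics in direction \(\nu\). A stability condition of dynamic-programming type then asks that, for every frozen control \(u\) of one player, some feasible direction does not increase \(V\):

\[
\min_{\nu \in \mathcal{F}(t,m,u)} \partial^{+} V(t,m;\nu) \;\le\; 0, \qquad V(T,m)=\sigma(m),
\]

with \(\mathcal{F}(t,m,u)\) the set of directions feasible by virtue of the dynamics and \(\sigma\) the terminal payoff. This is only a sketch of the general pattern; the paper's precise condition should be consulted for the exact formulation.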
Cites work
- scientific article, zbMATH DE number 3862969 (no title available)
- scientific article, zbMATH DE number 5016968 (no title available)
- scientific article, zbMATH DE number 192835 (no title available)
- scientific article, zbMATH DE number 733957 (no title available)
- scientific article, zbMATH DE number 1113627 (no title available)
- scientific article, zbMATH DE number 2152346 (no title available)
- scientific article, zbMATH DE number 3400017 (no title available)
- A general stochastic maximum principle for SDEs of mean-field type
- A maximum principle for SDEs of mean-field type
- Bellman equation and viscosity solutions for mean-field stochastic control problem
- Dynamic Programming for Optimal Control of Stochastic McKean--Vlasov Dynamics
- Dynamic programming for mean-field type control
- Existence of Saddle Points in Differential Games
- Existence of optimal controls for systems governed by mean-field stochastic differential equations
- Forward-backward stochastic differential equations and controlled McKean-Vlasov dynamics
- Generalized control systems in the space of probability measures
- Hamilton-Jacobi equations in the Wasserstein space
- Infinite dimensional analysis. A hitchhiker's guide.
- Krasovskii-Subbotin approach to mean field type differential games
- Mayer control problem with probabilistic uncertainty on initial positions
- Mean field games and mean field type control theory
- On differentiability in the Wasserstein space and well-posedness for Hamilton-Jacobi equations
- Optimal control and viscosity solutions of Hamilton-Jacobi-Bellman equations
- Optimal control and zero-sum stochastic differential game problems of mean-field type
- The master equation for large population equilibriums
- The master equation in mean field theory
- Time-Optimal Control Problem in the Space of Probability Measures
- Values in differential games
- Viability theorem for deterministic mean field type control systems
- Viability theory
- Zero-sum stochastic differential games of generalized McKean-Vlasov type
Cited in (4)
- Ordering stability of Nash equilibria for a class of differential games
- Krasovskii-Subbotin approach to mean field type differential games
- Lattice approximations of the first-order mean field type differential games
- Feedback strategies in a game-theoretical control problem for a nonlocal continuity equation