Discounted and average Markov decision processes with unbounded rewards: New conditions
From MaRDI portal
Recommendations
- scientific article; zbMATH DE number 1536370
- scientific article; zbMATH DE number 700091
- scientific article; zbMATH DE number 1159486
- New discount and average optimality conditions for continuous-time Markov decision processes
- A new condition for the existence of optimal stationary policies in average cost Markov decision processes
Cites work
Cited in (14)
- Markov decision processes with state-dependent discount factors and unbounded rewards/costs
- Necessary and sufficient conditions for a bounded solution to the optimality equation in average reward Markov decision chains
- A survey of recent results on continuous-time Markov decision processes (with comments and rejoinder)
- On the reduction of total-cost and average-cost MDPs to discounted MDPs
- Conditions for the uniqueness of optimal policies of discounted Markov decision processes
- An unbounded Berge's minimum theorem with applications to discounted Markov decision processes
- Analysis for some properties of discrete time Markov decision processes
- scientific article; zbMATH DE number 4170671
- Unbounded cost Markov decision processes with limsup and liminf average criteria: new conditions
- On the existence of relative values for undiscounted Markovian decision processes with a scalar gain rate
- Application of average dynamic programming to inventory systems
- scientific article; zbMATH DE number 3900544
- A note on the vanishing interest rate approach in average Markov decision chains with continuous and bounded costs
- Another set of conditions for average optimality in Markov control processes