Average optimality in dynamic programming on Borel spaces -- unbounded costs and controls
From MaRDI portal
Publication: 1190402
DOI: 10.1016/0167-6911(91)90069-Q
zbMath: 0771.90098
MaRDI QID: Q1190402
Publication date: 26 September 1992
Published in: Systems & Control Letters
Borel state space; unbounded rewards; average-optimal policy; Borel action space; discrete-time Markovian decision process
90C39: Dynamic programming
Related Items
- Fatou's Lemma in Its Classical Form and Lebesgue's Convergence Theorems for Varying Measures with Applications to Markov Decision Processes
- Average Cost Optimality Inequality for Markov Decision Processes with Borel Spaces and Universally Measurable Policies
- On the Minimum Pair Approach for Average Cost Markov Decision Processes with Countable Discrete Action Spaces and Strictly Unbounded Costs
- Examples concerning Abel and Cesàro limits
- Application of average dynamic programming to inventory systems
- The average cost optimality equation for Markov control processes on Borel spaces
- Weak conditions for average optimality in Markov control processes
- Value iteration in average cost Markov control processes on Borel spaces
- Average Cost Markov Decision Processes with Weakly Continuous Transition Probabilities
Cites Work
- On strong average optimality of Markov decision processes with unbounded costs
- Comparing recent assumptions for the existence of average optimal stationary policies
- Adaptive Markov control processes
- Average cost optimal policies for Markov control processes with Borel state space and unbounded costs
- On the compactness method in general ergodic impulsive control of Markov processes
- Control of Markov Chains with Long-Run Average Cost Criterion: The Dynamic Programming Equations
- Average Cost Optimal Stationary Policies in Infinite State Markov Decision Processes with Unbounded Costs
- Average Optimality in Dynamic Programming with General State Space
- Optimal control of service rates in networks of queues