Constrained Markov decision processes with first passage criteria
Publication: 363565
DOI: 10.1007/s10479-012-1292-1
zbMATH Open: 1271.90104
OpenAlex: W2079077148
MaRDI QID: Q363565
Authors: Qingda Wei, Xianping Guo, Yonghui Huang
Publication date: 3 September 2013
Published in: Annals of Operations Research
Full work available at URL: https://doi.org/10.1007/s10479-012-1292-1
Recommendations
- First passage Markov decision processes with constraints and varying discount factors
- Constrained optimality for first passage criteria in semi-Markov decision processes
- Constrained continuous-time Markov decision processes with average criteria
- Total reward criteria for unconstrained/constrained continuous-time Markov decision processes
- Constrained Continuous-Time Markov Control Processes with Discounted Criteria
Keywords: first passage time; Markov decision processes; target set; constrained optimal policy; expected first passage reward/cost
Cites Work
- Continuous-time Markov decision processes. Theory and applications
- Finite state Markovian decision processes
- Convergence of the optimal values of constrained Markov control processes
- Constrained Markov control processes in Borel spaces: the discounted case
- Optimal pension funding dynamics over infinite control horizon when stochastic rates of return are stationary
- Constrained Discounted Markov Decision Chains
- Constrained Average Cost Markov Control Processes in Borel Spaces
- Constrained Markov decision processes with compact state and action spaces: the average case
- Constrained Continuous-Time Markov Control Processes with Discounted Criteria
- Discounting the distant future: How much do uncertain rates increase valuations?
- Optimal policies for controlled Markov chains with a constraint
- Optimization models for the first arrival target distribution function in discrete time
- Constrained denumerable state non-stationary MDPs with expected total reward criterion
- When to refinance a mortgage: a dynamic programming approach
- Optimal risk probability for first passage models in semi-Markov decision processes
- First passage models for denumerable semi-Markov decision processes with nonnegative discounted costs
- Constrained continuous-time Markov decision processes with average criteria
- Deterministic optimal policies for Markov control processes with pathwise constraints
- Stochastic Target Hitting Time and the Problem of Early Retirement
- Markov decision processes with distribution function criterion of first-passage time
- An actor-critic algorithm with function approximation for discounted cost constrained Markov decision processes
- On discounted dynamic programming with constraints
Cited In (10)
- Zero-sum Markov games with random state-actions-dependent discount factors: existence of optimal strategies
- Constrained optimality for first passage criteria in semi-Markov decision processes
- First passage risk probability optimality for continuous time Markov decision processes
- First passage Markov decision processes with constraints and varying discount factors
- Discrete-time zero-sum Markov games with first passage criteria
- Discrete-time Markov decision processes with first passage models
- Convergence of Markov decision processes with constraints and state-action dependent discount factors
- Markov decision processes with distribution function criterion of first-passage time
- Finite approximation of the first passage models for discrete-time Markov decision processes with varying discount factors
- The Lagrange and the vanishing discount techniques to controlled diffusions with cost constraints