Tests for nonergodicity of denumerable continuous time Markov processes
From MaRDI portal
Publication: Q931769
DOI: 10.1016/j.camwa.2007.07.003 · zbMath: 1155.60030 · OpenAlex: W2143692740 · MaRDI QID: Q931769
Publication date: 26 June 2008
Published in: Computers & Mathematics with Applications
Full work available at URL: https://doi.org/10.1016/j.camwa.2007.07.003
Classification (MSC):
- Continuous-time Markov processes on general state spaces (60J25)
- Queueing theory (aspects of probability theory) (60K25)
- Queues and service in operations research (90B22)
Related Items (4)
- Stability of a queue with discriminatory random order service discipline and heterogeneous servers
- On nonergodicity for nonparametric autoregressive models
- A survey of retrial queueing systems
- Stability of a multi-class multi-server retrial queueing system with service times depending on classes and servers
Cites Work
- Non-ergodicity criteria for denumerable continuous time Markov processes.
- Continuous-time Markov chains. An applications-oriented approach
- Ergodicity of the BMAP/PH/s/s+K retrial queue with PH-retrial times
- A matrix continued fraction algorithm for the multiserver repeated order queue.
- Technical Note—Mean Drifts and the Non-Ergodicity of Markov Chains
- On sufficient conditions for ergodicity of multichannel queueing systems with repeated calls
- Tests for the Nonergodicity of Multidimensional Markov Chains
- A sufficient condition of nonergodicity of a Markov chain (Corresp.)
- Criteria for ergodicity, exponential ergodicity and strong ergodicity of Markov processes
- Sufficient conditions for regularity, recurrence and ergodicity of Markov processes
- The MAP/PH/1 retrial queue
- Applied Probability and Queues
- Topics in the Constructive Theory of Countable Markov Chains
- Retrial Queues
- Some Conditions for Ergodicity and Recurrence of Markov Chains
- On the Stochastic Matrices Associated with Certain Queuing Processes