Some limit theorems for a general Markov process


Publication: 5643390

DOI: 10.1007/BF00531804
zbMath: 0234.60086
MaRDI QID: Q5643390

Naresh C. Jain

Publication date: 1966

Published in: Zeitschrift für Wahrscheinlichkeitstheorie und verwandte Gebiete




Related Items (24)

Unnamed Item
Some ratio limit Theorems for a general state space Markov Process
Contributions to Doeblin's theory of Markov processes
Opérateurs potentiels des chaînes et des processus de Markov irréductibles
On the global limit behaviour of Markov chains and of general nonsingular Markov processes
Pointwise convergence of the iterates of a Harris-recurrent Markov operator
Isomorphism and Approximation of General State Markov Processes
Asymptotic decomposition of substochastic operators and semigroups
The asymptotic distributional behaviour of transformations preserving infinite measures
Unnamed Item
Notes on 1-recurrent Markov chains
Marches récurrentes au sens de Harris sur les groupes localement compacts. I
Unnamed Item
Theorems for conditional expectations, with applications to Markov processes
On a switch-over policy for controlling the workload in a queueing system with two constant service rates and fixed switch-over costs
Inventory control with two switch-over levels for a class of M/G/1 queueing systems with variable arrival and service rate
Upper Bounds for Ergodic Sums of Infinite Measure Preserving Transformations
Ergodic behaviour of stochastic parabolic equations
Some limit theorems for Markov processes
Mixed ratio limit theorems for Markov processes
Ratio limit theorems for Markov processes
Markov chains recurrent in the sense of Harris
Some results in Doeblin’s theory of Markov chains
Doeblin's and Harris' theory of Markov processes




This page was built for publication: Some limit theorems for a general Markov process