Stable states of perturbed Markov chains
From MaRDI portal
Publication:4608577
Abstract: Given an infinitesimal perturbation of a discrete-time finite Markov chain, we seek the states that are stable despite the perturbation, i.e. the states whose weights in the stationary distributions can be bounded away from 0 as the noise fades away. Chemists, economists, and computer scientists have been studying irreducible perturbations built with exponential maps. Under these assumptions, Young proved the existence of and computed the stable states in cubic time. We fully drop these assumptions, generalize Young's technique, and show that stability is decidable as long as f ∈ O(g) is. Furthermore, if the perturbation maps (and their multiplications) satisfy f ∈ O(g) or g ∈ O(f), we prove the existence of and compute the stable states and the metastable dynamics at all time scales where some states vanish. Conversely, if the big-O assumption does not hold, we build a perturbation with these maps and no stable state. Our algorithm also runs in cubic time despite the general assumptions and the additional work. Proving the correctness of the algorithm relies on new or rephrased results in Markov chain theory, and on algebraic abstractions thereof.
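To illustrate the notion of a stable state, here is a minimal numerical sketch (not the paper's algorithm) in the classical exponential-perturbation setting: a hypothetical 2-state chain where leaving state 0 has probability eps^2 and leaving state 1 has probability eps. As the noise eps fades, the stationary weight of state 0 stays bounded away from 0 (it is stable), while the weight of state 1 vanishes.

```python
import numpy as np

def stationary(P):
    """Stationary distribution of a row-stochastic matrix P,
    computed as the left eigenvector for eigenvalue 1."""
    w, v = np.linalg.eig(P.T)
    k = np.argmin(np.abs(w - 1.0))   # index of the eigenvalue closest to 1
    pi = np.real(v[:, k])
    return pi / pi.sum()

def perturbed_chain(eps):
    # Hypothetical exponential perturbation (Young's setting):
    # transition 0 -> 1 has probability eps**2, transition 1 -> 0 has probability eps.
    return np.array([[1 - eps**2, eps**2],
                     [eps,        1 - eps]])

for eps in [0.1, 0.01, 0.001]:
    pi = stationary(perturbed_chain(eps))
    print(f"eps={eps}: pi={pi}")
# The weight of state 0 tends to 1 (stable state),
# while the weight of state 1 vanishes at rate eps.
```

For a 2-state chain with p01 = eps^2 and p10 = eps, the stationary distribution is (1/(1+eps), eps/(1+eps)) in closed form, which the numerical output reproduces; this is the simplest instance of a state "vanishing" at a time scale as described in the abstract.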
Cited in (5):
- A stable recursion for the steady state vector in Markov chains of M/G/1 type
- On the perturbation of Markov chains with nearly transient states
- Scientific article, zbMATH DE number 7446610 (no title available)
- Scientific article, zbMATH DE number 1025906 (no title available)
- Strong stability and perturbation bounds for discrete Markov chains