Waiting times and stopping probabilities for patterns in Markov chains
From MaRDI portal
Publication:1747367
Abstract: Suppose that \(\mathscr{C}\) is a finite collection of patterns. Observe a Markov chain until one of the patterns in \(\mathscr{C}\) occurs as a run. This time is denoted by \(\tau\). In this paper, we aim to give an easy way to calculate the mean waiting time \(E(\tau)\) and the stopping probabilities \(P(\tau = \tau_\Lambda)\) with \(\Lambda \in \mathscr{C}\), where \(\tau_\Lambda\) is the waiting time until the pattern \(\Lambda\) appears as a run.
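As an illustration of the quantity \(E(\tau_\Lambda)\) for a single pattern (not the paper's own method, which is not reproduced on this page), the mean waiting time can be computed by the standard Markov chain embedding: track the longest suffix of the observations that is a prefix of the pattern, and solve the resulting first-step equations. The function names below are for illustration only.

```python
import numpy as np

def kmp_step(pattern, j, s):
    """Longest k such that pattern[:k] is a suffix of pattern[:j] + (s,)."""
    t = pattern[:j] + (s,)
    for k in range(min(len(pattern), len(t)), -1, -1):
        if k == 0 or t[-k:] == pattern[:k]:
            return k

def mean_waiting_time(P, pi, pattern):
    """Expected number of steps until `pattern` first appears as a run in a
    Markov chain with transition matrix P and initial distribution pi."""
    n, m = len(P), len(pattern)
    # Embedded states: (last symbol s, match progress j), j < m; if j >= 1
    # the last symbol is forced to be pattern[j-1]. Progress m is absorbing.
    states = [(s, j) for s in range(n) for j in range(m)
              if j == 0 or pattern[j - 1] == s]
    idx = {st: i for i, st in enumerate(states)}
    # First-step analysis: E[st] = 1 + sum_{s2} P[s][s2] * E[next state].
    A = np.eye(len(states))
    b = np.ones(len(states))
    for (s, j), i in idx.items():
        for s2 in range(n):
            j2 = kmp_step(pattern, j, s2)
            if j2 < m:  # absorbed states contribute 0
                A[i, idx[(s2, j2)]] -= P[s][s2]
    E = np.linalg.solve(A, b)
    # The first symbol is drawn from pi; condition on it and average.
    total = 0.0
    for s in range(n):
        j = kmp_step(pattern, 0, s)
        total += pi[s] * (1.0 if j == m else 1.0 + E[idx[(s, j)]])
    return total
```

For an i.i.d. fair coin (a special case of a two-state Markov chain), this recovers the classical values: the pattern \(11\) has mean waiting time 6, while \(10\) has mean waiting time 4.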
Cites work
- A martingale approach to the study of occurrence of sequence patterns in repeated experiments
- Double-scan statistics
- Gambling Teams and Waiting Times for Patterns in Two-State Markov Chains
- On occurrence of patterns in Markov chains: Method of gambling teams
- On probability generating functions for waiting time distributions of compound patterns in a sequence of multistate trials
- Pattern matching probabilities and paradoxes as a new variation on Penney's coin game
- Stopping Probabilities for Patterns in Markov Chains
- String overlaps, pattern matching, and nontransitive games
- The Occurrence of Sequence Patterns in Repeated Dependent Experiments
- The occurrence of sequence patterns in repeated experiments and hitting times in a Markov chain
Cited in (7)
- Stopping Probabilities for Patterns in Markov Chains
- Oscillation properties of expected stopping times and stopping probabilities for patterns consisting of consecutive states in Markov chains
- Patterns generated by \(m\)th-order Markov chains
- The occurrence of sequence patterns in ergodic Markov chains
- scientific article; zbMATH DE number 6719657
- Computing the pattern waiting time: a revisit of the intuitive approach
- scientific article; zbMATH DE number 2134172