High-order extensions of the Double Chain Markov Model
From MaRDI portal
Publication:3147435
DOI: 10.1081/STM-120004464
zbMath: 1006.60071
MaRDI QID: Q3147435
Publication date: 12 November 2002
Published in: Stochastic Models
Keywords: Viterbi algorithm ⋮ forward-backward algorithm ⋮ double chain Markov model ⋮ Baum-Welch algorithm ⋮ mixture transition distribution model
Time series, auto-correlation, regression, etc. in statistics (GARCH) (62M10)
Applications of Markov chains and discrete-time Markov processes on general state spaces (social mobility, learning theory, industrial processes, etc.) (60J20)
Related Items (8)
The mixture transition distribution model for high-order Markov chains and non-Gaussian time series ⋮ General framework and model building in the class of hidden mixture transition distribution models ⋮ Efficient Bayesian estimation of the multivariate double chain Markov model ⋮ The table auto-regressive moving-average model for (categorical) stationary series: statistical properties (causality; from the all random to the conditional random) ⋮ Modeling the coupled return-spread high frequency dynamics of large tick assets ⋮ Efficient backward decoding of high-order hidden Markov models ⋮ Parsimonious high-order Markov chain models for the estimation of cryptographic generators ⋮ Lumps, breathers, interactions and rogue wave solutions for a stochastic gene evolution in double chain deoxyribonucleic acid system
Cites Work
- Estimation in the Mixture Transition Distribution Model
- The ergodic theory of Markov chains in random environments
- On Some Criteria for Estimating the Order of a Markov Chain
- Finite-horizon dynamic optimisation when the terminal reward is a concave functional of the distribution of the final state
- Estimation and Modelling Repeated Patterns in High Order Markov Chains with the Mixture Transition Distribution Model
- The Double Chain Markov Model