Approximating a diffusion by a finite-state hidden Markov model
From MaRDI portal
Publication:2360239
Abstract: For a wide class of continuous-time Markov processes, including all irreducible hypoelliptic diffusions evolving on an open, connected subset of \(\mathbb{R}^n\), the following are shown to be equivalent: (i) the process satisfies (a slightly weaker version of) the classical Donsker-Varadhan conditions; (ii) the transition semigroup of the process can be approximated by a finite-state hidden Markov model, in a strong sense in terms of an associated operator norm; (iii) the resolvent kernel of the process is 'v-separable', that is, it can be approximated arbitrarily well in operator norm by finite-rank kernels. Under any (and hence all) of these conditions, the Markov process is shown to have a purely discrete spectrum on a naturally associated weighted space.
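The finite-rank approximation described in (ii) and (iii) can be illustrated with a minimal sketch (not the paper's construction): discretizing a one-dimensional Ornstein-Uhlenbeck diffusion onto a grid yields a stochastic matrix, i.e. a finite-rank approximation of the transition kernel \(P_t\), whose action on test functions can be checked against the exact semigroup.

```python
import numpy as np

# Hypothetical illustration only: discretize the OU diffusion
#   dX = -theta*X dt + sigma*dW
# onto a finite grid, producing a finite-state (finite-rank, matrix)
# approximation of its transition kernel P_t.

theta, sigma, t = 1.0, 0.5, 0.3
grid = np.linspace(-4.0, 4.0, 201)            # finite state space

# Exact OU transition: X_t | X_0 = x is Gaussian with these moments.
mean = grid * np.exp(-theta * t)
var = sigma**2 * (1.0 - np.exp(-2.0 * theta * t)) / (2.0 * theta)

# Row i: transition density from grid[i], renormalized over the grid.
P = np.exp(-(grid[None, :] - mean[:, None]) ** 2 / (2.0 * var))
P /= P.sum(axis=1, keepdims=True)             # stochastic matrix

# Apply the matrix to f(x) = x; exactly, E[X_t | X_0 = x] = x*exp(-theta*t),
# so the finite-rank operator should reproduce this on the grid interior.
f = grid
approx = P @ f
exact = mean
err = np.max(np.abs(approx - exact)[20:-20])  # ignore boundary truncation
print(f"max interior error: {err:.2e}")
```

The paper's results concern convergence of such approximations in a weighted operator norm; this sketch only checks pointwise agreement on one test function.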
Recommendations
- Approximation of stationary processes by hidden Markov models
- Finite-dimensional models for hidden Markov chains
- Stochastic approximations for finite-state Markov chains
- Approximating the Variance of the Conditional Probability of the State of a Hidden Markov Model
- Hidden Markov Chain Filtering for a Jump Diffusion Model
- On approximation of smoothing probabilities for hidden Markov models
- On the approximation quality of Markov state models
- scientific article; zbMATH DE number 2144803
Cites work
- scientific article; zbMATH DE number 3940343
- scientific article; zbMATH DE number 3951715
- scientific article; zbMATH DE number 475325
- scientific article; zbMATH DE number 739283
- scientific article; zbMATH DE number 1158743
- scientific article; zbMATH DE number 1515832
- scientific article; zbMATH DE number 847787
- Asymptotic evaluation of certain Markov process expectations for large time, II
- Asymptotic evaluation of certain Markov process expectations for large time, III
- Asymptotic evaluation of certain Markov process expectations for large time, IV
- Comparing Markov chains: aggregation and precedence relations applied to sets of states, with applications to assemble-to-order systems
- Discrete Dynamic Programming with Sensitive Discount Optimality Criteria
- Essential spectral radius for Markov semigroups. I: Discrete time case
- Exit probabilities and optimal stochastic control
- Exponential and uniform ergodicity of Markov processes
- Fluctuations of the entropy production in anharmonic chains
- General Irreducible Markov Chains and Non-Negative Operators
- Inequalities in Theorems of Ergodicity and Stability for Markov Chains with Common Phase Space. I
- Large and moderate deviations and exponential convergence for stochastic damping Hamiltonian systems
- Large deviations asymptotics and the spectral theory of multiplicatively regular Markov processes
- Large deviations for stochastic processes
- Markov chains and stochastic stability
- Martingale problems for large deviations of Markov processes
- Multiplicative ergodicity and large deviations for an irreducible Markov chain
- Optimal Kullback-Leibler Aggregation via Spectral Theory of Markov Chains
- Spectral gap of positive operators and applications
- Spectral theory and limit theorems for geometrically ergodic Markov processes
- Stability of Markovian processes II: continuous-time processes and sampled chains
- Stability of Markovian processes III: Foster–Lyapunov criteria for continuous-time processes
- Transportation-information inequalities for Markov processes
- Two-sided bounds for degenerate processes with densities supported in subsets of \(\mathbb R^N\)
- Unsupervised learning by probabilistic latent semantic analysis
Cited in (3)