From one dimensional diffusions to symmetric Markov processes
DOI: 10.1016/j.spa.2010.01.010 · zbMath: 1221.60113 · OpenAlex: W1966107808 · MaRDI QID: Q972806
Publication date: 21 May 2010
Published in: Stochastic Processes and their Applications
Full work available at URL: https://doi.org/10.1016/j.spa.2010.01.010
Mathematics Subject Classification: Continuous-time Markov processes on general state spaces (60J25) ⋮ Dirichlet forms (31C25) ⋮ Diffusion processes (60J60) ⋮ Boundary theory for Markov processes (60J50)
Related Items (9)
- Regular subspaces of skew product diffusions
- Reflections at infinity of time changed RBMs on a domain with Liouville branches
- Jump-type Hunt processes generated by lower bounded semi-Dirichlet forms
- Two methods for studying the response and the reliability of a fractional stochastic dynamical system
- On general boundary conditions for one-dimensional diffusions with symmetry
- On symmetric linear diffusions
- One-point reflection
- On the Dirichlet form of three-dimensional Brownian motion conditioned to hit the origin
- Silverstein extension and Fukushima extension
Cites Work
- On unique extension of time changed reflecting Brownian motions
- On boundaries and lateral conditions for the Kolmogorov differential equations
- On reflected Dirichlet spaces
- Symmetric Markov processes
- Dirichlet forms and symmetric Markov processes
- Reflected Dirichlet forms and the uniqueness of Silverstein's extension
- On regular Dirichlet subspaces of \(H^1(I)\) and associated linear diffusions
- Diffusion processes and their sample paths
- On Feller's boundary problem for Markov processes in weak duality
- Poisson point processes attached to symmetric diffusions
- Entrance law, exit system and Lévy system of time changed processes
- Brownian motions on a half line
- Les espaces du type de Beppo Levi
- On notions of harmonicity
- Dirichlet spaces