On dynamical Gaussian random walks (Q2569225)
From MaRDI portal
scientific article
Language | Label | Description | Also known as |
---|---|---|---|
English | On dynamical Gaussian random walks | scientific article | |
Statements
On dynamical Gaussian random walks (English)
18 October 2005
Let \((\omega_j)\) be i.i.d. random variables and associate with each index \(j\) an independent Poisson process with intensity \(1\); the jump times of the \(j\)th process are denoted by \(\tau_j(k)\), \(k\in\mathbb N\). Finally, write \((\omega_j^k)_{j,k\in\mathbb N}\) for an array of independent copies of the \(\omega_j\)'s. Define a process \(X=\{X_j(t), t\geq 0\}_{j\in\mathbb N}\) by \[ X_j(0) := \omega_j, \qquad X_j(t) := \omega_j^k\quad\forall\,t\in [\tau_j(k),\tau_j(k+1)). \] A dynamical random walk is then given by the partial sums \(S_n(t) := X_1(t)+\cdots+X_n(t)\). Throughout the paper it is assumed that the distribution of \(\omega_1\) is standard normal. Denote by \(U_t^n(s) := S_{[ns]}(t) / \sqrt n\) the rescaled dynamical walk. The main theorems of the paper concern the convergence behaviour of the two-parameter field \(U_t^n(s)\) as \(n\to\infty\).

First, as \(n\to\infty\) the random fields \(U^n_t(s)\) converge weakly in \(D([0,1]^2)\) to the continuous centered Gaussian random field \(U_t(s)\) with covariance function \[ \mathbb E(U_t(s)U_{t'}(s')) = \exp(-|t-t'|)\min(s,s'),\quad s,t,s',t'\in [0,1]. \] Here \(D([0,1]^2)\) is the two-parameter Skorokhod-type space of all càdlàg functions on \([0,1]^2\), where càdlàg is understood with respect to the partial order on \([0,1]^2\) given by \((s,t)\prec (s',t')\) if \(s\leq s'\) and \(t\leq t'\); this construction is due to \textit{G. Neuhaus} [Ann. Math. Stat. 42, 1285--1295 (1971; Zbl 0222.60013)]. The limiting random field can be represented as \(U_t(s) = e^{-t} B(s,e^{2t})\), where \(B\) is a standard Brownian sheet. Another way to view \(\{U_t(\cdot)\}_{t}\) is as an Ornstein-Uhlenbeck process on classical Wiener space; the above theorem therefore provides a construction of this process.

The authors then prove a two-sided maximal inequality (a large-deviation-type result) which bounds \(\mathbb P(\sup_t S_n(t) \geq z_n\sqrt n)\) from below and above by a constant times \(z_n^2\bar\Phi(z_n)\), where \((z_n)\) is a fixed sequence tending to infinity with \(z_n = o(\sqrt{n/\log n})\) and \(\bar\Phi = 1-\Phi\), \(\Phi\) being the standard normal cdf. This result has a path-by-path consequence which furnishes an analogue of Erdős' integral test. Set, for any positive measurable function \(H\), \[ J(H) = \int_1^\infty H^4(t)\,\bar\Phi(H(t))\frac{dt}{t}. \] Then, with probability one, \(J(H)<\infty\) implies that \( \sup_t S_n(t)< H(n)\sqrt n\) for all but finitely many \(n\), while \(J(H)=\infty\) entails that there is some \(t_0\) such that \(S_n(t_0) \geq H(n)\sqrt n\) for infinitely many \(n\). This immediately yields a \(\log\log\log\)-type result for \(S_n\).
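To make the construction concrete, the following is a minimal simulation sketch, not taken from the paper; the function name `dynamical_walk_sup` and all parameter choices are illustrative assumptions. It simulates one path of \(t\mapsto S_n(t)\) on \([0,1]\) by superposing the \(n\) independent rate-\(1\) Poisson clocks into a single rate-\(n\) clock, refreshing a uniformly chosen coordinate with a fresh standard normal value at each ring, and recording the running maximum of the piecewise constant sum. The Monte Carlo comparison at the end only loosely probes the order \(z_n^2\bar\Phi(z_n)\) from the maximal inequality, since the constants in the two-sided bound are not reproduced here.

```python
import numpy as np
from math import erfc, sqrt

def dynamical_walk_sup(n, T=1.0, rng=None):
    """Return sup_{0 <= t <= T} S_n(t) for one realisation of the dynamical
    Gaussian walk: each coordinate X_j starts as an N(0,1) draw omega_j and
    is replaced by a fresh independent N(0,1) value at the jump times of its
    own rate-1 Poisson clock; S_n(t) = X_1(t) + ... + X_n(t)."""
    rng = np.random.default_rng() if rng is None else rng

    # Initial configuration omega_1, ..., omega_n and the initial sum S_n(0).
    x = rng.standard_normal(n)
    s = x.sum()
    running_max = s

    # The superposition of the n independent rate-1 clocks is one rate-n
    # Poisson clock; at each ring a uniformly chosen coordinate is refreshed.
    t = rng.exponential(1.0 / n)
    while t <= T:
        j = rng.integers(n)
        new_val = rng.standard_normal()
        s += new_val - x[j]          # update the partial sum incrementally
        x[j] = new_val
        running_max = max(running_max, s)
        t += rng.exponential(1.0 / n)
    return running_max

if __name__ == "__main__":
    # Crude Monte Carlo probe: P(sup_t S_n(t) >= z*sqrt(n)) should be of the
    # order z^2 * Phi_bar(z), up to the constants in the two-sided bound.
    n, z, reps = 1000, 2.5, 2000
    rng = np.random.default_rng(0)
    hits = sum(dynamical_walk_sup(n, rng=rng) >= z * sqrt(n) for _ in range(reps))
    phi_bar = 0.5 * erfc(z / sqrt(2.0))
    print("empirical:", hits / reps, "  z^2 * Phi_bar(z):", z * z * phi_bar)
```

Because the path \(t\mapsto S_n(t)\) only changes at the rings of the merged clock, tracking the running maximum at those rings (plus the initial value) recovers the exact supremum over \([0,T]\) without any time discretisation.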
dynamical walks
Ornstein-Uhlenbeck processes
large deviations
upper functions