Gradient estimates for the performance of Markov chains and discrete event processes (Q1207844)
Language | Label | Description | Also known as |
---|---|---|---|
English | Gradient estimates for the performance of Markov chains and discrete event processes | scientific article | |
Statements
Gradient estimates for the performance of Markov chains and discrete event processes (English)
16 May 1993
A Markov chain \(M_x(i)\), \(i=0,1,\dots\), with state space \((R,r)\), a metric space with metric \(r\), is considered; \(R\) may be countable or uncountable. The real control parameter \(x\) governs the transition operator \(P_x(w,A)\). The existence of a unique stationary measure \(\mu_x\) satisfying \(\mu_x P_x=\mu_x\) is assumed. \(H(w)\) is a performance-generating function, i.e. a real function defined on \(R\), and \(F(x)=\int H(w)\,d\mu_x(w)\) is the stationary performance of the process. The paper gives an algorithm for estimating the gradient of this stationary performance with respect to \(x\); the method works for both discrete and continuous state spaces. After a review of methods for estimating derivatives of probability measures, derivatives of stationary distributions of Markov chains are derived. A comparison with efficient score methods is made, and extensions to semi-Markov processes are given. Finally, discrete event dynamical systems (DEDS) are considered.
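To illustrate the kind of estimator discussed here, the following is a minimal sketch of a score-function (likelihood-ratio) gradient estimate for the stationary performance \(F(x)=\int H(w)\,d\mu_x(w)\) of a parameterized Markov chain. The two-state toy chain, the performance function, the truncation lag, and all names in the code are illustrative assumptions, not the paper's algorithm.

```python
# Sketch: truncated score-function gradient estimate for the stationary
# performance of a two-state Markov chain controlled by a real parameter x.
# Everything below (chain, H, truncation lag K) is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(0)

b = 0.5                     # fixed transition probability 1 -> 0
x = 0.3                     # control parameter: transition probability 0 -> 1
H = np.array([0.0, 1.0])    # performance function H(w) on states {0, 1}

# Closed-form values for this toy chain, used only to check the estimate:
# stationary mass of state 1 is x/(x+b), so F(x) = x/(x+b), F'(x) = b/(x+b)^2.
F_true = x / (x + b)
dF_true = b / (x + b) ** 2

N = 200_000                 # trajectory length
K = 20                      # truncation lag for the score sums

# Simulate one trajectory W_0, ..., W_N and record the per-step scores
# s_k = d/dx log p_x(W_{k-1}, W_k); only transitions out of state 0 depend on x.
W = np.empty(N + 1, dtype=int)
s = np.zeros(N + 1)
W[0] = 0
for k in range(1, N + 1):
    if W[k - 1] == 0:
        W[k] = 1 if rng.random() < x else 0
        s[k] = 1.0 / x if W[k] == 1 else -1.0 / (1.0 - x)
    else:
        W[k] = 0 if rng.random() < b else 1
        s[k] = 0.0

h = H[W]
F_hat = h[K:].mean()        # plain estimate of the stationary performance F(x)

# Truncated score-function estimator of dF/dx:
#   dF/dx ~ mean over n of (H(W_n) - F_hat) * (s_n + s_{n-1} + ... + s_{n-K+1})
cum = np.cumsum(s)
lagged_score = cum[K:] - cum[:-K]    # sliding sums of K consecutive scores
dF_hat = np.mean((h[K:] - F_hat) * lagged_score)

print(f"F  estimate {F_hat:.4f}  (exact {F_true:.4f})")
print(f"dF estimate {dF_hat:.4f}  (exact {dF_true:.4f})")
```

Centering \(H(W_n)\) by the running estimate of \(F(x)\) and truncating the score sums at a finite lag \(K\) keeps the variance of the estimator bounded; the appropriate lag depends on how quickly the chain mixes.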
score method
unique stationary measure
stationary performance
stationary distributions