Rates of convergence of adaptive step-size of stochastic approximation algorithms (Q1977809)
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Rates of convergence of adaptive step-size of stochastic approximation algorithms | scientific article | |
Statements
Rates of convergence of adaptive step-size of stochastic approximation algorithms (English)
14 December 2000
The stochastic approximation of the linear model \(y_n=x_n^T\theta+v_n\) is considered, where \(y_n\), \(x_n\) and \(v_n\) are the observations, the random inputs and the noise, respectively, and \(\theta\) is the parameter to be estimated. The constant step-size \(\varepsilon\) of the classical adaptive algorithm \(\widehat{\theta}_{n+1}=\widehat{\theta}_n+\varepsilon x_n(y_n-x_n^T\widehat{\theta}_n)\) is replaced by an averaged adaptive step-size \(\overline{\varepsilon}_n\), giving \(\widehat{\theta}_{n+1}=\widehat{\theta}_n+\overline{\varepsilon}_n x_n(y_n-x_n^T\widehat{\theta}_n)\), where \(\overline{\varepsilon}_n\) is defined recursively by \(\overline{\varepsilon}_n=\overline{\varepsilon}_{n-1}+\frac{1}{n}(\varepsilon_n-\overline{\varepsilon}_{n-1})=\frac{1}{n}\sum_{i=1}^n\varepsilon_i\) with \(\varepsilon_{n+1}=\varepsilon_n+\gamma_n x_n(y_n-x_n^T\widehat{\theta}_n)\), \(\sum_n\varepsilon_n<\infty\), \(\sum_n\gamma_n=\infty\), and \(\gamma_n\to 0\) more slowly than \(\frac{1}{n}\). The convergence of \(\widehat{\theta}_n\) and \(\overline{\varepsilon}_n\) is proven, and the rate of convergence is analyzed by means of a Gaussian approximation theorem.
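A minimal sketch of the recursions described in the review, in the scalar case \(d=1\) (where the step-size update \(\varepsilon_{n+1}=\varepsilon_n+\gamma_n x_n(y_n-x_n^T\widehat{\theta}_n)\) is well defined as written). The concrete values of \(\theta\), the noise level, \(\gamma_n=n^{-0.7}\), the initializations, and the projection of \(\varepsilon_n\) onto a bounded interval are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Scalar (d = 1) simulation of the averaged adaptive step-size scheme.
# Assumed, not from the paper: theta = 2, noise level 0.1, gamma_n = n^{-0.7},
# the initial values, and the projection of eps onto [1e-4, 0.2] as a
# stability safeguard for the step-size recursion.
rng = np.random.default_rng(0)

theta = 2.0        # true parameter
theta_hat = 0.0    # estimate  hat{theta}_n
eps = 0.05         # step-size epsilon_n, driven by its own recursion
eps_bar = eps      # running average  bar{eps}_n = (1/n) sum_i eps_i

for n in range(1, 100_001):
    x = rng.normal()                      # random input x_n
    y = x * theta + 0.1 * rng.normal()    # observation y_n = x_n theta + v_n
    err = y - x * theta_hat               # innovation y_n - x_n hat{theta}_n

    # parameter update with the averaged step-size:
    #   hat{theta}_{n+1} = hat{theta}_n + bar{eps}_n x_n (y_n - x_n hat{theta}_n)
    theta_hat += eps_bar * x * err

    # step-size update: eps_{n+1} = eps_n + gamma_n x_n (y_n - x_n hat{theta}_n),
    # with gamma_n -> 0 more slowly than 1/n
    gamma = n ** -0.7
    eps = min(max(eps + gamma * x * err, 1e-4), 0.2)  # assumed projection

    # averaging: bar{eps}_n = bar{eps}_{n-1} + (1/n)(eps_n - bar{eps}_{n-1})
    eps_bar += (eps - eps_bar) / n

print(f"theta_hat = {theta_hat:.4f}, eps_bar = {eps_bar:.5f}")
```

The scalar case is chosen because the review states the \(\varepsilon_n\) recursion with a scalar step-size; the projection onto a bounded interval is a common safeguard in adaptive step-size schemes, not a detail given in the review.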
adaptive algorithm
stochastic approximation
convergence