Adaptive estimation of mean and volatility functions in (auto-)regressive models. (Q1766042)
Language | Label | Description | Also known as |
---|---|---|---|
English | Adaptive estimation of mean and volatility functions in (auto-)regressive models. | scientific article | |
Statements
Adaptive estimation of mean and volatility functions in (auto-)regressive models. (English)
25 February 2005
The authors study a nonparametric AR(1) process whose variance depends on the last state. The objective is to estimate both the mean and the variance (volatility) functions. Among the many possibilities for modelling a non-parametrized function, the authors choose a construction from functional bases; namely, they select three bases: trigonometric, polynomial splines (two variants), and compactly supported wavelets. Both estimated functions are thus built as linear combinations of basis functions. Naturally, the criterion of fit is based on least squares, and the error of the estimate is also expressed with respect to the empirical \(L_2\) norm. The degree of the model is controlled by a penalty term. The authors propose a two-step estimation procedure: the first step estimates the mean function under the assumption of constant variance, and the second step estimates the variance function, again by minimizing the corresponding sum of squared deviations. The theoretical results provide nonasymptotic bounds on the accuracy of the estimates; these bounds depend on the variability of the variance (a variability which is not taken into account when estimating the mean function).

A large part of the paper is devoted to simulation studies. Data are generated from a number of models, and the estimation results are compared with the best possible choice of a fixed degree from a given functional basis (called here the oracle), which is not known in real applications but can be derived in simulation studies, where the true function is chosen by the analyst. The last 13 pages contain technical details and proofs. It may be concluded that the paper offers both theoretical and numerical justification of the proposed procedure, although a more standard approach would compute the mean function (e.g., from a functional basis, by weighted least squares) and the variance function (e.g., with the help of kernel smoothing, avoiding the least squares method) sequentially and iteratively, with the chance of coming closer to the `best' solution. In the framework of the method presented here, the authors use just one pair of steps, because further iterations appeared numerically unstable.
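To make the two-step scheme concrete, the sketch below fits the mean function by penalized least squares over a trigonometric basis and then regresses the squared residuals on the same basis to estimate the variance function. It is only a minimal illustration, not the authors' procedure: the simulated model, the penalty (a constant times the model dimension divided by the sample size), and the function names `trig_basis` and `penalized_ls` are assumptions made for this example.

```python
import numpy as np

def trig_basis(x, dim):
    """First `dim` functions of the trigonometric basis on [0, 1]."""
    cols = [np.ones_like(x)]
    k = 1
    while len(cols) < dim:
        cols.append(np.sqrt(2) * np.cos(2 * np.pi * k * x))
        if len(cols) < dim:
            cols.append(np.sqrt(2) * np.sin(2 * np.pi * k * x))
        k += 1
    return np.column_stack(cols)

def penalized_ls(x, y, dims, pen_const):
    """Least-squares fit of y on a trigonometric basis in x, with the
    dimension chosen by a penalized empirical L2 criterion."""
    n = len(y)
    best = None
    for m in dims:
        B = trig_basis(x, m)
        coef, *_ = np.linalg.lstsq(B, y, rcond=None)
        crit = np.mean((y - B @ coef) ** 2) + pen_const * m / n  # penalty grows with the dimension
        if best is None or crit < best[0]:
            best = (crit, m, coef)
    _, m, coef = best
    return lambda z: trig_basis(z, m) @ coef  # the selected estimator

# --- simulated AR(1) data; the model and all constants are made up ---
rng = np.random.default_rng(0)
n = 2000
b_true = lambda u: 0.25 + 0.5 * np.sin(2 * np.pi * u)   # mean function
s_true = lambda u: 0.1 + 0.3 * u                        # volatility function
X = np.empty(n + 1)
X[0] = 0.5
for t in range(n):
    # clip to [0, 1] so the trigonometric basis on [0, 1] applies
    X[t + 1] = np.clip(b_true(X[t]) + s_true(X[t]) * rng.standard_normal(), 0.0, 1.0)

x, y = X[:-1], X[1:]
dims = range(1, 26)

# Step 1: estimate the mean function as if the variance were constant.
b_hat = penalized_ls(x, y, dims, pen_const=0.5)

# Step 2: regress the squared residuals on the basis to estimate sigma^2.
s2_hat = penalized_ls(x, (y - b_hat(x)) ** 2, dims, pen_const=0.5)

grid = np.linspace(0.0, 1.0, 5)
print("b_hat     :", np.round(b_hat(grid), 3))
print("sigma2_hat:", np.round(s2_hat(grid), 3))
```

In this sketch, minimizing the penalized empirical \(L_2\) criterion over a range of candidate dimensions plays the role of the adaptive, data-driven choice that the paper compares with the oracle choice of a fixed degree.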
nonparametric regression
least-squares estimator
adaptive estimation
autoregression
variance estimation
mixing processes