Mean estimation bias in least squares estimation of autoregressive processes (Q1058799)

From MaRDI portal
Cites:
- Finite sample properties of estimators for autoregressive moving average models
- Q3914262
- Application of Least Squares Regression to Relationships Containing Auto-Correlated Error Terms
- Corrigenda: Properties of Predictors for Autoregressive Time Series
- Estimation of the parameters of stochastic difference equations
- Bias in the Estimation of Autocorrelations
- First Order Autoregression: Inference, Estimation, and Prediction
- Q3745108
- Q5583535
- On the Calculation of the Inverse of the Error Function
- Bias of some commonly-used time series estimates

Latest revision as of 16:50, 14 June 2024

Language: English
Label: Mean estimation bias in least squares estimation of autoregressive processes
Description: scientific article

    Statements

    Mean estimation bias in least squares estimation of autoregressive processes (English)
    1985
    Let \(X_{ti}\) \((t=1,\dots,n;\ i=1,\dots,r)\) be given constants (e.g., \(X_{ti}=t^{i-1}\)) and let \(\{e_t\}\) be i.i.d. \(N(0,\sigma^2)\) variables. Consider the model \[ Y_t=\sum^{r}_{i=1}X_{ti}\beta_i+P_t,\quad P_t=\sum^{p}_{j=1}\alpha_j P_{t-j}+e_t, \] where \(\beta_i\) and \(\alpha_j\) are unknown parameters such that \(\{P_t\}\) is a stationary AR(p) process. Let \({\tilde\alpha}\) be the estimator of \(\alpha=(\alpha_1,\dots,\alpha_p)\) given by the regression of \(Y_t\) on \((X_{t1},\dots,X_{tr},Y_{t-1},\dots,Y_{t-p})\). The other estimator, \({\hat\alpha}\), arises from the regression of \(\hat P_t\) on \((\hat P_{t-1},\dots,\hat P_{t-p})\), where the \(\hat P_t\) are the least squares residuals. The authors prove that \(E({\hat\alpha}-{\tilde\alpha})=O(n^{-2})\) and propose a reparametrization that isolates the bias of the estimators. A Monte Carlo study of the second-order autoregressive process is presented, which also includes the case of the generalized least squares estimator of the mean function.
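The two estimators compared in the review can be sketched numerically. A minimal Monte Carlo sketch, assuming a polynomial trend \(X_{ti}=t^{i-1}\) and an AR(1) disturbance; all function names here are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n, beta, alpha, sigma=1.0, burn=200):
    """Draw one sample from Y_t = sum_i X_ti beta_i + P_t with P_t stationary AR(p).

    Illustrative setup: X_ti = t^(i-1) (polynomial trend), Gaussian innovations,
    with a burn-in so that P_t is approximately stationary at t = 1.
    """
    r, p = len(beta), len(alpha)
    e = rng.normal(0.0, sigma, n + burn)
    P = np.zeros(n + burn)
    for s in range(n + burn):
        for j in range(p):
            if s - 1 - j >= 0:
                P[s] += alpha[j] * P[s - 1 - j]
        P[s] += e[s]
    t = np.arange(1, n + 1)
    X = np.vander(t, r, increasing=True).astype(float)  # columns 1, t, t^2, ...
    return X, X @ beta + P[burn:]

def ols(Z, y):
    # ordinary least squares coefficients
    return np.linalg.lstsq(Z, y, rcond=None)[0]

def alpha_tilde(X, Y, p):
    # regress Y_t on (X_t1, ..., X_tr, Y_{t-1}, ..., Y_{t-p}); keep lag coefficients
    n, r = X.shape
    lags = np.column_stack([Y[p - 1 - j : n - 1 - j] for j in range(p)])
    return ols(np.hstack([X[p:], lags]), Y[p:])[r:]

def alpha_hat(X, Y, p):
    # residuals P_hat from OLS of Y on X, then regress P_hat_t on its own lags
    Ph = Y - X @ ols(X, Y)
    n = len(Ph)
    lags = np.column_stack([Ph[p - 1 - j : n - 1 - j] for j in range(p)])
    return ols(lags, Ph[p:])

X, Y = simulate(300, np.array([1.0, 0.1]), [0.5])
print(alpha_tilde(X, Y, 1), alpha_hat(X, Y, 1))
```

Averaging `alpha_hat - alpha_tilde` over many replications shows the two estimators agreeing far more closely with each other than with the true \(\alpha\), consistent with the \(O(n^{-2})\) result, while each carries the familiar downward small-sample bias.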
    Keywords: approximate expressions; stationary AR(p) process; least squares residuals; reparametrization; bias; Monte Carlo study; second-order autoregressive process; generalized least squares estimator of the mean function

    Identifiers