More effective time-series analysis and forecasting
DOI: 10.1016/0377-0427(95)00011-9 · zbMATH: 0847.62073 · OpenAlex: W2027582252 · Wikidata: Q126297114 · Scholia: Q126297114 · MaRDI QID: Q1917907
Publication date: 8 October 1996
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://doi.org/10.1016/0377-0427(95)00011-9
Keywords: serial correlation; forecasting; ARIMA; process modelling; ARMA; time-series; sample autocorrelations; partial autocorrelations; ARUMA processes; identifying nonstationarity; improving time-domain modelling; nonstationary models
MSC classification: Inference from stochastic processes and prediction (62M20); Time series, auto-correlation, regression, etc. in statistics (GARCH) (62M10)
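The keywords point to the paper's central identification tools: sample autocorrelations and partial autocorrelations used to recognise nonstationary (ARIMA/ARUMA) behaviour before time-domain modelling. The sketch below is not taken from the paper; it is a minimal illustration, assuming only a NumPy environment, of how these two quantities can be computed (sample ACF directly from the data, sample PACF via the Durbin-Levinson recursion) for a simulated nearly nonstationary AR(1) series. All function names and the choice of example process are illustrative assumptions.

```python
# Minimal sketch (assumed, not from the paper): sample ACF and PACF for a
# simulated nearly nonstationary AR(1) series.
import numpy as np


def sample_acf(x, nlags=20):
    """Sample autocorrelations r_k = c_k / c_0, with c_k the lag-k sample autocovariance."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xbar = x.mean()
    c0 = np.sum((x - xbar) ** 2) / n
    return np.array([np.sum((x[k:] - xbar) * (x[:n - k] - xbar)) / (n * c0)
                     for k in range(nlags + 1)])


def sample_pacf(x, nlags=20):
    """Sample partial autocorrelations via the Durbin-Levinson recursion on the sample ACF."""
    r = sample_acf(x, nlags)
    phi = np.zeros((nlags + 1, nlags + 1))
    pacf = np.zeros(nlags + 1)
    pacf[0] = 1.0
    phi[1, 1] = pacf[1] = r[1]
    for k in range(2, nlags + 1):
        num = r[k] - phi[k - 1, 1:k] @ r[1:k][::-1]
        den = 1.0 - phi[k - 1, 1:k] @ r[1:k]
        phi[k, k] = pacf[k] = num / den
        phi[k, 1:k] = phi[k - 1, 1:k] - phi[k, k] * phi[k - 1, 1:k][::-1]
    return pacf


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 500
    x = np.zeros(n)
    # Nearly nonstationary AR(1): the coefficient 0.98 is close to the unit root,
    # so the sample ACF decays slowly, mimicking integrated (ARIMA/ARUMA) behaviour.
    for t in range(1, n):
        x[t] = 0.98 * x[t - 1] + rng.standard_normal()
    print("sample ACF, lags 0-5: ", np.round(sample_acf(x, 5), 3))
    print("sample PACF, lags 0-5:", np.round(sample_pacf(x, 5), 3))
```

For a series this close to the unit circle the sample autocorrelations decay very slowly and the sample PACF has a single dominant spike near one, the kind of behaviour the cited discrimination studies compare against genuinely nonstationary processes.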
Cites Work
- The serial correlation structure for a random process with steps
- Discriminating between nonstationary and nearly nonstationary time series models: A simulation study
- Sampled autocovariance and autocorrelation results for linear time processes
- Discrimination between nonstationary and nearly nonstationary processes, and its effect on forecasting
- Small-sample autocorrelation structure for long-memory time series
- Serial dependence properties of linear processes
- Partial autocorrelation properties for non-stationary autoregressive moving-average models
- The behaviour of the sample autocorrelation function for an integrated moving average process
- Note on bias in the estimation of autocorrelation