Autoregressive Parametric Model Estimation

Stationary stochastic time series can be modelled using an autoregressive process of sufficiently high order $ p$ (AR($ p$)):

$\displaystyle x(t)=\phi_1 x(t-1)+\phi_2 x(t-2)+\dots+\phi_p x(t-p)+e(t)$ (17)

where $ e(t)$ is a white noise process and $ \phi_1,\phi_2, \dots ,\phi_p$ are the autoregressive model parameters. More complex ARMA models (an autoregressive process driven by a moving average process) could be fitted instead. However, AR models are often used on their own because they are relatively simple to fit; a pure AR model may then require more parameters than an equivalent ARMA model, but with no significant loss of accuracy. This is the approach considered here.
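As an illustrative sketch (not the original implementation), the AR($ p$) model of equation (17) can be simulated and its parameters recovered by ordinary least squares. The order $ p=2$, the coefficients, and the function name `fit_ar_ols` below are assumptions for demonstration only:

```python
import numpy as np

# Assumed AR(2) parameters for the demonstration (not from the text).
rng = np.random.default_rng(0)
phi_true = np.array([0.6, -0.3])
N = 2000

# Simulate x(t) = phi_1 x(t-1) + phi_2 x(t-2) + e(t), e(t) white noise.
x = np.zeros(N)
e = rng.standard_normal(N)
for t in range(2, N):
    x[t] = phi_true[0] * x[t - 1] + phi_true[1] * x[t - 2] + e[t]

def fit_ar_ols(x, p):
    """Regress x(t) on its p previous values, as in equation (17)."""
    n = len(x)
    # Design matrix: row t holds [x(t-1), ..., x(t-p)].
    X = np.column_stack([x[p - k - 1:n - k - 1] for k in range(1 + p - 1)])
    y = x[p:]
    phi, *_ = np.linalg.lstsq(X, y, rcond=None)
    return phi

phi_hat = fit_ar_ols(x, 2)  # estimates should lie close to phi_true
```

With a few thousand samples the OLS estimates typically lie within a few hundredths of the true coefficients.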

The time series literature, including Chatfield (1996), describes various techniques for determining the order $ p$ and the parameters of AR models. Here, we use the partial autocorrelation function (PACF) to find $ p$ and ordinary least squares to fit the parameters. When fitting an AR($ p$) model, the last partial coefficient $ \alpha_p$ measures the excess correlation at lag $ p$ not accounted for by an AR($ p-1$) model; $ \alpha_p$ plotted against $ p$ is the PACF. The order used is the lowest value of $ p$ for which $ \alpha_p$ is not significantly different from zero, using the approximate 95% confidence limits of $ \pm 2/\sqrt{N}$ (Chatfield, 1996).
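The PACF-based order selection described above can be sketched as follows. The function name `pacf_order` and the cap `max_p` are hypothetical, and the stopping rule follows the text as stated: the lowest $ p$ whose $ \alpha_p$ falls inside the $ \pm 2/\sqrt{N}$ limits is returned.

```python
import numpy as np

def pacf_order(x, max_p=20):
    """Select an AR order from the PACF, as described in the text.

    The last coefficient of an OLS-fitted AR(p) model is the partial
    autocorrelation alpha_p; we return the lowest p for which alpha_p
    lies inside the approximate 95% limits +/- 2/sqrt(N).
    """
    n = len(x)
    bound = 2.0 / np.sqrt(n)  # approximate 95% confidence limit
    for p in range(1, max_p + 1):
        # OLS fit of an AR(p) model (lagged design matrix).
        X = np.column_stack([x[p - k - 1:n - k - 1] for k in range(p)])
        y = x[p:]
        phi, *_ = np.linalg.lstsq(X, y, rcond=None)
        alpha_p = phi[p - 1]  # last partial coefficient
        if abs(alpha_p) < bound:
            return p
    return max_p  # no cut-off found within max_p lags

# Example on a simulated AR(1) series (phi = 0.7 is an assumed value);
# the PACF should fall inside the limits after the first lag or so.
rng = np.random.default_rng(1)
e = rng.standard_normal(3000)
x = np.zeros(3000)
for t in range(1, 3000):
    x[t] = 0.7 * x[t - 1] + e[t]
order = pacf_order(x)
```

In practice the successive fits share computation (e.g. via the Levinson-Durbin recursion), but the direct OLS form above mirrors the description in the text.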

Mark Woolrich 2001-07-16