Auto-Regressive processes, Moving-Average processes and Wold’s theorem

Autoregressive model

Autoregressive models (AR)

AR(1)

Our basic model was:

x_t = α + ϵ_t

We add an autoregressive component by adding a lagged observation.

x_t = α + βx_{t-1} + ϵ_t

AR(p)

AR(p) regresses on the p previous observations.

x_t = α + Σ_{i=1}^{p} β_i x_{t-i} + ϵ_t
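The AR(p) recursion above can be sketched directly. The function below is a minimal illustration (the parameter values and the zero start-up values are my own choices, not from the notes):

```python
# Minimal AR(p) generator: x_t = alpha + sum_i beta_i * x_{t-i} + eps_t.
# Start-up values are set to zero for simplicity (an illustrative choice).
def simulate_ar(alpha, betas, eps):
    p = len(betas)
    x = [0.0] * p  # pre-sample start-up values
    for e in eps:
        # betas[0] multiplies the most recent value, betas[1] the one before, etc.
        x.append(alpha + sum(b * x[-1 - i] for i, b in enumerate(betas)) + e)
    return x[p:]  # drop the start-up values

# With a single unit shock and no further noise, the recursion is deterministic
# and the shock propagates through both lags:
path = simulate_ar(alpha=0.0, betas=[0.5, 0.2], eps=[1.0] + [0.0] * 4)
```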

Propagation of shocks

A shock raises the current value, which feeds into every future value through the lag term. The effect persists forever but decays geometrically, provided |β| < 1.
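For AR(1) this decay is easy to see: a unit shock at t = 0 shows up as β^t at horizon t. A quick sketch (β = 0.6 is an illustrative value):

```python
# Impulse response of AR(1): a unit shock at t = 0 propagates as beta**t.
beta = 0.6  # illustrative persistence parameter
irf = [beta ** t for t in range(6)]
# Each step shrinks by a factor of beta: the effect never hits zero exactly,
# but it decays geometrically toward zero.
```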

Testing for stationarity with Dickey-Fuller (DF) and Augmented Dickey-Fuller (ADF)

Stationarity

Unit roots

Integration order

Dickey-Fuller

The Dickey-Fuller test checks whether a unit root is present.

The AR(1) model is:

y_t = α + βy_{t-1} + ϵ_t

We can rewrite this as:

Δy_t = α + (β − 1)y_{t-1} + ϵ_t

We test whether (β − 1) = 0.

If β = 1 (so the coefficient (β − 1) on y_{t-1} is 0) we have a random walk, and the process is non-stationary.

If |β| < 1 then we have a stationary process.
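The regression behind the test can be hand-rolled: regress Δy_t on y_{t-1} and look at the slope, which estimates (β − 1). A sketch below, with illustrative parameters and series lengths; note a real Dickey-Fuller test compares the resulting t-statistic to special DF critical values, not to normal ones:

```python
import random

# Regress dy_t on y_{t-1} (no-intercept variant): the OLS slope estimates (beta - 1).
def df_slope(y):
    dy = [y[t] - y[t - 1] for t in range(1, len(y))]
    ylag = y[:-1]
    return sum(a * b for a, b in zip(ylag, dy)) / sum(a * a for a in ylag)

rng = random.Random(0)

# Stationary AR(1) with beta = 0.5: the slope should be near -0.5.
y, prev = [], 0.0
for _ in range(5000):
    prev = 0.5 * prev + rng.gauss(0, 1)
    y.append(prev)
stationary_slope = df_slope(y)

# Random walk (beta = 1): the slope should be near 0.
w, prev = [], 0.0
for _ in range(5000):
    prev = prev + rng.gauss(0, 1)
    w.append(prev)
walk_slope = df_slope(w)
```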

Variation: Removing the drift

If our model has no intercept it is:

y_t = βy_{t-1} + ϵ_t

Δy_t = (β − 1)y_{t-1} + ϵ_t

Variation: Adding a deterministic trend

If our model has a time trend it is:

y_t = α + βy_{t-1} + γt + ϵ_t

Δy_t = α + (β − 1)y_{t-1} + γt + ϵ_t

Augmented Dickey-Fuller

We include more lagged terms to absorb serial correlation in the errors.

y_t = α + βt + Σ_{i=1}^{p} θ_i y_{t-i} + ϵ_t

If there is no unit root, the process is stationary and standard OLS inference applies.

Autoregressive Conditional Heteroskedasticity (ARCH)

Variance of the AR(1) model

The standard AR(1) model is:

y_t = α + βy_{t-1} + ϵ_t

The variance is:

Var(y_t) = Var(α + βy_{t-1} + ϵ_t) = β²Var(y_{t-1}) + Var(ϵ_t)

Under stationarity, Var(y_t) = Var(y_{t-1}), so:

Var(y_t)(1 − β²) = Var(ϵ_t)

Assuming the errors are IID with variance σ², we have:

Var(y_t) = σ² / (1 − β²)

This unconditional variance is constant, independent of past observations, which may not be desirable when volatility clusters over time.
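The formula Var(y_t) = σ²/(1 − β²) can be checked by simulating a long AR(1) path. A sketch with illustrative parameter values:

```python
import random

# Verify Var(y) = sigma^2 / (1 - beta^2) by simulation (beta, sigma illustrative).
rng = random.Random(1)
beta, sigma = 0.8, 1.0
y, prev = [], 0.0
for _ in range(100_000):
    prev = beta * prev + rng.gauss(0, sigma)
    y.append(prev)

mean = sum(y) / len(y)
var = sum((v - mean) ** 2 for v in y) / len(y)
theory = sigma ** 2 / (1 - beta ** 2)  # 1 / 0.36 = 2.777...
```

With β = 0.8 the sample variance should land close to the theoretical value of about 2.78.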

Conditional variance

Consider the alternative formulation:

y_t = ϵ_t f(y_{t-1})

This allows for conditional heteroskedasticity.
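A concrete instance of this is ARCH(1), where f(y_{t-1}) = sqrt(a0 + a1·y_{t-1}²), so the conditional variance depends on the previous observation. A sketch with illustrative parameters a0 and a1 (not from the notes):

```python
import math
import random

# ARCH(1) sketch: y_t = eps_t * sqrt(a0 + a1 * y_{t-1}^2).
# Big moves raise the next period's variance -> volatility clustering.
rng = random.Random(2)
a0, a1 = 1.0, 0.5  # illustrative parameters
y, prev = [], 0.0
for _ in range(100_000):
    sigma_t = math.sqrt(a0 + a1 * prev * prev)
    prev = rng.gauss(0, 1) * sigma_t
    y.append(prev)

# Squared moves following a big move vs. following a small move:
sq_after_big = [y[t] ** 2 for t in range(1, len(y)) if y[t - 1] ** 2 > 4]
sq_after_small = [y[t] ** 2 for t in range(1, len(y)) if y[t - 1] ** 2 <= 4]
avg_after_big = sum(sq_after_big) / len(sq_after_big)
avg_after_small = sum(sq_after_small) / len(sq_after_small)
```

The unconditional variance is still constant (a0 / (1 − a1) here), but the conditional variance moves with the data, which is the point of the ARCH formulation.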

Moving average models

Moving Average models (MA)

We add previous error terms as input variables.

MA(q) has q previous error terms in the model:

x_t = α + ϵ_t + Σ_{i=1}^{q} θ_i ϵ_{t-i}

Unlike AR models, the effects of any shocks wear off after q terms.

This is harder to fit with OLS because the error terms themselves are not observed; estimation typically uses maximum likelihood instead.
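The finite memory of an MA(q) process is easy to demonstrate: feed a single unit shock through the model and the output is exactly zero after q lags. A sketch for MA(2) with illustrative θ values:

```python
# MA(2) impulse response: x_t = eps_t + theta_1*eps_{t-1} + theta_2*eps_{t-2}.
thetas = [0.4, 0.3]          # illustrative MA coefficients
eps = [1.0] + [0.0] * 5      # a unit shock at t = 0, then silence
x = []
for t in range(len(eps)):
    val = eps[t]
    for i, th in enumerate(thetas, start=1):
        if t - i >= 0:
            val += th * eps[t - i]
    x.append(val)
# Unlike AR, the response is exactly zero once the shock leaves the q-lag window.
```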

Autoregressive Moving Average models

Autoregressive Moving Average models (ARMA)

We include both AR and MA terms.

Estimated using the Box-Jenkins method.

Autoregressive Integrated Moving Average models (ARIMA)

Uses differencing to remove non-stationarity.

Also estimated with Box-Jenkins.
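The differencing step is the "I" in ARIMA: the first difference of a random walk is just its underlying white noise, which is stationary. A minimal sketch:

```python
import random

# First-differencing: the increments of a random walk recover the white noise
# that drove it, turning a non-stationary series into a stationary one.
def diff(series):
    return [series[t] - series[t - 1] for t in range(1, len(series))]

rng = random.Random(3)
noise = [rng.gauss(0, 1) for _ in range(1000)]
walk, total = [], 0.0
for e in noise:
    total += e
    walk.append(total)

recovered = diff(walk)  # equals noise[1:], up to floating-point error
```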

Seasonal ARIMA

Wold’s theorem

Introduction