
Time Series Analysis

MIT OCW 18.S096 + 15.450 (CC BY-NC-SA 4.0)

Time series models decompose a sequence of observations into signal and noise. AR models use past values to predict the future; stationarity determines whether those predictions are meaningful. Cointegration finds stable long-run relationships between non-stationary series.

[Figure: a simulated AR(1) path y(t) over time, repeatedly pulled back toward its long-run mean μ]

Autoregressive (AR) models

An AR(1) process: y(t) = c + phi * y(t-1) + epsilon(t). If |phi| < 1, the process mean-reverts to c/(1-phi). Phi near 1 means high persistence; phi near 0 means each observation is mostly noise. Negative phi produces alternating overshoots.
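The recursion above is easy to simulate. A minimal Python sketch (the original page's code example is not preserved, so all names here are illustrative): with c = 1 and phi = 0.8, the sample mean of a long simulated path should settle near the long-run mean c/(1-phi) = 5.

```python
import random

def simulate_ar1(c, phi, sigma, n, seed=0):
    """Simulate y(t) = c + phi*y(t-1) + epsilon(t) with Gaussian noise."""
    rng = random.Random(seed)
    y = [c / (1 - phi)]  # start at the long-run mean
    for _ in range(n - 1):
        y.append(c + phi * y[-1] + rng.gauss(0, sigma))
    return y

path = simulate_ar1(c=1.0, phi=0.8, sigma=0.5, n=5000)
# Long-run mean is c/(1-phi) = 1.0/0.2 = 5.0; the sample mean should be close.
print(round(sum(path) / len(path), 2))
```

Setting phi negative in the same function shows the alternating-overshoot behavior: each step tends to land on the opposite side of the mean.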


Stationarity and unit roots

A time series is stationary if its mean and variance do not change over time. When phi = 1, the process is a random walk: it has a unit root and is non-stationary. Stock prices are typically non-stationary; log returns are stationary. The Dickey-Fuller test checks whether phi is significantly less than 1.
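The core of the Dickey-Fuller idea can be sketched in a few lines of Python (illustrative only; a real analysis would use a library test such as statsmodels' `adfuller`, which also supplies the correct critical values). Regressing the change Δy(t) on the lagged level y(t-1) gives a slope of roughly phi - 1: near zero for a random walk, clearly negative for a mean-reverting series.

```python
import random

def simulate(phi, n, seed=1):
    """y(t) = phi*y(t-1) + epsilon(t); phi = 1 gives a random walk."""
    rng = random.Random(seed)
    y = [0.0]
    for _ in range(n - 1):
        y.append(phi * y[-1] + rng.gauss(0, 1))
    return y

def df_slope(y):
    """OLS slope of delta-y(t) on y(t-1), the heart of the Dickey-Fuller
    regression. Near 0 is consistent with a unit root; clearly negative
    indicates mean reversion."""
    x = y[:-1]
    d = [y[t] - y[t - 1] for t in range(1, len(y))]
    mx, md = sum(x) / len(x), sum(d) / len(d)
    cov = sum((a - mx) * (b - md) for a, b in zip(x, d))
    return cov / sum((a - mx) ** 2 for a in x)

walk = simulate(phi=1.0, n=4000)  # unit root: slope near 0
stat = simulate(phi=0.5, n=4000)  # stationary: slope near phi - 1 = -0.5
print(round(df_slope(walk), 3), round(df_slope(stat), 3))
```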


Moving averages and ARMA

An MA(1) process: y(t) = mu + epsilon(t) + theta * epsilon(t-1). Past shocks persist for exactly one period. Combine AR and MA to get ARMA(p,q): p autoregressive lags and q moving-average lags. ARMA is parsimonious: a low-order ARMA often fits as well as a high-order pure AR.
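The "shocks persist for exactly one period" claim has a testable signature: an MA(1) process has autocorrelation theta/(1+theta²) at lag 1 and zero at every longer lag. A hedged Python sketch (function names are illustrative, not from the source):

```python
import random

def simulate_ma1(mu, theta, sigma, n, seed=2):
    """y(t) = mu + epsilon(t) + theta*epsilon(t-1)."""
    rng = random.Random(seed)
    eps = [rng.gauss(0, sigma) for _ in range(n + 1)]
    return [mu + eps[t] + theta * eps[t - 1] for t in range(1, n + 1)]

def autocorr(y, lag):
    """Sample autocorrelation at the given lag."""
    m = sum(y) / len(y)
    num = sum((y[t] - m) * (y[t - lag] - m) for t in range(lag, len(y)))
    den = sum((v - m) ** 2 for v in y)
    return num / den

y = simulate_ma1(mu=0.0, theta=0.6, sigma=1.0, n=20000)
# Theory: lag-1 autocorrelation = theta/(1+theta^2) = 0.6/1.36 ~ 0.44,
# and lag-2 autocorrelation = 0 (the shock is gone after one period).
print(round(autocorr(y, 1), 2), round(autocorr(y, 2), 2))
```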


Cointegration

Two non-stationary series are cointegrated if a linear combination of them is stationary. Stock prices of Coca-Cola and Pepsi each wander randomly, but their spread mean-reverts. Pairs trading exploits this: when the spread widens, bet on convergence. The Engle-Granger test runs a regression and checks the residuals for stationarity.
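Both Engle-Granger steps fit in a short Python sketch on simulated data (the pair, the hedge ratio, and all names are made up for illustration; a real test would use library critical values, e.g. statsmodels' `coint`). Two "prices" share one random-walk trend, so each is non-stationary, but the regression residual, the spread, mean-reverts.

```python
import random

def df_slope(y):
    """OLS slope of delta-y(t) on y(t-1); clearly negative => stationary."""
    x = y[:-1]
    d = [y[t] - y[t - 1] for t in range(1, len(y))]
    mx, md = sum(x) / len(x), sum(d) / len(d)
    cov = sum((a - mx) * (b - md) for a, b in zip(x, d))
    return cov / sum((a - mx) ** 2 for a in x)

def engle_granger_sketch(n=4000, seed=3):
    rng = random.Random(seed)
    # Shared random-walk trend makes each price non-stationary.
    trend = [0.0]
    for _ in range(n - 1):
        trend.append(trend[-1] + rng.gauss(0, 1))
    p1 = [t + rng.gauss(0, 0.2) for t in trend]
    p2 = [2.0 * t + rng.gauss(0, 0.2) for t in trend]  # true ratio: 2
    # Step 1: regress p2 on p1 by OLS to estimate the hedge ratio b.
    m1, m2 = sum(p1) / n, sum(p2) / n
    b = (sum((x - m1) * (y - m2) for x, y in zip(p1, p2))
         / sum((x - m1) ** 2 for x in p1))
    a = m2 - b * m1
    # Step 2: the residual is the spread; check it for stationarity.
    spread = [y - (a + b * x) for x, y in zip(p1, p2)]
    return b, spread

b, spread = engle_granger_sketch()
print(round(b, 2))                 # close to the true ratio 2
print(round(df_slope(spread), 2))  # strongly negative: spread mean-reverts
```

A pairs trader would monitor this spread and bet on convergence whenever it strays far from its mean.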
