(Latest Update 2025/2026) Time Series Analysis | Questions & Answers | Grade A | 100% Correct - Georgia Tech ISYE 6402 (200 Qs)
I. Fundamentals & Definitions (1–30)
1. Stationary time series: a series whose probabilistic properties
(mean, variance, autocovariance structure) do not change over
time.
2. Weak (covariance) stationarity: mean constant over time,
finite constant variance, and autocovariance depends only on
lag (not time).
3. Strictly stationary but not weakly: e.g., an iid sequence of Cauchy random variables. Its joint distributions are time-invariant (strictly stationary), but its second moments do not exist, so it cannot be weakly stationary.
4. White noise: sequence of uncorrelated random variables with
zero mean and constant variance (often iid with variance σ²).
5. White noise vs martingale difference: both have zero mean, but the defining conditions differ: white noise requires uncorrelated variables with constant variance (often taken iid), while a martingale difference sequence requires zero mean conditional on the past and can be heteroskedastic (e.g., ARCH errors).
6. Ergodicity: time averages converge to ensemble averages;
allows inference about distribution from one long realization.
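A quick way to see ergodicity numerically: the time average of a single long realization of a stationary AR(1) approaches the process mean. A minimal Python sketch (numpy only; the parameters μ = 5 and φ = 0.7 are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# One long realization of a stationary AR(1) with process mean mu = 5:
# X_t = mu + phi * (X_{t-1} - mu) + eps_t
n, phi, mu = 100_000, 0.7, 5.0
x = np.empty(n)
x[0] = mu
for t in range(1, n):
    x[t] = mu + phi * (x[t - 1] - mu) + rng.standard_normal()

# Ergodicity: the time average over this single path is close to the
# ensemble mean mu.
print(x.mean())  # approximately 5
```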
7. Autocovariance at lag k: Cov(X_t, X_{t-k}) = E[(X_t − μ)(X_{t-k} −
μ)].
8. Autocorrelation function (ACF): autocovariance at lag k divided
by variance: ρ_k = γ_k/γ_0.
9. Partial autocorrelation function (PACF): the correlation between X_t and X_{t-k} after removing the linear effect of the intermediate lags 1, ..., k−1.
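Sample versions of the ACF (item 8) and PACF (item 9) are available in statsmodels; a minimal sketch on simulated AR(1) data (φ = 0.6 is illustrative):

```python
import numpy as np
from statsmodels.tsa.stattools import acf, pacf

rng = np.random.default_rng(1)

# AR(1) with phi = 0.6: the ACF decays geometrically (0.6^k) while the
# PACF cuts off after lag 1.
n, phi = 2000, 0.6
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.standard_normal()

print(acf(x, nlags=5))   # roughly 1, 0.6, 0.36, 0.22, ...
print(pacf(x, nlags=5))  # roughly 1, 0.6, then near 0
```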
10. Yule–Walker equations: linear equations linking AR
coefficients to autocovariances; used to estimate AR(p)
parameters from autocovariances.
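statsmodels exposes Yule–Walker estimation directly; a minimal sketch on a simulated AR(2) (the coefficients 0.5 and −0.3 are illustrative):

```python
import numpy as np
from statsmodels.regression.linear_model import yule_walker

rng = np.random.default_rng(2)

# Simulate a stationary AR(2): X_t = 0.5 X_{t-1} - 0.3 X_{t-2} + eps_t.
n = 5000
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.5 * x[t - 1] - 0.3 * x[t - 2] + rng.standard_normal()

# Solve the Yule-Walker equations built from the sample autocovariances.
phi_hat, sigma_hat = yule_walker(x, order=2)
print(phi_hat)    # approximately [0.5, -0.3]
print(sigma_hat)  # approximately 1.0 (innovation std dev)
```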
11. AR(p) process: X_t = φ_1 X_{t-1} + ... + φ_p X_{t-p} + ε_t,
with ε_t white noise.
12. MA(q) process: X_t = ε_t + θ_1 ε_{t-1} + ... + θ_q ε_{t-q}.
13. ARMA(p,q): combines both parts: φ(L)X_t = θ(L)ε_t, i.e., X_t = φ_1 X_{t-1} + ... + φ_p X_{t-p} + ε_t + θ_1 ε_{t-1} + ... + θ_q ε_{t-q}, with finite orders p and q.
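Items 11–13 can all be simulated with statsmodels' ArmaProcess; note its convention that the AR and MA lag polynomials are passed with their leading 1 and the AR signs flipped (parameters below are illustrative; set ma=[1] for a pure AR, or ar=[1] for a pure MA):

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

# ARMA(1,1): (1 - 0.6B) X_t = (1 + 0.4B) eps_t.
ar = np.array([1.0, -0.6])  # AR polynomial 1 - 0.6B
ma = np.array([1.0, 0.4])   # MA polynomial 1 + 0.4B
process = ArmaProcess(ar, ma)

x = process.generate_sample(nsample=500)  # one simulated path
print(process.acf(lags=5))                # theoretical ACF of this ARMA
```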
14. ARIMA(p,d,q): difference the series d times to achieve
stationarity, then fit ARMA(p,q) to differenced series; d is
number of differences.
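statsmodels' ARIMA handles the differencing internally; a minimal sketch (the I(1) series and the order (1,1,0) are illustrative):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(3)

# An I(1) series: cumulative sum of white noise (a random walk).
y = rng.standard_normal(300).cumsum()

# ARIMA(1,1,0): difference once (d = 1), then fit an AR(1).
res = ARIMA(y, order=(1, 1, 0)).fit()
print(res.params)             # AR coefficient and innovation variance
print(res.forecast(steps=5))  # 5-step-ahead forecasts
```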
15. Invertibility for MA: the MA polynomial θ(L) must have all roots outside the unit circle so that the innovations can be recovered uniquely as a convergent infinite AR representation in current and past observables.
16. Causality for AR: AR polynomial φ(L) must have roots
outside unit circle so X_t depends only on current and past
shocks (stable).
17. Backshift operator B: B X_t = X_{t-1}; used to write
models compactly e.g. (1 − φB)X_t = ε_t.
18. Characteristic polynomial: polynomial in z (or B) from AR
coefficients, e.g. 1 − φ_1 z − ... − φ_p z^p. Roots determine
stationarity.
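Items 15–18 all come down to where the roots of a lag polynomial lie; a numpy sketch (the AR(2) coefficients are illustrative):

```python
import numpy as np

# AR(2) with phi1 = 0.5, phi2 = -0.3:
# characteristic polynomial 1 - 0.5 z + 0.3 z^2.
# np.roots takes coefficients from the highest power down.
roots = np.roots([0.3, -0.5, 1.0])
print(np.abs(roots))              # both moduli ~1.83
print((np.abs(roots) > 1).all())  # True -> causal / stationary AR

# Random walk polynomial 1 - z: root exactly at 1 (a unit root).
print(np.roots([-1.0, 1.0]))      # [1.]
```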
19. Unit root: a root of the characteristic polynomial equal to
1 (on unit circle) — implies nonstationarity / random walk
behaviour.
20. Root on unit circle effect: the process is nonstationary; shocks are persistent and there is no mean reversion.
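In practice a unit root is usually diagnosed with a test such as augmented Dickey–Fuller; a sketch using statsmodels' adfuller on simulated data:

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(4)

walk = rng.standard_normal(500).cumsum()  # random walk: has a unit root
noise = rng.standard_normal(500)          # white noise: stationary

# adfuller tests the null hypothesis that the series has a unit root;
# the second element of the returned tuple is the p-value.
print(adfuller(walk)[1])   # large p-value: cannot reject the unit root
print(adfuller(noise)[1])  # tiny p-value: reject; series is stationary
```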
21. Cointegration: two or more I(1) series admit a linear combination that is I(0) (stationary), representing a long-run equilibrium relation.
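Cointegration is commonly tested with the Engle–Granger procedure, available as statsmodels' coint; a sketch on simulated data (the shared random walk and the factor 2 are illustrative):

```python
import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(5)

# Two I(1) series driven by the same random walk: each is nonstationary,
# but y - 2x is stationary, so x and y are cointegrated.
trend = rng.standard_normal(500).cumsum()
x = trend + 0.5 * rng.standard_normal(500)
y = 2.0 * trend + 0.5 * rng.standard_normal(500)

# Engle-Granger test; null hypothesis: no cointegration.
tstat, pvalue, _ = coint(y, x)
print(pvalue)  # small p-value: evidence of cointegration
```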
22. Deterministic vs stochastic trend: deterministic trend is
predictable (e.g., linear trend), stochastic trend arises from unit
roots/random walk (nonstationary).
23. Seasonal differencing: apply (1 − B^s) to remove seasonal
unit root (e.g., monthly s=12: X_t − X_{t-12}).
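Seasonal differencing is a one-line operation; a numpy sketch for monthly data (s = 12; the sawtooth seasonal pattern is illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)

# Monthly series: deterministic seasonal sawtooth plus noise.
x = np.arange(48) % 12 + rng.standard_normal(48)

# Apply (1 - B^12): X_t - X_{t-12}. The seasonal pattern cancels.
seasonal_diff = x[12:] - x[:-12]
print(seasonal_diff[:5])  # pure noise differences, season removed
```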
24. Forecasting horizon vs sample size: horizon is how far
ahead you forecast (h), sample size is number of observations
used to fit model (T); both affect accuracy.
25. Out-of-sample forecast error: error computed on data
not used to fit the model (actual − forecast).
26. Mean squared forecast error (MSFE): average squared
out-of-sample forecast errors: MSFE = (1/N) Σ (y_t − ŷ_t)^2.
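Items 24–26 combine in a standard holdout evaluation: fit on the first T observations, forecast h steps ahead, and average the squared out-of-sample errors. A minimal sketch (the series, split point, and ARIMA order are illustrative):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(7)

# Simulated series; hold out the last h = 20 points (not used in fitting).
y = rng.standard_normal(220).cumsum()
train, test = y[:200], y[200:]   # sample size T = 200, horizon h = 20

res = ARIMA(train, order=(1, 1, 0)).fit()
forecasts = res.forecast(steps=len(test))

# Out-of-sample forecast errors (item 25) and MSFE (item 26).
errors = test - forecasts
msfe = np.mean(errors ** 2)
print(msfe)
```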