ISYE 6402 Midterm Exam (Latest Update 2025/2026) Time Series Analysis | Questions & Answers | Grade A | 100% Correct - Georgia Tech
ISYE 6402 (200 Qs)


I. Fundamentals & Definitions (1–30)
1. Stationary time series: a series whose probabilistic properties
(mean, variance, autocovariance structure) do not change over
time.
2. Weak (covariance) stationarity: mean constant over time,
finite constant variance, and autocovariance depends only on
lag (not time).
3. Strictly stationary but not weakly: e.g., an iid sequence of Cauchy random variables. Its joint distributions are time-invariant (strictly stationary), but its second moments do not exist, so it cannot be weakly stationary.
4. White noise: sequence of uncorrelated random variables with
zero mean and constant variance (often iid with variance σ²).

5. White noise vs martingale difference: white noise requires zero mean, constant variance, and zero autocorrelation (often iid); a martingale difference sequence requires zero mean conditional on the past, which rules out autocorrelation but still allows conditional heteroskedasticity (e.g., ARCH-type dynamics).
6. Ergodicity: time averages converge to ensemble averages;
allows inference about distribution from one long realization.
7. Autocovariance at lag k: Cov(X_t, X_{t-k}) = E[(X_t − μ)(X_{t-k} −
μ)].
8. Autocorrelation function (ACF): autocovariance at lag k divided
by variance: ρ_k = γ_k/γ_0.
9. Partial autocorrelation function (PACF): the correlation between X_t and X_{t-k} after removing the linear effect of the intermediate lags 1, ..., k−1 (see the ACF/PACF sketch after this list).
10. Yule–Walker equations: linear equations linking AR coefficients to autocovariances; used to estimate AR(p) parameters from sample autocovariances (see the Yule–Walker sketch after this list).
11. AR(p) process: X_t = φ_1 X_{t-1} + ... + φ_p X_{t-p} + ε_t,
with ε_t white noise.
12. MA(q) process: X_t = ε_t + θ_1 ε_{t-1} + ... + θ_q ε_{t-q}.
13. ARMA(p,q): combines both parts: φ(L) X_t = θ(L) ε_t, equivalently X_t = φ(L)^{-1} θ(L) ε_t, with finite AR order p and MA order q (see the ARMA simulation sketch after this list).
14. ARIMA(p,d,q): difference the series d times to achieve
stationarity, then fit ARMA(p,q) to differenced series; d is
number of differences.
15. Invertibility for MA: the MA polynomial θ(L) must have roots outside the unit circle so that innovations can be recovered uniquely from current and past observables (an AR(∞) representation exists).

16. Causality for AR: the AR polynomial φ(L) must have roots outside the unit circle so that X_t depends only on current and past shocks (stable).
17. Backshift operator B: B X_t = X_{t-1}; used to write
models compactly e.g. (1 − φB)X_t = ε_t.
18. Characteristic polynomial: polynomial in z (or B) built from the AR coefficients, e.g. 1 − φ_1 z − ... − φ_p z^p; its roots determine stationarity (the ARMA simulation sketch after this list checks these roots).
19. Unit root: a root of the characteristic polynomial with modulus 1 (on the unit circle), most commonly z = 1; implies nonstationarity / random-walk behaviour.
20. Effect of a root on the unit circle: the process is nonstationary; shocks are persistent and there is no mean reversion (the unit root sketch after this list tests for this with the ADF test).
21. Cointegration: a linear combination of I(1) series that is I(0) (stationary), reflecting a long-run equilibrium relation (see the cointegration sketch after this list).
22. Deterministic vs stochastic trend: deterministic trend is
predictable (e.g., linear trend), stochastic trend arises from unit
roots/random walk (nonstationary).
23. Seasonal differencing: apply (1 − B^s) to remove a seasonal unit root (e.g., monthly s = 12: X_t − X_{t-12}); the unit root sketch after this list includes an example.
24. Forecasting horizon vs sample size: horizon is how far
ahead you forecast (h), sample size is number of observations
used to fit model (T); both affect accuracy.
25. Out-of-sample forecast error: error computed on data
not used to fit the model (actual − forecast).
26. Mean squared forecast error (MSFE): the average of squared out-of-sample forecast errors: MSFE = (1/N) Σ (y_t − ŷ_t)^2 (see the MSFE sketch after this list).
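
Code sketches (Python). Minimal illustrations of several of the definitions above, assuming numpy and statsmodels are available; all series are simulated stand-ins for real data.

ACF/PACF sketch (items 8–9):

import numpy as np
from statsmodels.tsa.stattools import acf, pacf

rng = np.random.default_rng(0)
x = rng.normal(size=500)     # simulated white noise (item 4)
rho = acf(x, nlags=20)       # sample ACF: rho_k = gamma_k / gamma_0 (item 8)
alpha = pacf(x, nlags=20)    # sample PACF, controlling for lags 1..k-1 (item 9)

For white noise, both are near zero at every nonzero lag; for an AR(p) series, the PACF instead cuts off after lag p.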
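
Yule–Walker sketch (item 10), assuming statsmodels: simulate an AR(2) and recover its coefficients from the sample autocovariances.

import numpy as np
from statsmodels.regression.linear_model import yule_walker

rng = np.random.default_rng(0)
x = np.zeros(1000)
for t in range(2, 1000):                      # simulate AR(2): phi_1 = 0.5, phi_2 = -0.3
    x[t] = 0.5 * x[t-1] - 0.3 * x[t-2] + rng.normal()

phi_hat, sigma_hat = yule_walker(x, order=2)  # solve the Yule-Walker equations
print(phi_hat)                                # estimates close to [0.5, -0.3]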
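
ARMA simulation sketch (items 11–18), assuming statsmodels: ArmaProcess takes the AR and MA polynomials in lag-operator form, so the causality and invertibility conditions reduce to checking that the polynomial roots lie outside the unit circle.

import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

ar = np.array([1, -0.6])     # phi(B) = 1 - 0.6B, the AR(1) part (item 11)
ma = np.array([1, 0.4])      # theta(B) = 1 + 0.4B, the MA(1) part (item 12)
proc = ArmaProcess(ar, ma)   # ARMA(1,1): phi(B) X_t = theta(B) eps_t (item 13)

print(proc.arroots)          # roots of the AR characteristic polynomial (item 18)
print(proc.isstationary)     # True: AR root 1/0.6 lies outside the unit circle (item 16)
print(proc.isinvertible)     # True: MA root -1/0.4 lies outside the unit circle (item 15)

x = proc.generate_sample(nsample=500)   # one realization of length 500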
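
Unit root sketch (items 14, 19–20, 23), assuming statsmodels: the ADF test takes a unit root as its null hypothesis, so the expected pattern is a large p-value on a random walk and a small one on its first difference.

import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(1)
rw = np.cumsum(rng.normal(size=500))   # random walk: one unit root (item 19)

print(adfuller(rw)[1])                 # large p-value: cannot reject the unit root (item 20)
print(adfuller(np.diff(rw))[1])        # (1 - B) rw is stationary: small p-value (item 14)

# Seasonal differencing with s = 12 (item 23): (1 - B^12) X_t = X_t - X_{t-12},
# e.g. x_sdiff = x[12:] - x[:-12] for a monthly array x.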
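
Cointegration sketch (item 21), assuming statsmodels: two I(1) series built from a shared stochastic trend are cointegrated, and the Engle–Granger test (null hypothesis: no cointegration) should return a small p-value.

import numpy as np
from statsmodels.tsa.stattools import coint

rng = np.random.default_rng(2)
trend = np.cumsum(rng.normal(size=500))    # common I(1) stochastic trend (item 22)
y1 = trend + rng.normal(size=500)          # I(1) series sharing the trend
y2 = 2.0 * trend + rng.normal(size=500)    # y2 - 2*y1 is I(0) noise

tstat, pvalue, _ = coint(y1, y2)           # Engle-Granger two-step test
print(pvalue)                              # small: evidence of cointegration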
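
MSFE sketch (items 24–26), assuming statsmodels: hold out the last 50 observations, fit an AR(1) on the training sample of size T = 250, and score forecasts at horizons h = 1..50 against the held-out data.

import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(3)
x = np.zeros(300)
for t in range(1, 300):                    # simulate an AR(1) with phi = 0.7
    x[t] = 0.7 * x[t-1] + rng.normal()

train, test = x[:250], x[250:]             # sample size T = 250, horizon up to h = 50 (item 24)
fit = ARIMA(train, order=(1, 0, 0)).fit()  # ARIMA(1,0,0) is an AR(1)
fc = fit.forecast(steps=len(test))         # out-of-sample forecasts (item 25)
msfe = np.mean((test - fc) ** 2)           # MSFE = (1/N) * sum (y_t - yhat_t)^2 (item 26)
print(msfe)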
