ISyE 6402 Midterm Prep (2023/2024) Already Passed
Reading a 3-variable VAR(1) model from summary(model) output:
- First matrix: row 1 holds the coefficients for Xt1, row 2 the coefficients for Xt2, and so on.
- Second matrix: the coefficients on Xt-1 (only one lag matrix, because this is a VAR(1) model).
- Last matrix: the constants (intercepts).
- The covariance matrix of the error term eta_t is copied directly from the output.

Q: (c) Based on the fitted model, is there contemporaneous cross-correlation? Is there lagged cross-correlation? Is there lagged auto-correlation? Explain.
A: Contemporaneous cross-correlation is NOT present if the error variance-covariance matrix is diagonal. Lagged (cross- and auto-) correlation is present if the order p of the VAR(p) model is greater than 0.

Q: T/F - Differencing the data might not make the series stationary in the presence of cointegration.
A: TRUE.

Cointegration and long-run equilibrium: see image.

Q: Does Cov(X, X) = Var(X)?
A: Yes.

Autocovariance T/F: see image.

Q: T/F - The AR(1) process is causal if and only if the autoregressive parameter phi is between 0 and 1. However, it is always invertible.
A: FALSE. The AR(1) process is causal if and only if the absolute value of phi is less than 1, i.e. phi lies between -1 and 1. (It is, however, always invertible.)

Q: T/F - A linear process is a special case of the moving average model.
A: FALSE. The moving average model is a special case of a linear process.

Q: T/F - A Gaussian time series is always stationary.
A: FALSE. Gaussian processes can have time-varying means.

Q: T/F - "In autoregressive models the current value of the dependent variable is influenced by past values of both dependent and independent variables."
A: FALSE. AR models have no dependent/independent-variable distinction the way regression models do; the current value depends only on past values of the series itself (plus noise).

Q: How do ACF and PACF differ?
A: The ACF at lag h measures the total correlation between Xt and Xt-h; the PACF at lag h measures the correlation that remains after removing the linear effect of the intermediate lags 1, ..., h-1.

Q: What in an ACF plot would show non-stationarity?
A: Slowly decaying lags.
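The causality condition above (|phi| < 1 for AR(1), and more generally all characteristic roots outside the unit circle) can be checked numerically. A minimal sketch in Python with NumPy; the helper names ar_char_roots and is_causal are my own, not from the course:

```python
import numpy as np

def ar_char_roots(phis):
    """Roots of the AR characteristic polynomial 1 - phi1*z - ... - phip*z^p."""
    # np.roots expects coefficients ordered from the highest degree down to the constant
    coeffs = [-p for p in reversed(phis)] + [1.0]
    return np.roots(coeffs)

def is_causal(phis):
    """Causal (stationary) iff every root lies strictly outside the unit circle.
    np.abs gives the modulus sqrt(m^2 + n^2) for a complex root m + n*i."""
    return bool(np.all(np.abs(ar_char_roots(phis)) > 1))

print(is_causal([0.5]))   # True: AR(1) with phi = 0.5 has root z = 2
print(is_causal([1.2]))   # False: phi = 1.2 gives root z = 1/1.2, inside the circle
```

Note that np.abs on a complex root computes exactly the sqrt(m^2 + n^2) modulus mentioned in the complex-roots flashcard further below.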
Q: What in an ACF and PACF plot would show stationarity?
A: Few lags outside the confidence bands, and quickly decaying lags.

Q: Can confidence intervals be used to judge significance?
A: Yes. A coefficient is significant if its confidence interval excludes zero, i.e. both endpoints have the same sign.

Q: In a VAR model with seasonality over twelve months, how many seasonal dummy variables will you have?
A: Just 11, i.e. the number of categories minus 1.

Q: T/F - Time series processes generally can be decomposed into a component modeling systematic variation (trend and seasonality) and a component modeling stochastic stationary variation.
A: TRUE. The stochastic component is, for example, white noise.

Definition: in mathematics and statistics, a stationary process (a.k.a. a strict/strictly stationary or strong/strongly stationary process) is a stochastic process whose unconditional joint probability distribution does not change when shifted in time. Consequently, parameters such as the mean and variance also do not change over time.

Q: T/F - Consecutive observations in time series data are independent and identically distributed.
A: FALSE. Otherwise there would be no autocorrelation to consider.

Q: T/F - Var(a + bY) = b * Var(Y)
A: FALSE. Var(a + bY) = b^2 * Var(Y).

Q: T/F - One model for the trend component of a time series is the simple linear regression model in which time is used as an explanatory variable.
A: TRUE.

Q: T/F - If Cov(X, Y) = 0 then X and Y are independent.
A: FALSE. If X and Y are independent, then Cov(X, Y) = E(XY) - muX*muY = E(X)E(Y) - muX*muY = 0. The converse, however, is not always true: Cov(X, Y) can be 0 for variables that are not independent.

Q: T/F - If rho = Corr(X, Y) = 0, then X and Y are independent.
A: FALSE. Uncorrelated variables can still be dependent.

Q: T/F - If X and Y are independent random variables, then Var(X + Y) ≠ Var(X) + Var(Y).
A: FALSE.

Q: T/F - If X and Y are independent random variables, then Cov(a + bX, c + dY) = bd * Cov(X, Y).
A: FALSE.
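The "categories minus one" rule for seasonal dummies can be made concrete: for monthly data, one month serves as the baseline (absorbed by the intercept) and the other 11 months get indicator columns. A minimal sketch in plain NumPy; the variable names are my own:

```python
import numpy as np

# 10 years of monthly observations, months coded 1..12
months = np.tile(np.arange(1, 13), 10)

# Dummy columns for months 2..12 only; month 1 is the baseline,
# absorbed by the intercept. That leaves 12 - 1 = 11 dummies.
D = (months[:, None] == np.arange(2, 13)[None, :]).astype(float)

print(D.shape)  # (120, 11)
# Each row contains exactly one 1 for months 2..12 and all zeros for month 1.
```

Including all 12 dummies alongside an intercept would make the design matrix rank-deficient (the dummy trap), which is why one category is always dropped.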
Q: T/F - The 0th lag autocovariance of a variable Yt is equal to the variance of that variable.
A: TRUE. Cov(X, X) = Var(X).

Q: T/F - The condition that the covariance between Yi and Yi-j depends only on j is sufficient for the process to be stationary.
A: FALSE. Stationarity requires a constant mean for all time points t, a finite variance (more specifically, a finite second moment), AND a covariance function that does not change when shifted in time.

Q: What is the expectation formula?
A: E(X) = sum over x of x * P(X = x) for a discrete X, or the integral of x * f(x) dx for a continuous X with density f.

Q: T/F - Consecutive observations in a white noise process are identically distributed.
A: TRUE.

Q: T/F - Consecutive observations in a white noise process are uncorrelated and identically distributed, but not necessarily independent.
A: TRUE. Uncorrelated and identically distributed does not imply independent.

Q: T/F - If Xt is white noise, then -Xt is white noise.
A: TRUE.

Q: T/F - The mean of a random walk process does not depend on time (the sequence is mean stationary).
A: TRUE. But for stationarity the entire distribution of pt has to be constant over time, not only its mean. While the mean of pt is indeed constant, its standard deviation is not: the larger t, the higher the standard deviation of pt (over all realisations of the random walk, which is what you have to consider for stationarity), since individual realisations can stray further and further from p0.

Q: T/F - For a random walk process, Var(St) > Var(St-1).
A: TRUE.

Q: Consider the following assumptions for stationary time series: (i) constant mean; (ii) finite variance; and (iii) covariance between any two observations depends only on the time lag between them. Which assumptions of stationarity seem to be violated (figure 1)?
A: (i) and (iii).

Note: no homework question gives "periodicity" as a valid answer; only seasonality. There are also HW questions saying there is seasonality, but that constant variance is not violated.

Q: T/F - AR(p) processes are always invertible.
A: TRUE.

Q: T/F - A white noise process has zero autocovariances except at lag zero.
A: TRUE.
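The random-walk facts above (constant mean, variance growing with t, so Var(St) > Var(St-1)) are easy to confirm by simulation. A quick sketch under the assumption of standard normal steps, where theory gives Var(St) = t:

```python
import numpy as np

rng = np.random.default_rng(42)

# 5000 independent random walks of length 400 with N(0, 1) steps
steps = rng.standard_normal((5000, 400))
S = steps.cumsum(axis=1)

# The mean is approximately 0 at every time point, but the variance grows:
# theoretically Var(S_t) = t for unit-variance steps.
print(S[:, 99].mean())   # near 0
print(S[:, 99].var())    # near 100
print(S[:, 399].var())   # near 400
```

The sample variance at t = 400 comes out roughly four times the variance at t = 100, matching the linear growth Var(St) = t and confirming that the process is mean-stationary but not variance-stationary.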
Q: T/F - The PACF can only be used to select the order of the AR model and not of an ARMA model.
A: TRUE. PACF goes with AR(p); think "P" in PACF, "p" in AR(p).

Q: T/F - There is no auto-correlation in an ARIMA(1, d, q) process.
A: FALSE. The AR(1) component is autoregressive of order 1.

Q: T/F - An ARIMA(p, 0, q) model is stationary.
A: TRUE. With d = 0 it reduces to a stationary ARMA(p, q) model.

Q: If Wt is a white noise process, is the first-order difference of Wt stationary?
A: TRUE. (The first difference of white noise is an MA(1) process.)

Q: What makes an AR(p) process stationary?
A: No roots of the AR characteristic polynomial lie on the unit circle.

Q: Is a linear combination of white noise a stationary process?
A: Yes.

Q: What makes a process causal?
A: The AR and MA polynomials share no common roots, and the AR roots lie OUTSIDE the unit circle.

Q: What if you have complex roots like m + n*i?
A: Use the modulus sqrt(m^2 + n^2); it must be greater than 1 for the root to lie outside the unit circle.

Q: Which of Xt or Zt goes with invertibility?
A: Causal goes with Xt (think "c" and "chi"); invertible goes with Zt.

Q: Suppose we want to perform a t-test at the 0.05 significance level to test whether the two coefficients phi1 and phi2 are significant, given 98 observations. What is the critical threshold?
A: qt(1 - 0.05/2, 98 - 2 - 1) = qt(0.975, 95), i.e. the two-sided critical value with df = n - p - 1.

Q: Why is "differencing" the data more suitable (in terms of modeling the characteristics of the time series) than "detrending" it (refer to figure 2 again)?
A: Because there are no significant lags present in either the ACF plot or the PACF plot after differencing. ("Because it almost removes the nonconstant variance pattern, while detrending does not" is incorrect.)

Q: Which of the following characteristics are present in the time series plot of the original data (figure 1)?
A: Trend and heteroskedasticity; NOT seasonality.

Q: Which stationarity assumptions are violated?
A: Constant mean, and covariance between any two observations depending only on the time lag between them.

Q: T/F - The error terms of a VAR model are a white noise process.
A: TRUE.

Q: T/F - The error terms of a VAR model are not correlated with any past or future disturbances.
A: TRUE.

Note (stationarity of multivariate time series): cross-covariances are not symmetric in the lag k. In general Gamma(k) ≠ Gamma(k) transposed; instead Gamma(-k) equals the transpose of Gamma(k).

Q: Can all VAR(p) models be re-written as VAR(1)?
A: Yes, via the companion-form representation: stack (Xt, Xt-1, ..., Xt-p+1) into one state vector.
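The companion-form rewrite of a VAR(p) as a VAR(1), and the eigenvalue stability check it enables, can be sketched as follows. The function names are my own, not course code:

```python
import numpy as np

def companion(A):
    """Stack VAR(p) coefficient matrices A = [A1, ..., Ap] (each n x n)
    into the (n*p) x (n*p) companion matrix of the equivalent VAR(1)."""
    n, p = A[0].shape[0], len(A)
    F = np.zeros((n * p, n * p))
    F[:n, :] = np.hstack(A)           # top block row: [A1 A2 ... Ap]
    F[n:, :-n] = np.eye(n * (p - 1))  # identity blocks shift the stacked state down
    return F

def is_stable(A):
    """Stable iff every eigenvalue of the companion matrix lies inside the unit circle."""
    return bool(np.max(np.abs(np.linalg.eigvals(companion(A)))) < 1)

I2 = np.eye(2)
print(is_stable([0.5 * I2, 0.2 * I2]))  # True: a stable bivariate VAR(2)
print(is_stable([1.1 * I2]))            # False: eigenvalue 1.1 outside the unit circle
```

Note the two equivalent phrasings of the condition: companion-matrix eigenvalues inside the unit circle, or roots of det(I - A1*z - ... - Ap*z^p) outside it.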
Q: T/F - Differencing the data might not make the series stationary in the presence of cointegration.
A: TRUE.

Q: T/F - Differencing non-stationary time series components individually may destroy important dynamic information.
A: TRUE. Think cointegration.

(Topic to review: the mean vector question.)

Q: What makes a VAR(p) model stable?
A: All roots of det(I - A1*z - ... - Ap*z^p) = 0 lie outside the unit circle (equivalently, all eigenvalues of the companion matrix lie inside the unit circle).

Q: What is Corr(X, Y)?
A: Corr(X, Y) = Cov(X, Y) / sqrt(Var(X) * Var(Y)).

Q: Suppose we fit a VAR model to the 6 time series simultaneously. At the 0.05 significance level, what does a 0.24 p-value for the multivariate Ljung-Box Q(m) test for uncorrelated model residuals mean?
A: We fail to reject the null of uncorrelated residuals: there is no lead-lag relationship among the six series.

Q: T/F - Your boss hands you the U.S. inflation rate and exchange rate time series. The two series are not stationary, and are known to have a long-run equilibrium relationship. Building a VAR model on the differenced data is a preferable way to go about analyzing the two series.
A: FALSE. A long-run equilibrium means the series are cointegrated, so you cannot just difference them.

Q: T/F - The error terms of the VAR model are both contemporaneously and auto-correlated.
A: FALSE. They may be contemporaneously correlated, but they are not autocorrelated.

Note: two equivalent statements of the AR(1) stationarity condition: phi has absolute value less than 1; equivalently, the root z of the characteristic polynomial does not lie on (or inside) the unit circle.

Q: T/F - For a stationary time series, the autocorrelation function is between -1 and 1 for all lags.
A: TRUE.

Q: T/F - For structural VAR models, restrictions are needed for parameters to be identifiable. Having a B model means the B matrix must be an identity matrix.
A: FALSE. Restrictions are necessary, but a B model means A = I; an A model means B = I.

Q: How many terms are estimated in a VAR model?
A: In an n-variate system, the number of coefficients in each equation is 1 + np, and the total number is n(1 + np) = n + pn^2.
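The coefficient-count formula n(1 + np) = n + pn^2 can be sanity-checked with a couple of lines (the helper name is mine):

```python
def var_param_count(n, p):
    """Coefficients in an n-variate VAR(p): 1 + n*p per equation
    (intercept plus n coefficients for each of the p lag matrices), n equations in total."""
    per_equation = 1 + n * p
    total = n * per_equation  # algebraically equal to n + p * n**2
    return per_equation, total

print(var_param_count(3, 1))  # (4, 12): the 3-variable VAR(1) from the first flashcard
print(var_param_count(6, 2))  # (13, 78)
```

This quadratic growth in n is why unrestricted VAR models become data-hungry quickly as more series are added.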
Written for
- ISyE 6402
Document information
- Uploaded on: November 26, 2023
- Number of pages: 17
- Written in: 2023/2024
- Type: Exam (elaborations)
- Contains: Questions & answers