Chapter 11
1. A process is stationary if:
a. any collection of random variables in a sequence is taken and shifted ahead by h time periods and the joint probability distribution changes.
b. any collection of random variables in a sequence is taken and shifted ahead by h time periods and the joint probability distribution remains unchanged.
c. there is serial correlation between the error terms of successive time periods and the explanatory variables and the error terms have positive covariance.
d. there is no serial correlation between the error terms of successive time periods and the explanatory variables and the error terms have positive covariance.
Answer: b
Difficulty: Moderate
Bloom’s: Knowledge
A-Head: Stationary and Weakly Dependent Time Series
BUSPROG:
Feedback: A process is stationary if any collection of random variables in a sequence is taken and shifted ahead by h time periods and the joint probability distribution remains unchanged.
2. A stochastic process {xt: t = 1, 2, …} with a finite second moment [E(xt²) < ∞] is covariance stationary if:
a. E(xt) is variable, Var(xt) is variable, and for any t, h ≥ 1, Cov(xt, xt+h) depends only on ‘h’ and not on ‘t’.
b. E(xt) is variable, Var(xt) is variable, and for any t, h ≥ 1, Cov(xt, xt+h) depends only on ‘t’ and not on h.
c. E(xt) is constant, Var(xt) is constant, and for any t, h ≥ 1, Cov(xt, xt+h) depends only on ‘h’ and not on ‘t’.
d. E(xt) is constant, Var(xt) is constant, and for any t, h ≥ 1, Cov(xt, xt+h) depends only on ‘t’ and not on ‘h’.
Answer: c
Difficulty: Moderate
Bloom’s: Knowledge
A-Head: Stationary and Weakly Dependent Time Series
BUSPROG:
Feedback: A stochastic process {xt: t = 1, 2, …} with a finite second moment [E(xt²) < ∞] is covariance stationary if E(xt) is constant, Var(xt) is constant, and for any t, h ≥ 1, Cov(xt, xt+h) depends only on ‘h’ and not on ‘t’.
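The definition above can be checked by simulation. A minimal Python sketch (illustrative values only): an i.i.d. standard normal sequence is trivially covariance stationary, so its sample mean and variance should not depend on ‘t’, and its sample autocovariance at lag h ≥ 1 should be near zero.

```python
import numpy as np

rng = np.random.default_rng(0)

# An i.i.d. N(0, 1) sequence is covariance stationary:
# E(x_t) = 0 (constant), Var(x_t) = 1 (constant), Cov(x_t, x_{t+h}) = 0 for h >= 1.
n = 100_000
x = rng.standard_normal(n)

# Sample moments from the two halves of the series should agree (no dependence on t).
mean_first, mean_second = x[: n // 2].mean(), x[n // 2 :].mean()
var_first, var_second = x[: n // 2].var(), x[n // 2 :].var()

# Sample autocovariance at lag h = 1 should be near zero.
h = 1
autocov = np.cov(x[:-h], x[h:])[0, 1]

print(round(mean_first, 2), round(mean_second, 2))  # both near 0
print(round(var_first, 2), round(var_second, 2))    # both near 1
print(round(autocov, 3))                            # near 0
```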
3. A covariance stationary time series is weakly dependent if:
a. the correlation between the independent variable at time ‘t’ and the dependent variable at time ‘t + h’ goes to ∞ as h → 0.
b. the correlation between the independent variable at time ‘t’ and the dependent variable at time ‘t + h’ goes to 0 as h → ∞.
c. the correlation between the independent variable at time ‘t’ and the independent variable at time ‘t + h’ goes to ∞ as h → 0.
d. the correlation between the independent variable at time ‘t’ and the independent variable at time ‘t + h’ goes to 0 as h → ∞.
Answer: d
Difficulty: Easy
Bloom’s: Knowledge
A-Head: Stationary and Weakly Dependent Time Series
BUSPROG:
Feedback: A covariance stationary time series is weakly dependent if the correlation between the independent variable at time ‘t’ and the independent variable at time ‘t + h’ goes to 0 as h → ∞.
4. The model yt = et + β1et–1 + β2et–2, t = 1, 2, …, where et is an i.i.d. sequence with zero mean and variance σe², represents a(n):
a. static model.
b. moving average process of order one.
c. moving average process of order two.
d. autoregressive process of order two.
Answer: c
Difficulty: Easy
Bloom’s: Knowledge
A-Head: Stationary and Weakly Dependent Time Series
BUSPROG:
Feedback: The model yt = et + β1et–1 + β2et–2, t = 1, 2, …, where et is an i.i.d. sequence with zero mean and variance σe², represents a moving average process of order two.
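The MA(2) process in this question can be simulated directly. A Python sketch (the coefficient values β1 = 0.5, β2 = 0.3 are illustrative, not from the question): sample autocovariances should match the theoretical ones, and the process should be uncorrelated beyond lag 2.

```python
import numpy as np

rng = np.random.default_rng(1)

# MA(2): y_t = e_t + b1*e_{t-1} + b2*e_{t-2}, with e_t i.i.d. (0, sigma_e^2).
b1, b2, sigma_e = 0.5, 0.3, 1.0   # illustrative parameter values
n = 200_000
e = rng.normal(0.0, sigma_e, size=n + 2)
y = e[2:] + b1 * e[1:-1] + b2 * e[:-2]

# Theoretical autocovariances of an MA(2):
#   Cov(y_t, y_{t+1}) = (b1 + b1*b2) * sigma_e^2
#   Cov(y_t, y_{t+h}) = 0 for h > 2 (an MA(2) is correlated only up to lag 2)
gamma1_theory = (b1 + b1 * b2) * sigma_e**2
gamma1_sample = np.cov(y[:-1], y[1:])[0, 1]
gamma3_sample = np.cov(y[:-3], y[3:])[0, 1]

print(round(gamma1_theory, 3), round(gamma1_sample, 3))  # close to each other
print(round(gamma3_sample, 3))                           # near 0
```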
5. The model xt = α1xt–1 + et, t = 1, 2, …, where et is an i.i.d. sequence with zero mean and variance σe², represents a(n):
a. moving average process of order one.
b. moving average process of order two.
c. autoregressive process of order one.
d. autoregressive process of order two.
Answer: c
Difficulty: Easy
Bloom’s: Knowledge
A-Head: Stationary and Weakly Dependent Time Series
BUSPROG:
Feedback: The model xt = α1xt–1 + et, t = 1, 2, …, where et is an i.i.d. sequence with zero mean and variance σe², represents an autoregressive process of order one.
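A stable AR(1) also illustrates the weak dependence property from question 3: Corr(xt, xt+h) = α1^h, which goes to 0 as h → ∞. A Python sketch (α1 = 0.8 is an illustrative value with |α1| < 1):

```python
import numpy as np

rng = np.random.default_rng(2)

# Stable AR(1): x_t = a1 * x_{t-1} + e_t with |a1| < 1.
a1, n = 0.8, 200_000
e = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = a1 * x[t - 1] + e[t]

# For a stable AR(1), Corr(x_t, x_{t+h}) = a1**h, which shrinks toward 0
# as h grows -- the weak dependence property.
for h in (1, 5, 20):
    sample_corr = np.corrcoef(x[:-h], x[h:])[0, 1]
    print(h, round(sample_corr, 3), round(a1**h, 3))  # sample vs. theoretical
```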
6. Which of the following is assumed in time series regression?
a. There is no perfect collinearity between the explanatory variables.
b. The explanatory variables are contemporaneously endogenous.
c. The error terms are contemporaneously heteroskedastic.
d. The explanatory variables cannot have temporal ordering.
Answer: a
Difficulty: Easy
Bloom’s: Knowledge
A-Head: Asymptotic Properties of OLS
BUSPROG:
Feedback: One of the assumptions of time series regression is that there should be no perfect collinearity between the explanatory variables.
7. Suppose ut is the error term for time period ‘t’ in a time series regression model and the explanatory variables are xt = (xt1, xt2, …, xtk). The assumption that the errors are contemporaneously homoskedastic implies that:
a. Var(ut|xt) = √σ.
b. Var(ut|xt) = ∞.
c. Var(ut|xt) = σ².
d. Var(ut|xt) = σ.
Answer: c
Difficulty: Moderate
Bloom’s: Knowledge
A-Head: Asymptotic Properties of OLS
BUSPROG:
Feedback: If ut is the error term for time period ‘t’ and xt is the vector of all explanatory variables for time ‘t’, the assumption of contemporaneous homoskedasticity implies that Var(ut|xt) = σ².
8. Which of the following statements is true?
a. A model with a lagged dependent variable cannot satisfy the strict exogeneity assumption.
b. Stationarity is critical for OLS to have its standard asymptotic properties.
c. Efficient static models can be estimated for nonstationary time series.
d. In an autoregressive model, the dependent variable in the current time period varies with the error term of previous time periods.
Answer: a
Difficulty: Moderate
Bloom’s: Knowledge
A-Head: Asymptotic Properties of OLS
BUSPROG:
Feedback: A model with a lagged dependent variable cannot satisfy the strict exogeneity assumption. When explanatory variables are correlated with the past, strict exogeneity does not hold.
9. Consider the model: yt = α0 + α1rt1 + α2rt2 + ut. Under weak dependence, the condition sufficient for consistency of OLS is:
a. E(rt1|rt2) = 0.
b. E(yt |rt1, rt2) = 0.
c. E(ut |rt1, rt2) = 0.
d. E(ut |rt1, rt2) = ∞.
Answer: c
Difficulty: Moderate
Bloom’s: Knowledge
A-Head: Asymptotic Properties of OLS
BUSPROG:
Feedback: If a time series model is weakly dependent, the condition sufficient for consistency of OLS is E(ut|rt1, rt2) = 0.
10. The model yt = yt–1 + et, t = 1, 2, … represents a(n):
a. AR(2) process.
b. MA(1) process.
c. random walk process.
d. random walk with drift process.
Answer: c
Difficulty: Easy
Bloom’s: Knowledge
A-Head: Using Highly Persistent Time Series in Regression Analysis
BUSPROG:
Feedback: The model yt = yt–1 + et, t = 1, 2, … represents a random walk process.
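A random walk is the AR(1) of question 5 with α1 = 1, and it is not covariance stationary: Var(yt) = t·σe² grows with ‘t’. A Python sketch (path counts and dates are illustrative) simulates many paths and compares the cross-sectional variance at two dates:

```python
import numpy as np

rng = np.random.default_rng(3)

# Random walk: y_t = y_{t-1} + e_t, with y_0 = 0 and e_t i.i.d. N(0, 1).
# Var(y_t) = t * sigma_e^2 grows with t, so the process is not
# covariance stationary (question 2's definition fails).
n_paths, n_steps = 5_000, 400
e = rng.standard_normal((n_paths, n_steps))
y = np.cumsum(e, axis=1)           # y_t = e_1 + ... + e_t

# Cross-sectional variance across paths at t = 100 and t = 400:
# grows roughly linearly in t.
print(round(y[:, 99].var(), 1))    # near 100
print(round(y[:, 399].var(), 1))   # near 400
```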