Convert the following auto-regressive (AR) model into a moving average (MA) model: Y_t = 1 + ...
1. Consider a moving average MA(2) model: y(t) = e(t) + b1 e(t-1) + b2 e(t-2). Assume that the noise e(t) is i.i.d. with variance σ^2 = 1.
(a) Compute the autocorrelation function r(k) for y(t).
(b) Compute the PSD of y(t). (Hint: e^{i2πf} + e^{-i2πf} = 2 cos(2πf).)
(c) Plot the spectral density from part (b) for at least FOUR different combinations of (b1, b2), where b1 and b2 take either positive or negative values.
(d) Comment on where the peaks of the PSD...
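As a sketch for parts (a) and (b): with Var e(t) = σ^2 = 1, the nonzero autocovariances are r(0) = 1 + b1^2 + b2^2, r(1) = b1(1 + b2), r(2) = b2, and the hint collapses the PSD into a cosine polynomial in f. A minimal Python check (helper names are mine, not from the problem):

```python
import math

def ma2_acvf(b1, b2, sigma2=1.0):
    """Autocovariances of y(t) = e(t) + b1*e(t-1) + b2*e(t-2), Var e(t) = sigma2."""
    r0 = sigma2 * (1 + b1 ** 2 + b2 ** 2)
    r1 = sigma2 * (b1 + b1 * b2)
    r2 = sigma2 * b2
    return r0, r1, r2  # r(k) = 0 for |k| > 2

def ma2_psd(f, b1, b2, sigma2=1.0):
    """PSD via the closed form obtained with the hint e^{i2pif} + e^{-i2pif} = 2cos(2pif)."""
    r0, r1, r2 = ma2_acvf(b1, b2, sigma2)
    return r0 + 2 * r1 * math.cos(2 * math.pi * f) + 2 * r2 * math.cos(4 * math.pi * f)
```

The cosine form agrees with the direct definition S(f) = |1 + b1 e^{-i2πf} + b2 e^{-i4πf}|^2 σ^2, which is a quick way to check the algebra before plotting part (c).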
4. Convert the following from AR form to MA form or vice versa.
(a) X_t = 1 + 0.2 X_{t-1} + 0.2 X_{t-2} + w_t
(b) X_t = 2 - 0.3 w_{t-1} + w_t
(c) X_t = 2 - 0.3 w_{t-1} - 0.1 w_{t-2} + w_t
(d) X_t = 3 - 0.1 X_{t-1} + w_t
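For the AR-to-MA direction, the MA(∞) ψ-weights of a causal AR(p) can be generated by the recursion ψ_0 = 1, ψ_j = φ_1 ψ_{j-1} + ... + φ_p ψ_{j-p}. A short sketch (helper name is mine), illustrated on part (a) with φ_1 = φ_2 = 0.2:

```python
def ar_to_ma_weights(phi, n):
    """First n psi-weights of the MA(inf) representation of a causal AR(p)
    X_t = c + phi_1*X_{t-1} + ... + phi_p*X_{t-p} + w_t."""
    psi = [1.0]
    for j in range(1, n):
        # psi_j = sum over available AR coefficients of phi_i * psi_{j-i}
        psi.append(sum(phi[i] * psi[j - 1 - i] for i in range(min(len(phi), j))))
    return psi
```

For (a) this gives ψ = (1, 0.2, 0.24, 0.088, ...); the constant term of the MA representation is handled separately as the unconditional mean.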
Based on the correlogram, write down an MA(q) or AR(p) model.
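Identification from a correlogram rests on the sample ACF (a sharp cut-off after lag q suggests MA(q); gradual decay with a PACF cut-off suggests AR(p)). As a minimal sketch of how the plotted quantities are computed (helper name is mine):

```python
def sample_acf(x, max_lag):
    """Sample autocorrelations r(0..max_lag) using the standard biased estimator."""
    n = len(x)
    mean = sum(x) / n
    c0 = sum((v - mean) ** 2 for v in x) / n  # sample variance (lag-0 autocovariance)
    acf = []
    for k in range(max_lag + 1):
        ck = sum((x[t] - mean) * (x[t + k] - mean) for t in range(n - k)) / n
        acf.append(ck / c0)
    return acf
```

Bars of this ACF against lag k, with ±1.96/√n bands, reproduce the usual correlogram read off in this kind of question.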
a) Consider the following moving average process, MA(2): Y_t = u_t + α1 u_{t-1} + α2 u_{t-2}, where u_t is a white noise process with E(u_t) = 0, var(u_t) = σ^2, and cov(u_t, u_s) = 0 for t ≠ s. Derive the mean, E(Y_t), the variance, var(Y_t), and the covariances cov(Y_t, Y_{t+1}) and cov(Y_t, Y_{t+2}), of this process. b) Give a definition of a (covariance) stationary time series process. Is the MA(2) process (covariance) stationary?
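The derivation in part a) gives E(Y_t) = 0, var(Y_t) = σ^2(1 + α1^2 + α2^2), cov(Y_t, Y_{t+1}) = σ^2 α1(1 + α2), and cov(Y_t, Y_{t+2}) = σ^2 α2. A Monte Carlo sanity check of the mean and variance formulas (the simulation setup is mine):

```python
import random

def simulate_ma2(a1, a2, n, seed=0):
    """Simulate y_t = u_t + a1*u_{t-1} + a2*u_{t-2} with u_t ~ N(0,1) i.i.d."""
    rng = random.Random(seed)
    u = [rng.gauss(0, 1) for _ in range(n + 2)]  # two extra draws for the pre-sample lags
    return [u[t] + a1 * u[t - 1] + a2 * u[t - 2] for t in range(2, n + 2)]
```

With a1 = 0.5, a2 = 0.3 the sample variance should settle near 1 + 0.25 + 0.09 = 1.34, matching the closed form; constant moments that do not depend on t are exactly what covariance stationarity in part b) requires.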
1. [30 pts] Let Y_t follow a moving average process of order 1, i.e. MA(1): Y_t = θ0 + e_t + θ1 e_{t-1}, where e_t is a white noise process with N(0,1). Suppose that you estimate the model using STATA. You obtain θ̂0 = 1, θ̂1 = 0.5, and σ̂^2 = 1. You also know e_t = 2 and e_{t-1} = 3.
(a) Obtain the unconditional mean and variance of Y_t.
(b) Obtain Cov(Y_t, Y_{t-1}).
(c) Obtain the autocorrelation of order 1 for Y_t.
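Assuming the estimated model is Y_t = θ0 + e_t + θ1 e_{t-1} with θ̂0 = 1, θ̂1 = 0.5, σ̂^2 = 1 (my reading of the printed estimates), parts (a)-(c) reduce to closed forms; a small sketch:

```python
def ma1_moments(theta0, theta1, sigma2=1.0):
    """Unconditional moments of the MA(1) Y_t = theta0 + e_t + theta1*e_{t-1}."""
    mean = theta0                       # (a) E(Y_t)
    var = sigma2 * (1 + theta1 ** 2)    # (a) Var(Y_t)
    cov1 = sigma2 * theta1              # (b) Cov(Y_t, Y_{t-1})
    rho1 = theta1 / (1 + theta1 ** 2)   # (c) first-order autocorrelation
    return mean, var, cov1, rho1
```

Note that the realized shocks e_t = 2 and e_{t-1} = 3 do not enter the unconditional moments; they would matter only for conditional forecasts.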
True or false? You do not have to provide explanations. (a) Any moving average (MA) process is covariance stationary. (b) Any autoregressive (AR) process is invertible. (c) The autocorrelation function of an MA process decays gradually while the partial autocorrelation function exhibits a sharp cut-off. (d) Suppose y_t is a general linear process. The optimal 2-step-ahead prediction error follows an MA(2) process. (e) Any autoregressive moving average (ARMA) process is invertible because any moving average (MA) process is invertible. (f) The...
Consider the following AR(2) model: X_t - X_{t-1} + (1/4) X_{t-2} = Z_t, Z_t ~ WN(0,1). (a) Show that X_t is causal. (b) Find the first four coefficients (ψ_0, ..., ψ_3) of the MA(∞) representation of X_t. (c) Find the PACF at lag 3, φ_33, of the AR(2) model.
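Taking the coefficient on X_{t-2} as 1/4 (my reading of the garbled equation), the model is X_t = X_{t-1} - 0.25 X_{t-2} + Z_t, so φ_1 = 1, φ_2 = -0.25, and 1 - z + 0.25z^2 = (1 - z/2)^2 has a double root at z = 2 outside the unit circle. A sketch checking causality and generating the ψ-weights (helper names are mine):

```python
import cmath

def ar2_causal(phi1, phi2):
    """Causality check: both roots of 1 - phi1*z - phi2*z^2 lie outside the
    unit circle. Assumes phi2 != 0 so the polynomial is genuinely quadratic."""
    a, b, c = -phi2, -phi1, 1.0
    disc = cmath.sqrt(b * b - 4 * a * c)
    roots = [(-b + disc) / (2 * a), (-b - disc) / (2 * a)]
    return all(abs(z) > 1 for z in roots)

def ar2_psi(phi1, phi2, n):
    """First n psi-weights of the MA(inf) representation:
    psi_0 = 1, psi_1 = phi1, psi_j = phi1*psi_{j-1} + phi2*psi_{j-2}."""
    psi = [1.0, phi1]
    for j in range(2, n):
        psi.append(phi1 * psi[j - 1] + phi2 * psi[j - 2])
    return psi[:n]
```

For this model the recursion gives ψ_j = (1 + j)(1/2)^j, i.e. (1, 1, 0.75, 0.5) for j = 0..3, which is a useful cross-check for part (b).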
3. Suppose ΔY_t follows the AR(1) model ΔY_t = λ0 + λ1 ΔY_{t-1} + e_t. Show that Y_t follows an AR(2) model, and derive the AR(2) coefficients for Y_t as functions of λ0 and λ1.
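Substituting ΔY_t = Y_t - Y_{t-1} into the AR(1) and rearranging gives Y_t = λ0 + (1 + λ1) Y_{t-1} - λ1 Y_{t-2} + e_t, so the AR(2) coefficients are a1 = 1 + λ1 and a2 = -λ1. A numerical sanity check of this identity (the simulation setup is mine):

```python
import random

def check_ar2_equivalence(lam0, lam1, n=50, seed=1):
    """Simulate dY_t = lam0 + lam1*dY_{t-1} + e_t, cumulate to Y_t, and verify
    Y_t = lam0 + (1+lam1)*Y_{t-1} - lam1*Y_{t-2} + e_t holds exactly."""
    rng = random.Random(seed)
    e = [rng.gauss(0, 1) for _ in range(n)]
    dY = [0.0]
    for t in range(1, n):
        dY.append(lam0 + lam1 * dY[t - 1] + e[t])
    Y = [0.0]
    for t in range(1, n):
        Y.append(Y[t - 1] + dY[t])
    return all(
        abs(Y[t] - (lam0 + (1 + lam1) * Y[t - 1] - lam1 * Y[t - 2] + e[t])) < 1e-9
        for t in range(2, n)
    )
```

The identity holds path by path, not just in expectation, since it is pure algebra on the difference operator.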