For a random walk with random starting value, let $Y_t = Y_0 + e_t + e_{t-1} + \cdots + e_1$ for $t > 0$, where...
1. A simple regression model is given by $Y_t = \beta_1 + \beta_2 X_t + e_t$ for $t = 1, \dots, n$, (1) where the regression errors $e_t$ with $\operatorname{Var}(e_t) = \sigma^2$ follow the AR(1) model $e_t = \rho e_{t-1} + v_t$, $t = 1, \dots, n$, where the $v_t$'s are uncorrelated random variables with constant variance, that is, $E(v_t) = 0$, $\operatorname{Var}(v_t) = \sigma_v^2$, $\operatorname{Cov}(v_t, v_s) = 0$ for $t \neq s$. Now given that $\operatorname{Var}(e_t) = \operatorname{Var}(e_{t-1}) = \sigma^2$ and $\operatorname{Cov}(e_{t-1}, v_t) = 0$: (a) Show that $\sigma^2 = \sigma_v^2/(1-\rho^2)$. (b) Show that $\operatorname{Corr}(e_t, e_{t-1}) = \rho$. (c) What problem(s) will...
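The two stationarity facts in (a) and (b) can be sanity-checked by simulation. This is a sketch, not the requested derivation; the values $\rho = 0.6$ and $\sigma_v = 1$ are illustrative choices, not from the problem.

```python
import numpy as np

# Simulate the AR(1) errors e_t = rho*e_{t-1} + v_t and compare the
# sample variance and lag-1 correlation with the stationary values
# Var(e_t) = sigma_v^2/(1 - rho^2) and Corr(e_t, e_{t-1}) = rho.
rng = np.random.default_rng(0)
rho, sigma_v, n = 0.6, 1.0, 200_000

e = np.empty(n)
e[0] = rng.normal(0, sigma_v / np.sqrt(1 - rho**2))  # start in stationarity
v = rng.normal(0, sigma_v, n)
for t in range(1, n):
    e[t] = rho * e[t - 1] + v[t]

var_hat = e.var()
corr_hat = np.corrcoef(e[1:], e[:-1])[0, 1]
print(var_hat)   # close to 1/(1 - 0.36) = 1.5625
print(corr_hat)  # close to rho = 0.6
```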
1. Let $(N(t))_{t \ge 0}$ be a Poisson process with rate $\lambda$, and let $Y_1, Y_2, \dots$ be i.i.d. random variables. Further suppose that $(N(t))_{t \ge 0}$ and $(Y_i)_{i \ge 1}$ are independent. Define the compound Poisson process $X(t) = \sum_{i=1}^{N(t)} Y_i$. Recall that the moment generating function of a random variable $X$ is defined by $\phi_X(u) = E[e^{uX}]$. Suppose that $\phi_{Y_1}(u) < \infty$ for all $u \in \mathbb{R}$ (for simplicity). (a) Show that for all $u \in \mathbb{R}$, $\phi_{X(t)}(u) = \exp(\lambda t (\phi_{Y_1}(u) - 1))$. (b) Instead...
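The identity in (a) can be checked numerically for one concrete case. This is a simulation sketch, not the requested proof; the choices $\lambda = 2$, $t = 1$, $u = 0.5$, and $Y_i \sim N(0,1)$ (so $\phi_{Y_1}(u) = e^{u^2/2}$) are mine, not the problem's.

```python
import numpy as np

# Monte Carlo estimate of E[exp(u*X(t))] for a compound Poisson process
# with standard normal jumps, compared with exp(lambda*t*(phi_Y(u) - 1)).
rng = np.random.default_rng(1)
lam, t, u, reps = 2.0, 1.0, 0.5, 400_000

N = rng.poisson(lam * t, reps)        # N(t) in each replication
# given N = n, the sum of n i.i.d. N(0,1) jumps is N(0, n); n = 0 gives 0
X = rng.normal(0.0, np.sqrt(N))

mgf_hat = np.exp(u * X).mean()
theory = np.exp(lam * t * (np.exp(u**2 / 2) - 1))
print(mgf_hat, theory)  # the two values should nearly agree
```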
1) Let $X$ and $Y$ be random variables. Show that $\operatorname{Cov}(X + Y, X - Y) = \operatorname{Var}(X) - \operatorname{Var}(Y)$ without appealing to the general formulas for the covariance of linear combinations of sets of random variables; use the basic identity $\operatorname{Cov}(Z_1, Z_2) = E[Z_1 Z_2] - E[Z_1]E[Z_2]$, valid for any two random variables, and the properties of the expected value. 2) Let $X$ be a normal random variable with zero mean and standard deviation $\sigma$. Let $\Phi(t)$ be the distribution function of the standard normal random variable....
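Before writing the proof for 1), it can help to confirm the identity numerically. A minimal Monte Carlo sketch (the normal distributions and variances 4 and 1 are illustrative choices):

```python
import numpy as np

# Check Cov(X + Y, X - Y) = Var(X) - Var(Y) on simulated data.
rng = np.random.default_rng(2)
n = 300_000
X = rng.normal(0, 2, n)   # Var(X) = 4
Y = rng.normal(0, 1, n)   # Var(Y) = 1

lhs = np.cov(X + Y, X - Y)[0, 1]
print(lhs)  # close to Var(X) - Var(Y) = 3
```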
9. Let $X$ and $Y$ be independent and identically distributed random variables with mean $\mu$ and variance $\sigma^2$. Find the following: (a) $E[(X + 2)^2]$ (b) $\operatorname{Var}(3X + 4)$ (c) $E[(X - Y)^2]$ (d) $\operatorname{Cov}(X + Y, X - Y)$
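A quick simulation can be used to check answers to (b)-(d). This sketch assumes the squared terms that the problem's formatting appears to have dropped in (a) and (c), and uses the illustrative choices $\mu = 1$, $\sigma = 2$ with normal $X, Y$; the targets are $\operatorname{Var}(3X+4) = 9\sigma^2$, $E[(X-Y)^2] = 2\sigma^2$, and $\operatorname{Cov}(X+Y, X-Y) = 0$ for i.i.d. variables.

```python
import numpy as np

# Monte Carlo check of the closed-form answers for i.i.d. X, Y.
rng = np.random.default_rng(3)
mu, sigma, n = 1.0, 2.0, 400_000
X = rng.normal(mu, sigma, n)
Y = rng.normal(mu, sigma, n)

var_b = np.var(3 * X + 4)            # close to 9*sigma^2 = 36
e_c = np.mean((X - Y) ** 2)          # close to 2*sigma^2 = 8
cov_d = np.cov(X + Y, X - Y)[0, 1]   # close to 0 (equal variances)
print(var_b, e_c, cov_d)
```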
1. Let $X$ and $Y$ be random variables, with $\mu_X = E(X)$, $\mu_Y = E(Y)$, $\sigma_X^2 = \operatorname{Var}(X)$ and $\sigma_Y^2 = \operatorname{Var}(Y)$. (1) If $a$, $b$, $c$ and $d$ are fixed real numbers, (a) show $\operatorname{Cov}(aX + b, cY + d) = ac \operatorname{Cov}(X, Y)$; (b) show $\operatorname{Corr}(aX + b, cY + d) = \rho_{XY}$ for $a > 0$ and $c > 0$.
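Both claims have a clean numerical counterpart: on any fixed sample, the sample covariance scales by exactly $ac$ under affine maps, and the sample correlation is unchanged when $a, c > 0$. A sketch (the constants and the construction of a correlated $Y$ are illustrative):

```python
import numpy as np

# Affine transformations: sample covariance scales by a*c, sample
# correlation is invariant for a, c > 0.
rng = np.random.default_rng(4)
n = 300_000
X = rng.normal(0, 1, n)
Y = 0.5 * X + rng.normal(0, 1, n)   # make Y correlated with X
a, b, c, d = 2.0, 3.0, 5.0, -1.0

cov_xy = np.cov(X, Y)[0, 1]
cov_ab = np.cov(a * X + b, c * Y + d)[0, 1]
corr_xy = np.corrcoef(X, Y)[0, 1]
corr_ab = np.corrcoef(a * X + b, c * Y + d)[0, 1]
print(cov_ab, a * c * cov_xy)  # (a): equal up to rounding
print(corr_ab, corr_xy)        # (b): equal up to rounding
```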
2. The joint density of $X$ and $Y$ is given by $f_{X,Y}(x, y) = 8xy$ for $0 \le y \le x \le 1$, and $0$ otherwise. (a) Find $f_{X|Y}(x|y)$. (b) Set up the integrals (do not evaluate) for evaluating $\operatorname{Cov}(X, Y)$. 3. In this question, you will identify the distribution of the sum of independent random variables. I expect you will find that the mgf approach is your friend. (a) Let $X$ and $Y$ be independent Poisson random variables with means $\lambda_1$ and $\lambda_2$, respectively, and...
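The integrals in 2(b) can be cross-checked by simulation. This sketch assumes the garbled density reads $f(x,y) = 8xy$ on $0 \le y \le x \le 1$; under that reading, $f_X(x) = 4x^3$ (so $X = U^{1/4}$ by inversion) and $Y \mid X = x$ has density $2y/x^2$ (so $Y = x\sqrt{V}$), giving $\operatorname{Cov}(X,Y) = \tfrac{4}{9} - \tfrac{4}{5}\cdot\tfrac{8}{15} = \tfrac{4}{225}$.

```python
import numpy as np

# Sample (X, Y) from the assumed density 8xy on the triangle 0<=y<=x<=1
# by inverse transform, then estimate Cov(X, Y).
rng = np.random.default_rng(5)
n = 400_000
X = rng.uniform(size=n) ** 0.25          # F_X(x) = x^4
Y = X * np.sqrt(rng.uniform(size=n))     # F_{Y|X}(y|x) = y^2/x^2

cov_hat = np.cov(X, Y)[0, 1]
print(cov_hat)  # close to 4/225 ≈ 0.0178
```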
5. Let $X_1, X_2, \dots, X_n$ be a random sample from a distribution with finite variance. Show that (i) $\operatorname{Cov}(X_i - \bar{X}, \bar{X}) = 0$ for each $i$, and (ii) $\rho(X_i - \bar{X}, X_j - \bar{X}) = -\frac{1}{n-1}$ for $i \neq j$, $i, j = 1, \dots, n$. (Recall that $\rho(X, Y) = \operatorname{Cov}(X, Y) / \sqrt{\operatorname{Var}(X)\operatorname{Var}(Y)}$ for any two random variables $X$ and $Y$.)
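Both results can be seen empirically by simulating many samples of a fixed size. A sketch with $n = 5$ and standard normal data (illustrative choices):

```python
import numpy as np

# Simulate 200,000 samples of size n = 5 and check that the deviations
# X_i - Xbar are uncorrelated with Xbar and have pairwise correlation
# -1/(n-1) = -0.25.
rng = np.random.default_rng(6)
n, reps = 5, 200_000
X = rng.normal(0, 1, (reps, n))
Xbar = X.mean(axis=1)
D = X - Xbar[:, None]        # column i holds X_i - Xbar across samples

cov_i = np.cov(D[:, 0], Xbar)[0, 1]
rho_ij = np.corrcoef(D[:, 0], D[:, 1])[0, 1]
print(cov_i)   # (i): close to 0
print(rho_ij)  # (ii): close to -1/(n-1) = -0.25
```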
Let $X$ and $Y$ be two independent random variables. Show that $\operatorname{Cov}(X, XY) = E(Y)\operatorname{Var}(X)$.
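A Monte Carlo sketch (not the requested proof) of the identity, with illustrative distributions for $X$ and $Y$:

```python
import numpy as np

# With X and Y independent, Cov(X, XY) should equal E(Y) * Var(X).
rng = np.random.default_rng(7)
n = 400_000
X = rng.normal(1, 2, n)       # Var(X) = 4
Y = rng.uniform(0, 3, n)      # E(Y) = 1.5, independent of X

cov_hat = np.cov(X, X * Y)[0, 1]
print(cov_hat)  # close to 1.5 * 4 = 6
```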
Consider the process $Y_t = Y_0 + \sum_{i=1}^{t} e_i$, where $Y_0 \sim (\mu, \sigma^2)$ and the $e_i$'s are zero-mean, independent identically distributed random variables with variance 1. Is $\{Y_t\}$ a stationary process? How about the process $\nabla Y_t = Y_t - Y_{t-1}$? Explain.
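The key observation, $\operatorname{Var}(Y_t) = \sigma^2 + t$ growing with $t$ while $\nabla Y_t = e_t$ has constant variance, shows up clearly in simulation. A sketch with normal increments and $\mu = 0$, $\sigma = 1$ (illustrative choices):

```python
import numpy as np

# Var(Y_t) = sigma^2 + t depends on t (not stationary); the differenced
# series nabla Y_t = e_t has constant variance 1 (stationary).
rng = np.random.default_rng(8)
reps, T, mu, sigma = 100_000, 50, 0.0, 1.0
Y0 = rng.normal(mu, sigma, reps)
e = rng.normal(0, 1, (reps, T))
Y = Y0[:, None] + e.cumsum(axis=1)     # Y_t = Y_0 + e_1 + ... + e_t

var10 = Y[:, 9].var()                  # t = 10: close to sigma^2 + 10 = 11
var50 = Y[:, 49].var()                 # t = 50: close to sigma^2 + 50 = 51
var_diff = np.diff(Y, axis=1)[:, 0].var()  # close to 1 at every t
print(var10, var50, var_diff)
```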
Let $\{e_t\}$ denote a white noise process from a normal distribution with $E[e_t] = 0$, $\operatorname{Var}(e_t) = \sigma_e^2$ and $\operatorname{Cov}(e_t, e_s) = 0$ for $t \neq s$. Define a new time series $\{Y_t\}$ by $Y_t = e_t + 0.6\,e_{t-1} - 0.4\,e_{t-2} + 0.2\,e_{t-3}$. 1. Find $E(Y_t)$ and $\operatorname{Var}(Y_t)$. 2. Find $\operatorname{Cov}(Y_t, Y_{t-k})$ for $k = 1, 2, \dots$
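For a moving-average series like this, $\gamma_k = \operatorname{Cov}(Y_t, Y_{t-k}) = \sigma_e^2 \sum_j \theta_j \theta_{j+k}$ with coefficient vector $\theta = (1, 0.6, -0.4, 0.2)$, and $\gamma_k = 0$ for $k > 3$. A short sketch that evaluates these sums (taking $\sigma_e^2 = 1$ for concreteness):

```python
import numpy as np

# Autocovariances of Y_t = e_t + 0.6 e_{t-1} - 0.4 e_{t-2} + 0.2 e_{t-3}
# via gamma_k = sigma_e^2 * sum_j theta_j * theta_{j+k}.
sigma_e2 = 1.0
theta = np.array([1.0, 0.6, -0.4, 0.2])

gamma = [sigma_e2 * float(np.dot(theta[: len(theta) - k], theta[k:]))
         for k in range(4)]
print(gamma)  # close to [1.56, 0.28, -0.28, 0.2]; gamma_k = 0 for k > 3
```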