Show that Cov(X, Y) = 0 when E[X | Y] = E[X]. Hint: apply the Law of Iterated Expectation.
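A one-line sketch of the argument, conditioning on Y as the hint suggests:

```latex
\operatorname{Cov}(X,Y) = E[XY] - E[X]\,E[Y], \qquad
E[XY] = E\bigl[E[XY \mid Y]\bigr] = E\bigl[Y\,E[X \mid Y]\bigr]
      = E\bigl[Y\,E[X]\bigr] = E[X]\,E[Y] \;\Rightarrow\; \operatorname{Cov}(X,Y) = 0.
```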
Exercise 2.6: Consider the models y = Xβ + ε and y* = X*β + ε*, where E(ε) = 0, Cov(ε) = σ²I, y* = Γy, X* = ΓX, ε* = Γε, and Γ is a known n × n orthogonal matrix. Show that: 1. E(ε*) = 0, Cov(ε*) = σ²I; 2. b* = b and s*² = s², where b and b* are the least squares estimates of β, and s² and s*² are the estimates of σ² obtained from the two models.
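The invariance in part 2 can be checked numerically. The sketch below (simulated data; the true β and noise scale are arbitrary choices, not from the exercise) fits both models and confirms that the coefficient estimates and residual variance estimates coincide:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 3
X = rng.normal(size=(n, p))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=n)

# Build a random n x n orthogonal matrix Gamma via QR decomposition.
Gamma, _ = np.linalg.qr(rng.normal(size=(n, n)))
y_star, X_star = Gamma @ y, Gamma @ X

# Least squares estimates from the original and transformed models.
b = np.linalg.lstsq(X, y, rcond=None)[0]
b_star = np.linalg.lstsq(X_star, y_star, rcond=None)[0]

# Residual variance estimates s^2 = RSS / (n - p).
s2 = np.sum((y - X @ b) ** 2) / (n - p)
s2_star = np.sum((y_star - X_star @ b_star) ** 2) / (n - p)

print(np.allclose(b, b_star), np.allclose(s2, s2_star))  # True True
```

The algebraic reason: since Γ'Γ = I, we get X*'X* = X'X and X*'y* = X'y, so the normal equations are identical; residuals transform as Γ(y − Xb), which preserves their norm.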
Let X and Y be two independent random variables. Show that Cov(X, XY) = E(Y) Var(X).
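A quick Monte Carlo sanity check of this identity (the particular distributions are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1_000_000
X = rng.normal(loc=2.0, scale=1.5, size=N)   # Var(X) = 2.25
Y = rng.exponential(scale=3.0, size=N)       # E(Y) = 3, independent of X

lhs = np.cov(X, X * Y)[0, 1]                 # sample Cov(X, XY)
rhs = Y.mean() * X.var(ddof=1)               # sample E(Y) * Var(X)
print(lhs, rhs)  # both close to the theoretical value 3 * 2.25 = 6.75
```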
10.3.8 Suppose that Y = E(Y | X) + Z, where X, Y and Z are random variables. (a) Show that E(Z | X) = 0. (b) Show that Cov(E(Y | X), Z) = 0. (Hint: Write Z = Y − E(Y | X) and use Theorems 3.5.2 and 3.5.4.) (c) Suppose that Z is independent of X. Show that this implies that the conditional distribution of Y given X depends on X only through its conditional mean. (Hint: Evaluate the conditional distribution function...
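A sketch for parts (a) and (b) (the theorem numbers cited in the hint refer to the source textbook and are not reproduced here):

```latex
\text{(a)}\quad E(Z \mid X) = E\bigl(Y - E(Y \mid X) \,\big|\, X\bigr)
             = E(Y \mid X) - E(Y \mid X) = 0. \\[4pt]
\text{(b)}\quad \operatorname{Cov}\bigl(E(Y \mid X),\, Z\bigr)
             = E\bigl[E(Y \mid X)\,Z\bigr] - E\bigl[E(Y \mid X)\bigr]\,E[Z]
             = E\bigl[E(Y \mid X)\,\underbrace{E(Z \mid X)}_{=\,0}\bigr] - 0 = 0.
```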
Let X and Y be i.i.d. random variables with finite second moments. Show that Cov(X + Y, X − Y) = 0.
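Expanding by bilinearity gives the result in one line:

```latex
\operatorname{Cov}(X+Y,\, X-Y)
  = \operatorname{Var}(X) - \operatorname{Cov}(X,Y) + \operatorname{Cov}(Y,X) - \operatorname{Var}(Y)
  = \operatorname{Var}(X) - \operatorname{Var}(Y) = 0,
```

since identically distributed variables with finite second moments have equal variances.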
Show that if Y = aX + b (a ≠ 0), then Corr(X, Y) = +1 or −1. We know Cov(X, Y) = Cov(X, aX + b) = a Cov(X, X) = a Var(X) = a σ_X². Since σ_Y = |a| σ_X, we have Corr(X, Y) = Cov(X, Y)/(σ_X σ_Y) = a σ_X²/(σ_X · |a| σ_X) = a/|a|, which is +1 when a > 0 and −1 when a < 0. Under what condition will ρ = +1? The value ρ = +1 when a > 0.
1. Suppose that E(X) = E(Y) = E(Z) = 2, Y and Z are independent, Cov(X, Y) = 1, Cov(X, Z) = −1, V(X) = V(Z) = 4, and V(Y) = 3. Let U = X − 3Y + Z and W = 2X + Y + Z. a. Compute E(U) and V(U). b. Compute Cov(U, W).
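The computation reduces to quadratic forms in the covariance matrix. A sketch (note: the original text is garbled, so the value Cov(X, Y) = 1 is a reconstruction, not certain):

```python
import numpy as np

# Reconstructed problem data: E(X) = E(Y) = E(Z) = 2, V(X) = V(Z) = 4,
# V(Y) = 3, Cov(X, Y) = 1 (assumed), Cov(X, Z) = -1,
# Cov(Y, Z) = 0 since Y and Z are independent.
mean = np.array([2.0, 2.0, 2.0])
Sigma = np.array([[ 4.0, 1.0, -1.0],
                  [ 1.0, 3.0,  0.0],
                  [-1.0, 0.0,  4.0]])

u = np.array([1.0, -3.0, 1.0])   # U = X - 3Y + Z
w = np.array([2.0,  1.0, 1.0])   # W = 2X + Y + Z

E_U = u @ mean                   # E(U) = u' mean
V_U = u @ Sigma @ u              # V(U) = u' Sigma u
Cov_UW = u @ Sigma @ w           # Cov(U, W) = u' Sigma w
print(E_U, V_U, Cov_UW)          # -2.0 27.0 -5.0
```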
4. Recall that the covariance of random variables X and Y is defined by Cov(X, Y) = E[(X − EX)(Y − EY)]. (a) (2 pt) TRUE or FALSE (circle one): E(XY) = 0 implies Cov(X, Y) = 0. (b) (4 pt) a, b, c, d are constants. Mark each correct statement: ( ) Cov(aX, cY) = ac Cov(X, Y) ( ) Cov(aX + b, cY + d) = ac Cov(X, Y) + bc Cov(X, Y) + da Cov(X, Y) + bd ( )...
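For part (b), the key fact is that additive constants shift the means but leave the covariance scaled only by the multiplicative factors. A numerical check (the constants and distributions are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=500_000)
Y = 0.5 * X + rng.normal(size=500_000)   # correlated with X

a, b, c, d = 3.0, -1.0, -2.0, 5.0
lhs = np.cov(a * X + b, c * Y + d)[0, 1]
rhs = a * c * np.cov(X, Y)[0, 1]
print(np.isclose(lhs, rhs))  # True: Cov(aX + b, cY + d) = ac Cov(X, Y)
```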
1) Let X and Y be random variables. Show that Cov(X + Y, X − Y) = Var(X) − Var(Y) without appealing to the general formulas for the covariance of linear combinations of random variables; use the basic identity Cov(Z1, Z2) = E[Z1 Z2] − E[Z1] E[Z2], valid for any two random variables, and the properties of the expected value. 2) Let X be a normal random variable with zero mean and standard deviation σ. Let Φ(t) be the distribution function of the standard normal random variable...
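Problem 1) can be checked on a sample, applying the basic identity directly; since (X + Y)(X − Y) = X² − Y², the identity holds exactly for sample moments too, not just in expectation (distributions below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(1.0, 2.0, size=200_000)   # Var(X) = 4
Y = rng.uniform(0.0, 3.0, size=200_000)  # Var(Y) = 0.75

Z1, Z2 = X + Y, X - Y
# Basic identity Cov(Z1, Z2) = E[Z1 Z2] - E[Z1] E[Z2], applied to the sample:
cov = (Z1 * Z2).mean() - Z1.mean() * Z2.mean()
target = X.var() - Y.var()               # sample Var(X) - Var(Y) (ddof=0)
print(np.isclose(cov, target))  # True: an algebraic identity, not just approximate
```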
Let X and Y be random variables. The conditional variance of Y given X, denoted Var(Y | X), is defined as Var(Y | X) = E[Y² | X] − E[Y | X]². Show that Var(Y) = E[Var(Y | X)] + Var(E[Y | X]). (This equality you are showing is known as the Law of Total Variance.) Hint: From the Law of Total Expectation, you get Var(Y) = E[Y²] − E[Y]²...
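The Law of Total Variance can be illustrated on a hierarchical model where both sides are computable by hand (the specific model below is an illustrative choice, not part of the exercise):

```python
import numpy as np

rng = np.random.default_rng(4)
N = 1_000_000
# X ~ Poisson(5); given X, Y ~ Normal(X, 2^2).
X = rng.poisson(5.0, size=N)
Y = rng.normal(loc=X, scale=2.0)

# Here Var(Y | X) = 4 for every X and E(Y | X) = X, so the law predicts
# Var(Y) = E[Var(Y | X)] + Var(E[Y | X]) = 4 + Var(X) = 4 + 5 = 9.
print(Y.var())  # close to 9
```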
Suppose that Cov(X, Y) = 0.9 and Cov(X, Z) = −0.7. (a) What is Cov(X, Y + Z)? (b) What is Cov(3X, −2Y)?
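Both parts follow directly from bilinearity of covariance:

```latex
\text{(a)}\quad \operatorname{Cov}(X,\, Y+Z) = \operatorname{Cov}(X,Y) + \operatorname{Cov}(X,Z) = 0.9 - 0.7 = 0.2, \\[4pt]
\text{(b)}\quad \operatorname{Cov}(3X,\, -2Y) = (3)(-2)\operatorname{Cov}(X,Y) = -6 \times 0.9 = -5.4.
```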