Suppose that X1 ∼ Geometric(1/3) and X2 ∼ Geometric(1/6).
(a) Do you have enough information to compute the expected value of Y = X1 + X2? If so, do so; if not, explain why not.
(b) Do you have enough information to compute the variance of Y = X1 + X2? If so, do so; if not, explain why not.
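A quick Monte Carlo sketch of parts (a) and (b). It uses the number-of-trials convention (support 1, 2, ...), so E[X] = 1/p and Var(X) = (1−p)/p²; note it *assumes* X1 and X2 are independent, which is exactly the extra information part (b) needs and part (a) does not:

```python
import random

random.seed(0)

def geometric(p):
    """Number of Bernoulli(p) trials up to and including the first success."""
    n = 1
    while random.random() >= p:
        n += 1
    return n

N = 200_000
ys = [geometric(1/3) + geometric(1/6) for _ in range(N)]
mean_y = sum(ys) / N
var_y = sum((y - mean_y) ** 2 for y in ys) / N

# Linearity of expectation gives E[Y] = 3 + 6 = 9 with no independence needed;
# the variance 6 + 30 = 36 holds only under the independence assumed here.
print(mean_y, var_y)
```

The simulated mean lands near 9 regardless of how X1 and X2 are coupled; the simulated variance lands near 36 only because the draws above are independent.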
Suppose you have a random sample {X1, X2, X3} of size n = 3 from a population with mean μ and variance σ². Consider the following three possible estimators for the population mean:
μ̂1 = (X1 + X2 + X3)/3
μ̂2 = X1/4 + X2/2 + X3/4
μ̂3 = (X1 + X2 + X3)/4
(a) What is the bias associated with each estimator? (b) What is the variance associated with each estimator? (c) Does the fact that Var(μ̂3) < Var(μ̂1) contradict the statement that X̄ is the minimum variance unbiased estimator? Why or why not?
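Since each estimator is a linear combination Σ cᵢXᵢ of i.i.d. draws, its mean is μ·Σcᵢ and its variance is σ²·Σcᵢ², so bias and variance reduce to coefficient arithmetic. A sketch with exact fractions (the names mu_hat_1, etc. are just labels for the three estimators above):

```python
from fractions import Fraction as F

# Each estimator is sum(c_i * X_i); bias = (sum(c_i) - 1) * mu,
# variance = sum(c_i^2) * sigma^2, since the X_i are i.i.d.
estimators = {
    "mu_hat_1": [F(1, 3)] * 3,
    "mu_hat_2": [F(1, 4), F(1, 2), F(1, 4)],
    "mu_hat_3": [F(1, 4)] * 3,
}
results = {}
for name, c in estimators.items():
    bias_coeff = sum(c) - 1                # multiply by mu to get the bias
    var_coeff = sum(ci ** 2 for ci in c)   # multiply by sigma^2 to get the variance
    results[name] = (bias_coeff, var_coeff)
    print(name, bias_coeff, var_coeff)
```

This reports μ̂1 and μ̂2 as unbiased with variances σ²/3 and 3σ²/8, and μ̂3 as biased (bias −μ/4) with variance 3σ²/16, which is the tension part (c) asks about: μ̂3 beats μ̂1 on variance only by giving up unbiasedness.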
Q. Let X1 and X2 be two random variables, and let Y = (X1 + X2)². Suppose that E[Y] = 25 and that the variances of X1 and X2 are 9 and 16, respectively.
(a) Suppose that both X1 and X2 have mean zero. Then the...
6. Suppose random variables X1, X2, X3 have the following properties:
E(X1) = 1; E(X2) = 2; E(X3) = −1
V(X1) = 1; V(X2) = 3; V(X3) = 5
Cov(X1, X2) = 7; Cov(X1, X3) = −4; Cov(X2, X3) = 2
Let U = X1 − 2X2 + X3 and W = 3X1 + X2. (a) Find V(U). (b) Find Cov(U, W).
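With coefficient vectors a = (1, −2, 1) for U and b = (3, 1, 0) for W, both answers are forms in the covariance matrix Σ: V(U) = aᵀΣa and Cov(U, W) = aᵀΣb. A sketch of that computation (note: the covariances as stated are not actually attainable, since Cov(X1, X2) = 7 exceeds √(V(X1)V(X2)) = √3, so V(U) comes out negative; the mechanics of the formula are still the point of the exercise):

```python
# Quadratic/bilinear-form computation for V(U) and Cov(U, W).
Sigma = [[1, 7, -4],
         [7, 3, 2],
         [-4, 2, 5]]
a = [1, -2, 1]   # U = X1 - 2*X2 + X3
b = [3, 1, 0]    # W = 3*X1 + X2

def quad(u, M, v):
    return sum(u[i] * M[i][j] * v[j] for i in range(3) for j in range(3))

var_U = quad(a, Sigma, a)    # a^T Sigma a; negative here, flagging the
cov_UW = quad(a, Sigma, b)   # inconsistent inputs rather than a formula error
print(var_U, cov_UW)   # -26 -48
```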
Q2. Suppose X1, X2, X3 are independent Bernoulli random variables with p = 0.5. Let Yi be the partial sums, i.e., Y1 = X1, Y2 = X1 + X2, Y3 = X1 + X2 + X3. 1. What is the distribution of each Yi, i = 1, 2, 3? 2. What is the expected value of Y1 + Y2 + Y3? 3. Are Y1 and Y2 independent? Explain by computing their joint p.m.f. 4. What is the variance of Y1...
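Because there are only 2³ = 8 equally likely outcomes for (X1, X2, X3), parts 2 and 3 can be checked by exhaustive enumeration. A sketch that builds the joint p.m.f. of (Y1, Y2) and the expectation of the sum:

```python
from itertools import product
from fractions import Fraction as F

joint = {}        # joint p.m.f. of (Y1, Y2)
ey_sum = F(0)     # E[Y1 + Y2 + Y3]
for xs in product([0, 1], repeat=3):
    y1, y2, y3 = xs[0], xs[0] + xs[1], sum(xs)
    p = F(1, 8)   # each outcome of three fair Bernoulli trials is equally likely
    joint[(y1, y2)] = joint.get((y1, y2), F(0)) + p
    ey_sum += p * (y1 + y2 + y3)

# Dependence shows up immediately: (Y1, Y2) = (0, 2) is impossible,
# yet P(Y1 = 0) * P(Y2 = 2) = 1/2 * 1/4 = 1/8 > 0.
print(ey_sum)
print(joint.get((0, 2), F(0)))
```

The enumeration gives E[Y1 + Y2 + Y3] = 1/2 + 1 + 3/2 = 3, and the zero cell in the joint p.m.f. is enough to answer part 3 in the negative.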
1. Suppose that X1, X2, and X3 are random variables with E(X1) = 0, E(X2) = 1, E(X3) = 1, Var(X1) = 1, Var(X2) = 2, Var(X3) = 3, Cov(X1, X2) = −1, Cov(X2, X3) = 1, where X1 and X3 are independent. (a) Find the covariance Cov(X1 + X2, X1 − X3). (b) Define U = 2X1 − X2 + X3. Find the mean and variance of U.
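Both parts reduce to bilinearity of covariance: writing each combination as a coefficient vector against the covariance matrix Σ (with Cov(X1, X3) = 0 by independence), Cov(uᵀX, vᵀX) = uᵀΣv. A sketch of that bookkeeping:

```python
Sigma = [[1, -1, 0],
         [-1, 2, 1],
         [0, 1, 3]]   # Cov(X1, X3) = 0 since X1 and X3 are independent
mu = [0, 1, 1]

def bilinear(u, v):
    return sum(u[i] * Sigma[i][j] * v[j] for i in range(3) for j in range(3))

# (a) Cov(X1 + X2, X1 - X3) via coefficient vectors
a1, a2 = [1, 1, 0], [1, 0, -1]
cov_a = bilinear(a1, a2)

# (b) U = 2X1 - X2 + X3: mean by linearity, variance as c^T Sigma c
c = [2, -1, 1]
mean_U = sum(ci * mi for ci, mi in zip(c, mu))
var_U = bilinear(c, c)
print(cov_a, mean_U, var_U)   # -1 0 11
```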
Let X1 and X2 have joint PDF f(x1, x2) = x1 + x2 for 0 < x1 < 1 and 0 < x2 < 1. (a) Find the covariance and correlation of X1 and X2. (b) Find the conditional mean and conditional variance of X1 given X2 = x2.
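The integrals here are all polynomial, so a symbolic check is easy. A sketch using SymPy (the helper E is just a shorthand for expectation under this joint density, not part of the problem):

```python
from sympy import symbols, integrate, sqrt, Rational, simplify

x1, x2 = symbols("x1 x2", positive=True)
f = x1 + x2   # joint density on the unit square

def E(g):
    """Expectation of g(X1, X2) under the joint density f."""
    return integrate(g * f, (x1, 0, 1), (x2, 0, 1))

cov = E(x1 * x2) - E(x1) * E(x2)
var1 = E(x1**2) - E(x1)**2
var2 = E(x2**2) - E(x2)**2
corr = cov / sqrt(var1 * var2)

# (b) condition on X2 = x2: divide the joint density by the marginal of X2
f2 = integrate(f, (x1, 0, 1))                   # marginal of X2
cond_mean = integrate(x1 * f / f2, (x1, 0, 1))  # E[X1 | X2 = x2]
print(cov, simplify(corr), simplify(cond_mean))
```

By symmetry Var(X1) = Var(X2) = 11/144, and the computation yields Cov = −1/144 and correlation −1/11; the conditional mean comes out as a ratio of polynomials in x2 rather than a constant, confirming the mild negative dependence.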
Properties of Expectation and Variance. Suppose we have two independent discrete random variables, say X1 and X2. Suppose further that E(X1) = 21, Var(X1) = 126, and E(X2) = 3.36, Var(X2) = 1.38. Compute the expectations and variances of the following linear combinations of X1 and X2: a) E(πX1 + eX2 + 17) b) E(X1 · 3X2) c) Var(√13·X2 + 46) d) Var(X1 + 2X2 + 14)
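Each part is one rule: expectation is linear, E(XY) factors under independence, additive constants drop out of variance, and scale factors come out of variance squared. A sketch of the arithmetic:

```python
import math

E1, V1 = 21, 126       # X1
E2, V2 = 3.36, 1.38    # X2, independent of X1

a = math.pi * E1 + math.e * E2 + 17   # linearity of expectation
b = 3 * E1 * E2                       # E(X1 * 3*X2) = 3*E(X1)*E(X2) by independence
c = 13 * V2                           # Var(sqrt(13)*X2 + 46): +46 drops, sqrt(13) squares
d = V1 + 4 * V2                       # Var(X1 + 2*X2 + 14): variances add by independence
print(a, b, c, d)
```

Parts b) and d) are the only ones that actually use independence; a) and c) hold for any X1, X2 with these moments.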
3. Suppose X1, X2, ... are independent identically distributed random variables with mean 0 and variance 1. Let Sn = X1 + ... + Xn denote the partial sum, and let Fn denote the information contained in X1, ..., Xn. Suppose m < n. (1) Compute E[(Sn − Sm) | Fm]. (2) Compute E[Sm(Sn − Sm) | Fm]. (3) Compute E[Sn²]. (Hint: write Sn² = (Sm + (Sn − Sm))².) (4) Verify that Sn² − n is a martingale.
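The martingale claim in part (4) can be sanity-checked by brute force in a special case. Taking Rademacher steps Xᵢ = ±1 (an assumption; any mean-0, variance-1 distribution works for the actual problem), every path is equally likely, so the conditional expectation given Fm is just an average over all continuations of a fixed prefix:

```python
from itertools import product

# For each prefix of length m (the information in F_m), average S_n^2 - n
# over all equally likely +/-1 continuations and compare with S_m^2 - m.
n, m = 5, 2
checks = []
for prefix in product([-1, 1], repeat=m):
    s_m = sum(prefix)
    tails = list(product([-1, 1], repeat=n - m))
    cond_exp = sum((s_m + sum(t)) ** 2 - n for t in tails) / len(tails)
    checks.append(cond_exp == s_m ** 2 - m)
print(all(checks))
```

Every prefix passes, matching the algebra of parts (1)-(3): the cross term Sm(Sn − Sm) has conditional expectation 0 and (Sn − Sm)² contributes n − m, leaving Sm² − m.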
Suppose X1, X2,... are independent Geometric (number of trials)
random variables where Xi ~ Geometric(p = 1/i^2)
a) It is easily shown that Xn converges to a for some constant
a. Name it.
b) According to the Borel-Cantelli Lemmas, does Xn almost surely
converge to a?
Suppose X1, X2, X3 ~ Exp(1) and they are independent. (a) Compute the CDF of X1. (b) Let Y = max(X1, X2, X3). Find the CDF of Y. (c) Derive the PDF of Y.
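The key step for part (b) is that {max ≤ y} means all three draws are ≤ y, so independence makes the CDF of Y the product of the three individual CDFs: F_Y(y) = (1 − e^(−y))³. A simulation sketch comparing that formula against the empirical frequency at one test point (y0 = 1.5 is an arbitrary choice):

```python
import math
import random

random.seed(1)

def F_Y(y):
    # CDF of Y = max of three independent Exp(1) draws:
    # P(max <= y) = P(X1 <= y)^3 = (1 - e^{-y})^3
    return (1 - math.exp(-y)) ** 3

N = 100_000
y0 = 1.5
hits = sum(max(random.expovariate(1.0) for _ in range(3)) <= y0 for _ in range(N))
print(hits / N, F_Y(y0))   # the two numbers should agree to ~2 decimals
```

Differentiating the closed form gives the part (c) answer, f_Y(y) = 3(1 − e^(−y))² e^(−y) for y > 0.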