(7) [12 pts] Let X1, ..., XT denote a random sample of size T from X, where V[X] < ∞. Define a new random variable Z = a + bX, and for each t define Zt = a + bXt, for some constants a and b. (a) Show that Z̄ = a + bX̄ and s²_Z = b²s²_X, where the sample mean X̄ and sample variance s²_X of the original sample are as defined in class. (b) Prove that Z̄ is an unbiased estimator of E[Z]...
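Although the proof is algebraic, the identities in (a) hold exactly for every realized sample, which a quick numerical check makes concrete (a sketch, assuming the sample variance uses the usual 1/(T − 1) factor; the constants a, b and the sample size are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
a, b = 2.0, -3.0                 # arbitrary constants
x = rng.normal(size=50)          # one realized sample of size T = 50
z = a + b * x                    # the transformed sample Z_t = a + b*X_t

# Sample mean: Z-bar = a + b * X-bar holds exactly for this sample.
zbar_direct = z.mean()
zbar_formula = a + b * x.mean()

# Sample variance: s_Z^2 = b^2 * s_X^2 (the shift a cancels in the deviations).
s2_direct = z.var(ddof=1)
s2_formula = b**2 * x.var(ddof=1)
```

The identities hold per sample, not just in expectation, because both sides are the same algebraic function of the draws.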
X and Y are random variables. (a) Show that E(X) = E(E(X|Y)). (b) If P({X ≤ x, Y ≤ y}) = P({X ≤ x})P({Y ≤ y}), then show that E(XY) = E(X)E(Y), i.e. if two random variables are independent, show that they are uncorrelated. Is the reverse true? Prove or disprove. (c) The moment generating function of a random variable Z is defined as Ψ_Z(t) := E(e^{tZ}). Now if X and Y are independent random variables, then show that Ψ_{X+Y}(t) = Ψ_X(t)Ψ_Y(t). Also, if Ψ_X(t) = (λ − ... (d) Show the conditional...
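For the "is the reverse true?" question in (b), a standard counterexample is X ~ N(0, 1) and Y = X²: Cov(X, Y) = E[X³] = 0, so they are uncorrelated, yet Y is completely determined by X. A simulation sketch of this (the normal distribution is one convenient symmetric choice, not the only one):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=200_000)
y = x**2                          # a function of x, so certainly not independent

# Uncorrelated: Cov(X, X^2) = E[X^3] = 0 for any symmetric zero-mean distribution.
corr_xy = np.corrcoef(x, y)[0, 1]

# But dependent: Cov(X^2, Y) = Var(X^2) = 2 > 0, so x carries information about y.
cov_x2_y = np.cov(x**2, y)[0, 1]
```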
Let X and Y be independent identically distributed random variables with means µ_X and µ_Y respectively. Prove the following. a. E[aX + bY] = aµ_X + bµ_Y for any constants a and b. b. Var[X] = E[X²] − (E[X])². c. Var[aX] = a²Var[X] for any constant a. d. Assume for this part only that X and Y are not independent. Then Var[X + Y] = Var[X] + Var[Y] + 2(E[XY] − E[X]E[Y]). e. ...
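Part (d) has an exact finite-sample analogue: with sample variances and the sample covariance computed with the same ddof, var(x + y) = var(x) + var(y) + 2·cov(x, y) holds identically, which makes an easy sanity check (a sketch with deliberately dependent draws; the data-generating choice is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=10_000)
y = 0.5 * x + rng.normal(size=10_000)    # correlated with x on purpose

# Both sides are the same algebraic function of the data, so they agree exactly.
lhs = np.var(x + y, ddof=1)
rhs = np.var(x, ddof=1) + np.var(y, ddof=1) + 2 * np.cov(x, y, ddof=1)[0, 1]
```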
3. (5 marks) Let U be a random variable which has the continuous uniform distribution on the interval [−1, 1]. Recall that this means the density function f_U satisfies f_U(x) = 1/2 for −1 ≤ x ≤ 1, and f_U(x) = 0 otherwise. (a) Find the expected value and the variance of U. We now consider estimators for the expected value of U which use a sample of size 2. Let X1 and X2 be independent random variables with the same distribution as U. Let X̄ = (X1 + ...
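For part (a), E[U] = 0 by symmetry and Var(U) = ∫₋₁¹ x²·(1/2) dx = 1/3. A Monte Carlo sketch confirming both values (the sample size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
u = rng.uniform(-1.0, 1.0, size=1_000_000)   # draws from Uniform[-1, 1]

mean_u = u.mean()   # should be near E[U] = 0
var_u = u.var()     # should be near Var(U) = 1/3
```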
Let X and Y be two independent random variables such that E(X) = E(Y) = µ but σ_X and σ_Y are unequal. We define another random variable Z as the weighted average of the random variables X and Y, as Z = θX + (1 − θ)Y, where θ is a scalar and 0 ≤ θ ≤ 1. 1. Find the expected value of Z, E(Z), as a function of µ. 2. Find, in terms of σ_Y and...
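Whatever θ is, E(Z) = θµ + (1 − θ)µ = µ, and by independence Var(Z) = θ²σ_X² + (1 − θ)²σ_Y², which is minimized at θ* = σ_Y²/(σ_X² + σ_Y²). A simulation sketch under assumed values σ_X = 1, σ_Y = 2, θ = 0.3 (all illustrative choices, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(4)
mu, sx, sy, theta = 5.0, 1.0, 2.0, 0.3
n = 1_000_000

x = rng.normal(mu, sx, size=n)
y = rng.normal(mu, sy, size=n)
z = theta * x + (1 - theta) * y          # the weighted average Z

var_formula = theta**2 * sx**2 + (1 - theta)**2 * sy**2   # independence formula
theta_star = sy**2 / (sx**2 + sy**2)                       # variance-minimizing weight
```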
Let X1 and X2 be independent random variables with X1 ~ N(µ, 1) and X2 ~ N(µ, 4), where µ ∈ ℝ. a) Show that the likelihood for µ, given that X1 = x1 and X2 = x2, is L(µ) = (1/(4π)) exp(−(x1 − µ)²/2 − (x2 − µ)²/8). b) Show that the maximum likelihood estimate for µ is µ̂(x1, x2) = (4x1 + x2)/5. c) Show that µ̂(X1, X2) ~ N(µ, 4/5). d) Derive and enter a formula for the 95% confidence interval for µ.
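The MLE in b) is the precision-weighted average µ̂ = (x1/1 + x2/4)/(1 + 1/4) = (4x1 + x2)/5, and the claim in c) follows since Var(µ̂) = (16·1 + 1·4)/25 = 4/5. A simulation sketch (the value of µ and the replication count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)
mu, reps = 1.5, 200_000

x1 = rng.normal(mu, 1.0, size=reps)   # standard deviation 1 -> variance 1
x2 = rng.normal(mu, 2.0, size=reps)   # standard deviation 2 -> variance 4
mle = (4 * x1 + x2) / 5               # the precision-weighted MLE

mean_mle = mle.mean()                 # near mu: the estimator is unbiased
var_mle = mle.var()                   # near 4/5, matching N(mu, 4/5)
```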
12. Let X and Y be independent random variables, where X has a uniform distribution on the interval (0, 1/2), and Y has an exponential distribution with parameter λ = 1. (Remember to justify all of your answers.) (a) What is the joint distribution of X and Y? (b) What is P{(X > 0.25) ∪ (Y > 0.25)}? (c) What is the conditional distribution of X, given that Y = 3? (d) What is Var(Y − E[2X] + 3)? (e) What is...
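For (b), independence gives P{(X > 0.25) ∪ (Y > 0.25)} = 1 − P(X ≤ 0.25)P(Y ≤ 0.25) = 1 − (1/2)(1 − e^(−1/4)) ≈ 0.8894. A sketch checking the closed form against simulation:

```python
import math
import numpy as np

# Complement of the event {X <= 1/4 and Y <= 1/4}, using independence.
exact = 1 - 0.5 * (1 - math.exp(-0.25))

rng = np.random.default_rng(6)
n = 1_000_000
x = rng.uniform(0.0, 0.5, size=n)     # X ~ Uniform(0, 1/2)
y = rng.exponential(1.0, size=n)      # Y ~ Exponential with rate 1 (scale 1/lambda = 1)
mc = np.mean((x > 0.25) | (y > 0.25))
```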
1) Let X and Y be random variables. Show that Cov(X + Y, X − Y) = Var(X) − Var(Y) without appealing to the general formulas for the covariance of linear combinations of random variables; use the basic identity Cov(Z1, Z2) = E[Z1Z2] − E[Z1]E[Z2], valid for any two random variables, and the properties of the expected value. 2) Let X be a normal random variable with zero mean and standard deviation σ. Let Φ(t) be the distribution function of the standard normal random variable...
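The identity in 1) also holds exactly for sample variances and the sample covariance (same ddof on both sides), so it is easy to sanity-check numerically (a sketch; the dependent draws are an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.normal(0.0, 1.0, size=5_000)
y = 0.7 * x + rng.normal(0.0, 2.0, size=5_000)   # deliberately correlated with x

# np.cov defaults to ddof=1, matching np.var(..., ddof=1), so the match is exact.
lhs = np.cov(x + y, x - y)[0, 1]
rhs = np.var(x, ddof=1) - np.var(y, ddof=1)
```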
(a) Let X and Y be independent random variables both with the same mean µ ≠ 0. Define a new random variable W = aX + bY, where a and b are constants. (i) Obtain an expression for E(W). (ii) What constraint is there on the values of a and b so that W is an unbiased estimator of µ? Hence write all unbiased versions of W as a formula involving a, X and Y only (and not b). [2]
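Since E(W) = (a + b)µ, W is unbiased for µ exactly when a + b = 1, i.e. W = aX + (1 − a)Y. A quick simulation sketch of the unbiased case (the distributions and the value of a are illustrative assumptions, not given in the problem):

```python
import numpy as np

rng = np.random.default_rng(8)
mu, n, a = 3.0, 1_000_000, 0.7
b = 1 - a                          # the unbiasedness constraint a + b = 1

x = rng.normal(mu, 1.0, size=n)
y = rng.normal(mu, 2.0, size=n)
w = a * x + b * y                  # W = aX + (1 - a)Y

mean_w = w.mean()                  # near mu whenever a + b = 1
```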