3. (5 marks) Let U be a random variable which has the continuous uniform distribution on the inte...
Let X1 and X2 be independent random variables with distribution functions F1 and F2, respectively, and let Y be a Bernoulli random variable with parameter p. Suppose that Y, X1, and X2 are independent. Prove, using the definition of the distribution function, that the distribution function of Z = Y·X1 + (1 − Y)·X2 is F = p·F1 + (1 − p)·F2. (Do not use moment generating functions or characteristic functions.)
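Solution sketch: the proof hinges on conditioning on Y and then using the assumed independence of Y from X1 and X2; writing F_Z for the distribution function of Z,

```latex
\begin{aligned}
F_Z(z) = P(Z \le z)
  &= P(Z \le z \mid Y = 1)\,P(Y = 1) + P(Z \le z \mid Y = 0)\,P(Y = 0)\\
  &= P(X_1 \le z)\,p + P(X_2 \le z)\,(1 - p)
     \qquad \text{(independence of $Y$ and $X_1, X_2$)}\\
  &= p\,F_1(z) + (1 - p)\,F_2(z) \quad \text{for every } z.
\end{aligned}
```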
5. Let X1, X2, ..., Xn be a random sample from a distribution with finite variance. Show that (i) Cov(Xi − X̄, X̄) = 0 for each i, and (ii) ρ(Xi − X̄, Xj − X̄) = −1/(n − 1) for i ≠ j, i, j = 1, ..., n. (Recall that ρ(X, Y) = Cov(X, Y)/√(Var(X) Var(Y)) for any two random variables X and Y.)
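Solution sketch, writing σ² for the common variance and using bilinearity of covariance:

```latex
\begin{aligned}
\operatorname{Cov}(X_i - \bar X, \bar X)
  &= \operatorname{Cov}(X_i, \bar X) - \operatorname{Var}(\bar X)
   = \frac{\sigma^2}{n} - \frac{\sigma^2}{n} = 0,\\
\operatorname{Cov}(X_i - \bar X, X_j - \bar X)
  &= \underbrace{\operatorname{Cov}(X_i, X_j)}_{0}
     - \frac{\sigma^2}{n} - \frac{\sigma^2}{n} + \frac{\sigma^2}{n}
   = -\frac{\sigma^2}{n} \quad (i \ne j),\\
\operatorname{Var}(X_i - \bar X)
  &= \sigma^2 - \frac{2\sigma^2}{n} + \frac{\sigma^2}{n} = \frac{(n-1)\sigma^2}{n},\\
\rho(X_i - \bar X, X_j - \bar X)
  &= \frac{-\sigma^2/n}{(n-1)\sigma^2/n} = -\frac{1}{n-1}.
\end{aligned}
```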
7. Section 6.4, Exercise 1. Let X1, ..., Xn be a random sample from the U(0, θ) distribution, and let T1 = 2X̄ and T2 = max(X1, ..., Xn) be estimators for θ. It is given that the mean and variance of T2 are ... (a) Give an expression for the bias of each of the two estimators. Are they unbiased? (b) Give an expression for the MSE of each of the two estimators. (c) Compute the MSE of each of the two estimators for n...
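The expressions for the mean and variance of T2 were lost in extraction; for the maximum of n i.i.d. U(0, θ) variables the standard facts (presumably what the exercise supplies) are E[T2] = nθ/(n + 1) and Var(T2) = nθ²/((n + 1)²(n + 2)). A sketch of parts (a) and (b) under those facts:

```latex
\begin{aligned}
E[T_1] &= 2\,E[\bar X] = 2 \cdot \tfrac{\theta}{2} = \theta
  &&\Rightarrow\ \operatorname{bias}(T_1) = 0 \ \text{(unbiased)},\\
E[T_2] &= \frac{n}{n+1}\,\theta
  &&\Rightarrow\ \operatorname{bias}(T_2) = -\frac{\theta}{n+1} \ \text{(biased)},\\
\operatorname{MSE}(T_1) &= \operatorname{Var}(T_1)
  = \frac{4}{n}\operatorname{Var}(X_1) = \frac{\theta^2}{3n},\\
\operatorname{MSE}(T_2) &= \operatorname{Var}(T_2) + \operatorname{bias}(T_2)^2
  = \frac{n\,\theta^2}{(n+1)^2(n+2)} + \frac{\theta^2}{(n+1)^2}
  = \frac{2\,\theta^2}{(n+1)(n+2)}.
\end{aligned}
```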
Let X1 and X2 be independent random variables with mean μ and variance σ². Suppose that we have two estimators of μ: θ̂1 = (X1 + X2)/2 and θ̂2 = (X1 + 3X2)/2. (a) Are both estimators unbiased estimators of μ? (b) What is the variance of each estimator?
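Solution sketch, assuming the second estimator was reconstructed correctly as (X1 + 3X2)/2 (its denominator is garbled in the source):

```latex
\begin{aligned}
E[\hat\theta_1] &= \tfrac{1}{2}(\mu + \mu) = \mu \ \text{(unbiased)},
  & \operatorname{Var}(\hat\theta_1) &= \tfrac{1}{4}(\sigma^2 + \sigma^2) = \tfrac{\sigma^2}{2},\\
E[\hat\theta_2] &= \tfrac{1}{2}(\mu + 3\mu) = 2\mu \ \text{(biased unless $\mu = 0$)},
  & \operatorname{Var}(\hat\theta_2) &= \tfrac{1}{4}(\sigma^2 + 9\sigma^2) = \tfrac{5\sigma^2}{2}.
\end{aligned}
```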
4. Let X1, X2, ..., Xn be a random sample from the N(μ, σ²) distribution, and let S*² = (1/n) Σ_{i=1}^n (Xi − X̄)² and S² = (1/(n − 1)) Σ_{i=1}^n (Xi − X̄)² be the estimators of σ². (i) Show that the MSE of S*² is smaller than the MSE of S². (ii) Find E[√S²] and suggest an unbiased estimator of σ.
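Solution sketch, using the fact that (n − 1)S²/σ² has a χ²_{n−1} distribution:

```latex
\begin{aligned}
\operatorname{MSE}(S^2) &= \operatorname{Var}(S^2) = \frac{2\sigma^4}{n-1},\\
\operatorname{MSE}(S^{*2}) &= \operatorname{Var}(S^{*2}) + \operatorname{bias}(S^{*2})^2
  = \frac{2(n-1)\sigma^4}{n^2} + \frac{\sigma^4}{n^2}
  = \frac{(2n-1)\sigma^4}{n^2} < \frac{2\sigma^4}{n-1},\\
E\!\left[\sqrt{S^2}\right]
  &= \sigma \sqrt{\frac{2}{n-1}}\;\frac{\Gamma(n/2)}{\Gamma((n-1)/2)},
\end{aligned}
```

where the inequality follows from (2n − 1)(n − 1) = 2n² − 3n + 1 < 2n², and dividing √S² by the constant √(2/(n − 1)) Γ(n/2)/Γ((n − 1)/2) yields an unbiased estimator of σ.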
4. Let X1, X2, ... be uncorrelated random variables, such that Xn has a uniform distribution over [−1/n, 1/n]. Does the sequence converge in probability? 5. Let X1, X2, ... be independent random variables, such that P(Xn = 1) = P(Xn = −1) = 1/2. Does the sequence (X1 + X2 + ... + Xn)/n satisfy the WLLN? Does it converge in probability to 0?
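Sketches of both arguments; note that the ±1 distribution in problem 5 is itself a reconstruction of garbled text, so treat it as an assumption:

```latex
\begin{aligned}
&\text{(4)}\quad P(|X_n| > \varepsilon) = 0 \ \text{ as soon as } \tfrac{1}{n} \le \varepsilon,
  \ \text{ so } X_n \xrightarrow{P} 0.\\
&\text{(5)}\quad \text{With } \bar X_n = \tfrac{1}{n}\textstyle\sum_{k=1}^n X_k:\quad
  E[\bar X_n] = 0, \quad \operatorname{Var}(\bar X_n) = \tfrac{1}{n},\\
&\qquad\quad P(|\bar X_n| > \varepsilon) \le \frac{1}{n\varepsilon^2} \xrightarrow{\,n\to\infty\,} 0
  \ \text{ (Chebyshev), so the WLLN holds and } \bar X_n \xrightarrow{P} 0.
\end{aligned}
```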
Let X1, X2, ..., Xn be an i.i.d. sample from some distribution with mean μ and variance σ². Let us construct several estimators for μ: μ̂1 = X̄, μ̂2 = X1, μ̂3 = (X1 + X2)/2, μ̂4 = X1 + X2. (a) Are they unbiased estimators for μ? (b) Compute the MSE for all 4 estimators. (c) Which one is the best estimator for μ? Why?
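Solution sketch (MSE = variance + bias²):

```latex
\begin{aligned}
E[\hat\mu_1] &= \mu, & \operatorname{MSE}(\hat\mu_1) &= \sigma^2/n,\\
E[\hat\mu_2] &= \mu, & \operatorname{MSE}(\hat\mu_2) &= \sigma^2,\\
E[\hat\mu_3] &= \mu, & \operatorname{MSE}(\hat\mu_3) &= \sigma^2/2,\\
E[\hat\mu_4] &= 2\mu \ \text{(biased)}, & \operatorname{MSE}(\hat\mu_4) &= 2\sigma^2 + \mu^2.
\end{aligned}
```

For n ≥ 2 the sample mean μ̂1 has the smallest MSE of the four, which is why it is the preferred estimator.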
Let X1, ..., Xn be i.i.d. N(μ, σ²) random variables, where μ is a known parameter and σ² is the unknown parameter. Let γ(σ²) = σ². (i) Find the CRLB for γ(σ²). (ii) Recall that S² is an unbiased estimator for σ². Compare Var(S²) to the CRLB for γ(σ²).
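A sketch of the comparison, treating θ = σ² as the parameter of the N(μ, θ) likelihood:

```latex
\begin{aligned}
I_n(\sigma^2) &= \frac{n}{2\sigma^4}
  \quad\Rightarrow\quad \text{CRLB} = \frac{2\sigma^4}{n},\\
\operatorname{Var}(S^2) &= \frac{2\sigma^4}{n-1} > \frac{2\sigma^4}{n},
\end{aligned}
```

so S² does not attain the bound; with μ known, the estimator (1/n) Σ (Xi − μ)² has variance exactly 2σ⁴/n and does attain it.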
3. A random variable X is said to have a Cauchy(α, β) distribution if and only if it has PDF f(x) = 1/(πβ[1 + ((x − α)/β)²]), −∞ < x < ∞. Now, suppose that X1 and X2 are independent Cauchy(0, 1) random variables, and let Y = X1 + X2. Use the transformation technique to find and identify the distribution of Y by first finding the joint distribution of X1 and Y. (Section 3.4)
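Solution sketch: the map (X1, X2) ↦ (X1, Y) with Y = X1 + X2 has Jacobian 1, so

```latex
\begin{aligned}
f_{X_1, Y}(x, y) &= f_{X_1}(x)\, f_{X_2}(y - x)
  = \frac{1}{\pi^2 (1 + x^2)\bigl(1 + (y - x)^2\bigr)},\\
f_Y(y) &= \int_{-\infty}^{\infty} f_{X_1, Y}(x, y)\, dx
  = \frac{2}{\pi (4 + y^2)}, \qquad -\infty < y < \infty,
\end{aligned}
```

which is the Cauchy(0, 2) density, so Y ~ Cauchy(0, 2).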