The second question has not been provided in full, so I take it that you are seeking an answer for the first one.
Q1 and Q2 (please also show the steps):

Q1 Prove that MSE(θ̂) = Var(θ̂) + Bias(θ̂)², i.e.,

E[(θ̂ − θ)²] = E[(θ̂ − E(θ̂))²] + [E(θ̂) − θ]².

Q2 Suppose X1, X2, ..., Xn are i.i.d. Bernoulli random variables with probability of success p. It is known that p̂ = (ΣXi)/n is an unbiased estimator for p.

1. Find E(p̂²) and show that p̂² is a biased estimator for p². (Hint: make use of the distribution of ΣXi and the fact that Var(Y) = E(Y²) − E(Y)².)
2. Suggest an unbiased estimator for p². (Hint: use the fact that the sample variance is unbiased for the variance.)
3. Show that p̃ = (ΣXi + 2)/(n + 4) is a biased estimator for p.
4. For what values of p is MSE(p̃) smaller than MSE(p̂)?
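Since ΣXi ~ Binomial(n, p), every expectation in Q1 and Q2 can be computed exactly by summing over the binomial pmf, with no simulation. The sketch below (names `p_hat` for (ΣXi)/n and `p_tilde` for (ΣXi + 2)/(n + 4) are just labels, and n = 10, p = 0.3 are arbitrary test values) checks the identity MSE = Var + Bias² for both estimators:

```python
from math import comb

def binom_pmf(n, p, k):
    """P(S = k) for S ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def moments(estimator, n, p):
    """Exact E[T], E[T^2], and MSE of T = estimator(S) for S ~ Binomial(n, p)."""
    mean = sum(binom_pmf(n, p, k) * estimator(k) for k in range(n + 1))
    second = sum(binom_pmf(n, p, k) * estimator(k) ** 2 for k in range(n + 1))
    mse = sum(binom_pmf(n, p, k) * (estimator(k) - p) ** 2 for k in range(n + 1))
    return mean, second, mse

n, p = 10, 0.3
p_hat = lambda s: s / n                 # the unbiased estimator (sum X_i)/n
p_tilde = lambda s: (s + 2) / (n + 4)   # the shrinkage estimator from part 3

for est in (p_hat, p_tilde):
    mean, second, mse = moments(est, n, p)
    var = second - mean**2
    bias = mean - p
    # the Q1 identity MSE = Var + Bias^2, to floating-point accuracy
    assert abs(mse - (var + bias**2)) < 1e-12
```

Running the check for p̂ also confirms the familiar facts E(p̂) = p and MSE(p̂) = p(1 − p)/n, while p̃ shows a nonzero bias, consistent with part 3.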
2. Suppose X1, X2, ..., Xn are i.i.d. random variables taking values in [0, 1] with density function f(x) = [Γ(2α)/Γ(α)²] x^(α−1)(1 − x)^(α−1), where α > 0 is the parameter of the distribution. It is known that E(X) = 1/2. Compute the method of moments estimator for α.
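Reading the garbled density as the symmetric Beta(α, α) form f(x) ∝ [x(1 − x)]^(α−1) (an assumption on my part), the first moment is constant at 1/2 and cannot identify α, so a method-of-moments sketch has to match the second moment instead: E(X²) = (α + 1)/(2(2α + 1)), which inverts to α = (1 − 2m₂)/(4m₂ − 1) for sample second moment m₂. The helper name `alpha_mom` and the sample size are illustrative choices:

```python
import random

def alpha_mom(xs):
    """Method-of-moments estimate of alpha for a Beta(alpha, alpha) sample,
    obtained by matching the second moment E(X^2) = (alpha+1)/(2(2alpha+1))."""
    m2 = sum(x * x for x in xs) / len(xs)
    return (1 - 2 * m2) / (4 * m2 - 1)

random.seed(0)
true_alpha = 2.0
sample = [random.betavariate(true_alpha, true_alpha) for _ in range(20000)]
est = alpha_mom(sample)  # should land near true_alpha for a large sample
```

As a sanity check of the algebra, a Beta(1, 1) (uniform) sample has m₂ ≈ 1/3, and the formula gives (1 − 2/3)/(4/3 − 1) = 1, as it should.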
3. Suppose that X1, X2, X3 are i.i.d. random variables with P(Xi = 0) = 2/5 and P(Xi = 1) = 3/5. Find the MGF of X1 + X2 + X3.
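Each Xi has MGF M(t) = 2/5 + (3/5)eᵗ, and independence gives M_{X1+X2+X3}(t) = (2/5 + (3/5)eᵗ)³. A quick numerical cross-check against the direct expectation E[e^{tS}] over the pmf of S = X1 + X2 + X3 ~ Binomial(3, 3/5):

```python
from math import comb, exp, isclose

def mgf_closed_form(t):
    # product of the three identical Bernoulli(3/5) MGFs
    return (2 / 5 + (3 / 5) * exp(t)) ** 3

def mgf_direct(t):
    # E[e^{tS}] summed over the pmf of S ~ Binomial(3, 3/5)
    return sum(comb(3, k) * (3 / 5) ** k * (2 / 5) ** (3 - k) * exp(t * k)
               for k in range(4))

# the two expressions agree at arbitrary test points
assert all(isclose(mgf_closed_form(t), mgf_direct(t)) for t in (-1.0, 0.0, 0.5, 2.0))
```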
Problem 2. (6 pts) Independence and Conditional Probability (a) (2 pts) An urn contains 3 red and 5 green balls. At each step of this game, we pick one ball at random, note its color, and return the ball to the urn together with another ball of the same color. Prove by induction that the probability that we pick a red ball at the n-th step is 3/8. (b) (2 pts) Consider any two random variables X, Y of...
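The claim in part (a) (a Pólya urn scheme) can be verified exactly by propagating the distribution over urn compositions with rational arithmetic: the induction asserts that P(red at step n) stays 3/8 for every n. The helper below is an illustrative sketch, not the required induction proof itself:

```python
from fractions import Fraction

def prob_red_at_step(n, red=3, green=5):
    """Exact P(the n-th draw is red) in a Polya urn starting with `red` red and
    `green` green balls (each drawn ball is returned with one extra of its color)."""
    # distribution over urn states (r, g) just before the current draw
    states = {(red, green): Fraction(1)}
    for _ in range(n - 1):
        nxt = {}
        for (r, g), pr in states.items():
            tot = r + g
            nxt[(r + 1, g)] = nxt.get((r + 1, g), Fraction(0)) + pr * Fraction(r, tot)
            nxt[(r, g + 1)] = nxt.get((r, g + 1), Fraction(0)) + pr * Fraction(g, tot)
        states = nxt
    return sum(pr * Fraction(r, r + g) for (r, g), pr in states.items())

# exactly 3/8 at every step, matching the induction claim
assert all(prob_red_at_step(n) == Fraction(3, 8) for n in range(1, 7))
```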
Central Limit Theorem: let X1, X2, ..., Xn be i.i.d. random variables with E(Xi) = μ and Var(Xi) = σ². Define Z = X1 + X2 + ... + Xn. As n grows, the distribution of Z converges to a Gaussian distribution: P(Z ≤ z) = 1 − Q((z − μ_Z)/σ_Z), where μ_Z = nμ and σ_Z = √n σ. Use MATLAB to demonstrate the central limit theorem. To achieve this, generate N i.i.d. random variables (with a distribution of your choice) and show that the distribution of their sum approaches a Gaussian distribution. Include the plot and the MATLAB code. Hint: you may find the hist() function helpful.
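The assignment asks for MATLAB; as a language-neutral sketch of the same experiment, the Python version below sums Uniform(0, 1) draws (an arbitrary choice of summand distribution) and compares the empirical CDF of Z against the Gaussian CDF Φ((z − μ_Z)/σ_Z). The sample sizes are illustrative:

```python
import random
from math import erf, sqrt

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1 + erf(z / sqrt(2)))

random.seed(1)
n = 30          # terms per sum
trials = 5000   # number of independent sums
mu, var = 0.5, 1 / 12            # mean and variance of Uniform(0, 1)
mu_z, sigma_z = n * mu, sqrt(n * var)

sums = [sum(random.random() for _ in range(n)) for _ in range(trials)]

# compare the empirical CDF of Z with the Gaussian CDF at a few points
max_err = max(
    abs(sum(s <= z for s in sums) / trials - phi((z - mu_z) / sigma_z))
    for z in (mu_z - 2 * sigma_z, mu_z - sigma_z, mu_z,
              mu_z + sigma_z, mu_z + 2 * sigma_z)
)
assert max_err < 0.03  # empirical CDF tracks the Gaussian closely
```

In MATLAB the same idea would use rand(n, trials), sum along the first dimension, and hist() (or histogram()) to overlay the empirical distribution on the Gaussian density.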
3. (21%) Consider a parallel circuit with component success probabilities as follows: C1 with success probability p1, C2 with success probability p2. (a) (3%) Compute the probability p of sending data successfully through this circuit as a function of p1 and p2. (b) (3%) If p1 + p2 = 1, find the maximum q1 and also the minimum q2 of p. Prove or reason that the maximum value q1 of p and minimum value q2 of p you find are really the maximum and...
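For part (a), a parallel circuit succeeds when at least one branch works, so p = 1 − (1 − p1)(1 − p2) = p1 + p2 − p1·p2 (assuming the two components fail independently). The sketch below cross-checks the closed form against brute-force enumeration of the four component states:

```python
from itertools import product

def parallel_success(p1, p2):
    """P(data gets through a two-branch parallel circuit):
    at least one of C1, C2 works, so p = 1 - (1 - p1)(1 - p2)."""
    return 1 - (1 - p1) * (1 - p2)

def by_enumeration(p1, p2):
    # sum the probability of every component state with >= 1 working branch
    total = 0.0
    for c1, c2 in product((0, 1), repeat=2):
        pr = (p1 if c1 else 1 - p1) * (p2 if c2 else 1 - p2)
        if c1 or c2:
            total += pr
    return total

for p1, p2 in ((0.2, 0.9), (0.5, 0.5), (0.0, 0.7)):
    assert abs(parallel_success(p1, p2) - by_enumeration(p1, p2)) < 1e-12
```

Under the constraint p1 + p2 = 1 this becomes p = 1 − p1(1 − p1), which is smallest at p1 = 1/2 and largest at the endpoints, matching the extremes part (b) asks about.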