4. (3 points) Let X1, ..., Xn be i.i.d. Bernoulli random variables with parameter p. Is it reasona...
3. (15 points) Let X1 ~ Bernoulli(p) and X2 ~ Bernoulli(3p) be independent Bernoulli random variables, where p ∈ [0, 1/3]. Derive the Maximum Likelihood Estimator (MLE) of p. Denote it by p̂.
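A quick numerical cross-check of this MLE (a sketch, not the analytic derivation the problem asks for): grid-search the joint likelihood over the constrained range p ∈ [0, 1/3] for each of the four possible observations (x1, x2).

```python
# Numerical check of the MLE for X1 ~ Bernoulli(p), X2 ~ Bernoulli(3p), p in [0, 1/3].
# Grid search over the likelihood; the grid resolution is an illustrative choice.

def likelihood(p, x1, x2):
    """Joint likelihood of the observed pair (x1, x2)."""
    return (p ** x1) * ((1 - p) ** (1 - x1)) * ((3 * p) ** x2) * ((1 - 3 * p) ** (1 - x2))

def mle_grid(x1, x2, steps=100_001):
    """Maximize the likelihood over a fine grid on [0, 1/3]."""
    grid = [i * (1 / 3) / (steps - 1) for i in range(steps)]
    return max(grid, key=lambda p: likelihood(p, x1, x2))

# The grid maximizer lands near 0, 1/6, 1/3, 1/3 for the four observation patterns:
# (0,1) is pushed to the boundary 1/3 because 3p(1-p) is increasing on [0, 1/3].
for x1, x2 in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    print((x1, x2), round(mle_grid(x1, x2), 4))
```

Note the constraint matters: for (x1, x2) = (0, 1) the unconstrained maximizer of 3p(1 − p) would be p = 1/2, but the admissible maximum is at the boundary p = 1/3.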
3. Let X and θ be two random variables. Let X given θ have a Bernoulli distribution with parameter θ, that is, X | θ ~ Bernoulli(θ), and let θ have a beta distribution with parameters a and b, that is, θ ~ Beta(a, b), where a and b are known positive constants. (a) Find the joint distribution of (X, θ). (b) Find the marginal distribution of X.
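A simulation sketch of this hierarchy: drawing θ ~ Beta(a, b) and then X | θ ~ Bernoulli(θ), the marginal should satisfy P(X = 1) = E[θ] = a/(a + b). The values a = 2, b = 3 and the simulation size are illustrative choices.

```python
import random

# Monte Carlo check of the Beta-Bernoulli marginal: P(X = 1) = a / (a + b).
random.seed(0)
a, b, n_sims = 2.0, 3.0, 200_000

hits = 0
for _ in range(n_sims):
    theta = random.betavariate(a, b)          # draw the parameter
    x = 1 if random.random() < theta else 0   # then the Bernoulli outcome given theta
    hits += x

p_hat = hits / n_sims
print(p_hat, a / (a + b))   # simulated vs analytic marginal probability
```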
5. Let X1, ..., Xn (n ≥ 3) be i.i.d. Bernoulli random variables with parameter θ, where 0 < θ < 1. Let T = Σi Xi and let δ(X1, ..., Xn) = ... and 0 otherwise. (a) Derive Eθ[δ(X1, ..., Xn)]. (b) Derive Eθ[δ(X1, ..., Xn) | T = t], for t = 0, 1, ..., n.
Q2. Suppose X1, X2, ..., Xn are i.i.d. Bernoulli random variables with probability of success p. It is known that p̂ = (1/n) Σi Xi is an unbiased estimator for p. 1. Find E(p̂²) and show that p̂² is a biased estimator for p². (Hint: make use of the distribution of Σi Xi and the fact that Var(Y) = E(Y²) − E(Y)².) 2. Suggest an unbiased estimator for p². (Hint: use the fact that the sample variance is unbiased for the variance.) 3. ... Σi Xi + 2...
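A simulation sketch of parts 1 and 2 (n = 10, p = 0.3, and the simulation size are illustrative): the theory gives E(p̂²) = p² + p(1 − p)/n, and the sample-variance correction p̂² − S²/n simplifies to T(T − 1)/(n(n − 1)) with T = Σi Xi, which is unbiased for p².

```python
import random

# Bias check: p_hat^2 overshoots p^2 by p(1-p)/n, while T(T-1)/(n(n-1)) is unbiased.
random.seed(1)
n, p, n_sims = 10, 0.3, 200_000

sum_sq, sum_unbiased = 0.0, 0.0
for _ in range(n_sims):
    t = sum(1 for _ in range(n) if random.random() < p)  # T = number of successes
    sum_sq += (t / n) ** 2                               # p_hat^2
    sum_unbiased += t * (t - 1) / (n * (n - 1))          # unbiased candidate for p^2

print(sum_sq / n_sims, p ** 2 + p * (1 - p) / n)   # biased: near p^2 + p(1-p)/n
print(sum_unbiased / n_sims, p ** 2)               # unbiased candidate: near p^2
```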
Let X1 and X2 be independent random variables with distribution functions F1 and F2, respectively. Let Y be a Bernoulli random variable with parameter p. Suppose that Y, X1, and X2 are independent. Prove, using the definition of a distribution function, that the distribution function of Z = Y·X1 + (1 − Y)·X2 is F = p·F1 + (1 − p)·F2. (Do not use moment generating functions or characteristic functions.)
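A Monte Carlo sketch of the identity F_Z = p·F1 + (1 − p)·F2 (it is not the requested proof from the definition). F1 = N(0, 1) and F2 = Exp(1) are illustrative choices; the identity holds for any F1, F2.

```python
import random
from math import erf, exp, sqrt

# Empirical CDF of Z = Y*X1 + (1-Y)*X2 at a single point z0, vs the mixture formula.
random.seed(2)
p, n_sims, z0 = 0.3, 200_000, 0.5

count = 0
for _ in range(n_sims):
    y = 1 if random.random() < p else 0      # Y ~ Bernoulli(p)
    x1 = random.gauss(0.0, 1.0)              # X1 ~ F1 = N(0, 1)
    x2 = random.expovariate(1.0)             # X2 ~ F2 = Exp(1)
    count += (y * x1 + (1 - y) * x2) <= z0   # Z = Y*X1 + (1-Y)*X2

f1 = 0.5 * (1 + erf(z0 / sqrt(2)))           # F1(z0), standard normal CDF
f2 = 1 - exp(-z0)                            # F2(z0), Exp(1) CDF
emp = count / n_sims
print(emp, p * f1 + (1 - p) * f2)            # empirical vs p*F1(z0) + (1-p)*F2(z0)
```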
(2) Given two independent random variables X1 and X2 having a Bernoulli distribution with parameter p = 1/3, let Y1 = 2X1 and Y2 = 2X2. Then: A. E[Y1·Y2] = 2/9  B. E[Y1·Y2] = 4/9  C. P[Y1·Y2 = 0] = 1/9  D. P[Y1·Y2 = 0] = 2/9. (3) Let X and Y be two independent random variables having a Gaussian (normal) distribution with mean 0 and variance equal to 2. Then: A. P[X + Y > 2] > 0.5  B. ...
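Question (2) can be settled by exact enumeration over the four outcomes: E[Y1·Y2] = 4·E[X1]·E[X2] = 4/9 (option B), while P[Y1·Y2 = 0] = 1 − (1/3)² = 8/9, matching neither C nor D.

```python
from itertools import product

# Exact enumeration for question (2): X1, X2 ~ Bernoulli(1/3) independent,
# Y1 = 2*X1, Y2 = 2*X2.
p = 1 / 3
e_y1y2 = 0.0
p_zero = 0.0
for x1, x2 in product([0, 1], repeat=2):
    prob = (p if x1 else 1 - p) * (p if x2 else 1 - p)  # joint probability of (x1, x2)
    y1, y2 = 2 * x1, 2 * x2
    e_y1y2 += prob * y1 * y2
    p_zero += prob * (y1 * y2 == 0)

print(e_y1y2)   # 4/9, so option B
print(p_zero)   # 8/9
```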
Q2. Suppose X1, X2, ..., Xn are i.i.d. Bernoulli random variables with probability of success p. It is known that p̂ = (1/n) Σi Xi is an unbiased estimator for p. 2. Suggest an unbiased estimator for p². (Hint: use the fact that the sample variance is unbiased for the variance.) 3. Show that p̃ = (Σi Xi + 2)/(n + 4) is a biased estimator for p. 4. For what values of p is MSE(p̃) smaller than MSE(p̂)?
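For part 4, both MSEs are available in closed form: MSE(p̂) = p(1 − p)/n, and for p̃ = (T + 2)/(n + 4) with T = Σi Xi, MSE(p̃) = np(1 − p)/(n + 4)² + ((2 − 4p)/(n + 4))². A sketch that scans p for a fixed illustrative n (n = 16) shows the shrinkage estimator winning on an interval around p = 1/2:

```python
# Analytic MSE comparison for p_hat = T/n versus p_tilde = (T + 2)/(n + 4),
# T = sum of n Bernoulli(p) draws; n = 16 is an illustrative sample size.
n = 16

def mse_hat(p):
    return p * (1 - p) / n                   # unbiased, so MSE = variance

def mse_tilde(p):
    var = n * p * (1 - p) / (n + 4) ** 2     # Var((T + 2)/(n + 4))
    bias = (2 - 4 * p) / (n + 4)             # E[p_tilde] - p
    return var + bias ** 2

# Scan p over (0, 1) and record where the shrinkage estimator has smaller MSE.
wins = [k / 100 for k in range(1, 100) if mse_tilde(k / 100) < mse_hat(k / 100)]
print(min(wins), max(wins))                  # an interval centered at p = 1/2
```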
Let X1, ..., Xn be i.i.d. random variables N(μ, σ²), where μ is a known parameter and σ² is the unknown parameter. Let γ(σ²) = σ². (i) Find the CRLB for γ(σ²). (ii) Recall that S² is an unbiased estimator for σ². Compare Var(S²) to the CRLB for γ(σ²).
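A simulation sketch of part (ii): with μ known the CRLB is 2σ⁴/n, while the sample variance S² (which uses X̄ rather than μ) has Var(S²) = 2σ⁴/(n − 1) > CRLB. The values μ = 0, σ² = 4, n = 10 and the simulation size are illustrative.

```python
import random

# Compare the simulated variance of S^2 against the CRLB 2*sigma^4/n.
random.seed(3)
mu, sigma2, n, n_sims = 0.0, 4.0, 10, 100_000
sigma = sigma2 ** 0.5

s2_vals = []
for _ in range(n_sims):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    s2_vals.append(sum((x - xbar) ** 2 for x in xs) / (n - 1))  # sample variance S^2

mean_s2 = sum(s2_vals) / n_sims
var_s2 = sum((v - mean_s2) ** 2 for v in s2_vals) / n_sims
crlb = 2 * sigma2 ** 2 / n
print(mean_s2)           # near sigma^2 = 4, since S^2 is unbiased
print(var_s2, crlb)      # Var(S^2) = 2*sigma^4/(n-1), strictly above the CRLB
```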
7. Let X1, X2, ... be i.i.d. random variables, and let T(t) = min{n : Xn > t}, t ≥ 0. (a) Determine the distribution of T(t). (b) Show that, if p = P(X1 > t) → 0 as t → ∞, then p T(t) → Exp(1) in distribution as t → ∞.
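A sketch of both parts by simulation: part (a) gives T(t) ~ Geometric(p) with p = P(X1 > t), so here T is drawn directly from that geometric law by inversion, and for a small illustrative p the scaled variable p·T should look approximately Exp(1) (mean 1, P(p·T > 1) ≈ e⁻¹).

```python
import random
from math import log, exp

# p*T(t) with T ~ Geometric(p) should approach Exp(1) as p -> 0; p = 0.001 is illustrative.
random.seed(4)
p, n_sims = 0.001, 200_000

vals = []
for _ in range(n_sims):
    u = random.random()
    t_geom = int(log(1 - u) / log(1 - p)) + 1   # Geometric(p) on {1, 2, ...} by inversion
    vals.append(p * t_geom)

mean_pt = sum(vals) / n_sims
tail = sum(v > 1.0 for v in vals) / n_sims
print(mean_pt)           # near 1, the Exp(1) mean
print(tail, exp(-1))     # near exp(-1), the Exp(1) tail P(> 1)
```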
7. Let X1, ..., Xn be a random sample from a Bernoulli distribution with parameter p. A random variable X with a Bernoulli distribution has probability mass function (pmf) p^x (1 − p)^(1−x) for x ∈ {0, 1}, with E(X) = p and Var(X) = p(1 − p). (a) Find the method of moments (MOM) estimator of p. (b) Find a sufficient statistic for p. (Hint: Be careful when you write the joint pmf. Don't forget to sum the whole power of each term; that is, for the second term you will have (1...
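A small sketch of part (a): matching E(X) = p to the first sample moment gives the MOM estimator p̂ = X̄ = Σi Xi / n, which is also the sufficient statistic from part (b) scaled by 1/n. The true p = 0.4 and n = 5000 are illustrative.

```python
import random

# Method-of-moments estimate for Bernoulli(p): the sample mean.
random.seed(5)
p_true, n = 0.4, 5_000
xs = [1 if random.random() < p_true else 0 for _ in range(n)]
p_mom = sum(xs) / n      # first sample moment; sum(xs) is the sufficient statistic
print(p_mom, p_true)
```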