Suppose X1, X2, ..., Xn follow Bernoulli(p), and Y1, Y2, ..., Ym follow Bernoulli(p + q), where 0 < p, q < 0.5. Compute the moment estimators of p and q using first moments.
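A quick numerical sanity check of the first-moment approach: E[X] = p gives p̂ = X̄, and E[Y] = p + q gives q̂ = Ȳ − p̂. This sketch simulates both samples and confirms the estimates land near the true values (the parameter values 0.3 and 0.15 are illustrative, not from the problem).

```python
import random

def moment_estimates(xs, ys):
    """Method-of-moments estimates: E[X] = p gives p_hat = mean(xs);
    E[Y] = p + q gives q_hat = mean(ys) - p_hat."""
    p_hat = sum(xs) / len(xs)
    q_hat = sum(ys) / len(ys) - p_hat
    return p_hat, q_hat

random.seed(0)
p, q, n, m = 0.3, 0.15, 100_000, 100_000   # illustrative values
xs = [1 if random.random() < p else 0 for _ in range(n)]
ys = [1 if random.random() < p + q else 0 for _ in range(m)]
p_hat, q_hat = moment_estimates(xs, ys)
```

With samples this large, both estimates should fall within about one percentage point of the truth.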
Let {x1, x2, ..., xn} be a sample from Bernoulli(p). Find an unbiased estimator for p^2.
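One standard answer is S(S − 1)/(n(n − 1)) with S = Σ xi, since E[S(S − 1)] = n(n − 1)p^2 for S ~ Binomial(n, p). This sketch verifies unbiasedness exactly by enumerating all 2^n outcomes for a small n (the n = 5, p = 0.3 values are illustrative):

```python
from itertools import product

def expectation(estimator, n, p):
    """Exact expectation of an estimator over all 2^n Bernoulli(p) outcomes."""
    total = 0.0
    for xs in product([0, 1], repeat=n):
        s = sum(xs)
        total += p**s * (1 - p)**(n - s) * estimator(xs)
    return total

def est(xs):
    # Candidate unbiased estimator of p^2: S(S-1) / (n(n-1))
    n, s = len(xs), sum(xs)
    return s * (s - 1) / (n * (n - 1))

# expectation(est, 5, 0.3) should equal 0.3**2 = 0.09 exactly (up to float error)
```

The exact enumeration avoids Monte Carlo noise, so equality holds to machine precision.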
Q2 Suppose X1, X2, X3 are independent Bernoulli random variables with p = 0.5. Let Y_i be the partial sums, i.e., Y1 = X1, Y2 = X1 + X2, Y3 = X1 + X2 + X3. 1. What is the distribution of each Y_i, i = 1, 2, 3? 2. What is the expected value of Y1 + Y2 + Y3? 3. Are Y1 and Y2 independent? Explain by computing their joint P.M.F. 4. What is the variance of Y1...
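Since there are only 8 equally likely outcomes, parts 2 and 3 can be checked by brute-force enumeration. The sketch below builds the joint P.M.F. of (Y1, Y2) and shows the dependence: P(Y1 = 1, Y2 = 0) = 0 while P(Y1 = 1)P(Y2 = 0) = 1/8.

```python
from itertools import product
from collections import Counter

joint = Counter()   # joint PMF of (Y1, Y2)
e_sum = 0.0         # E[Y1 + Y2 + Y3]
for x1, x2, x3 in product([0, 1], repeat=3):   # 8 equally likely outcomes
    y1, y2, y3 = x1, x1 + x2, x1 + x2 + x3
    joint[(y1, y2)] += 1 / 8
    e_sum += (y1 + y2 + y3) / 8

# Marginal PMFs of Y1 and Y2 from the joint table
p_y1 = {y: sum(v for (a, _), v in joint.items() if a == y) for y in (0, 1)}
p_y2 = {y: sum(v for (_, b), v in joint.items() if b == y) for y in (0, 1, 2)}

# e_sum == 3.0; joint[(1, 0)] is absent (probability 0), but p_y1[1]*p_y2[0] == 1/8
```

This matches the pencil-and-paper route: Y_i ~ Binomial(i, 0.5), so E[Y1 + Y2 + Y3] = 0.5 + 1 + 1.5 = 3.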
Suppose that X1, X2, ..., Xn and Y1, Y2, ..., Yn are independent random samples from populations with the same mean μ and variances σ1^2 and σ2^2, respectively. That is, X_i ~ N(μ, σ1^2) and Y_i ~ N(μ, σ2^2). Show that (2X̄ + 3Ȳ)/5 is a consistent estimator of μ.
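A sketch of the consistency argument, assuming the garbled estimator is the weighted average (2X̄ + 3Ȳ)/5 (the natural reading, since the weights must sum to 1 for the mean to be μ):

```latex
\hat{\mu} = \frac{2\bar{X} + 3\bar{Y}}{5}, \qquad
E[\hat{\mu}] = \frac{2\mu + 3\mu}{5} = \mu, \qquad
\operatorname{Var}(\hat{\mu})
  = \frac{1}{25}\left(\frac{4\sigma_1^2}{n} + \frac{9\sigma_2^2}{n}\right)
  \xrightarrow{\,n \to \infty\,} 0.
```

Unbiasedness plus vanishing variance gives convergence in probability to μ by Chebyshev's inequality, which is the definition of consistency.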
Q2 Suppose X1, X2, ..., Xn are i.i.d. Bernoulli random variables with probability of success p. It is known that p̂ = (1/n) Σ X_i is an unbiased estimator for p. 1. Find E(p̂^2) and show that p̂^2 is a biased estimator for p^2. (Hint: make use of the distribution of Σ X_i, and the fact that Var(Y) = E(Y^2) − (E(Y))^2.) 2. Suggest an unbiased estimator for p^2. (Hint: use the fact that the sample variance is unbiased for the variance.) 3. Show that p̃ = (Σ X_i + 2)...
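A sketch of how parts 1 and 2 connect (one route, following the hints):

```latex
E[\hat{p}^{\,2}] = \operatorname{Var}(\hat{p}) + \bigl(E[\hat{p}]\bigr)^2
                = \frac{p(1-p)}{n} + p^2 \;\neq\; p^2,
```

so p̂^2 overestimates p^2 by exactly Var(p̂) = p(1 − p)/n. Since the sample variance s^2 is unbiased for the population variance p(1 − p), subtracting the plug-in correction, p̂^2 − s^2/n, yields an unbiased estimator of p^2.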
Consider a random sample (X1, Y1), (X2, Y2), ..., (Xn, Yn) where Y | X = x is modeled by a N(β0 + β1 x, σ^2) distribution, where β0, β1, and σ^2 are unknown. (a) Prove that the MLE of β1 is an unbiased estimator of β1. (b) Prove that the MLE of β0 is an unbiased estimator of β0.
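Under normal errors the MLE of the slope coincides with the least-squares slope; a sketch of the unbiasedness argument for part (a), conditioning on the x values:

```latex
\hat{\beta}_1 = \frac{\sum_i (x_i - \bar{x})\, Y_i}{\sum_i (x_i - \bar{x})^2},
\qquad
E[\hat{\beta}_1 \mid x]
  = \frac{\sum_i (x_i - \bar{x})(\beta_0 + \beta_1 x_i)}{\sum_i (x_i - \bar{x})^2}
  = \beta_1,
```

using Σ(x_i − x̄) = 0 and Σ(x_i − x̄)x_i = Σ(x_i − x̄)^2. Part (b) then follows from β̂0 = Ȳ − β̂1 x̄, whose conditional expectation is β0 + β1 x̄ − β1 x̄ = β0.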
Q2 Suppose X1, X2, ..., Xn are i.i.d. Bernoulli random variables with probability of success p. It is known that p̂ = (1/n) Σ X_i is an unbiased estimator for p. 2. Suggest an unbiased estimator for p^2. (Hint: use the fact that the sample variance is unbiased for the variance.) 3. Show that p̃ = (Σ X_i + 2)/(n + 4) is a biased estimator for p. 4. For what values of p is MSE(p̃) smaller than MSE(p̂)?
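For part 4, both MSEs have closed forms, so they can be compared exactly rather than by simulation. This sketch (with an illustrative n = 10) evaluates MSE(p̂) = p(1 − p)/n and MSE(p̃) = Var(p̃) + bias(p̃)^2 on a grid of p values; it shows the shrunk estimator p̃ wins near p = 0.5 and loses near the endpoints.

```python
def mse_phat(p, n):
    # p_hat = S/n is unbiased, so MSE = Var = p(1-p)/n
    return p * (1 - p) / n

def mse_ptilde(p, n):
    # p_tilde = (S+2)/(n+4): Var = n p(1-p)/(n+4)^2, bias = (2-4p)/(n+4)
    var = n * p * (1 - p) / (n + 4) ** 2
    bias = (2 - 4 * p) / (n + 4)
    return var + bias ** 2

n = 10  # illustrative sample size
better = [round(p, 2) for p in (i / 100 for i in range(1, 100))
          if mse_ptilde(p, n) < mse_phat(p, n)]
# `better` contains the grid points where p_tilde has the smaller MSE,
# an interval around p = 0.5
```

The exact boundary comes from solving MSE(p̃) = MSE(p̂) as a quadratic in p; the grid just makes the interval visible.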
Suppose Y1, Y2, ..., Yn are such that Y_i ~ Bernoulli(p), and let X = Σ_{i=1}^n Y_i. (a) [1 point] Use the distribution of X to show that the method of moments estimator of p is p̂_MM = X/n. (Work that is unclear or that cannot be followed from step to step will not receive full credit.) (b) [2 points] Show that the method of moments estimator p̂_MM is a consistent estimator of p. Please show your work to support your...
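Consistency in part (b) follows from the law of large numbers; this small simulation illustrates the claim by showing the estimation error of p̂_MM = X/n shrinking as n grows (p = 0.4 and the seed are illustrative choices):

```python
import random

def mom_estimate(n, p, seed):
    """p_hat_MM = X/n, the sample mean of n Bernoulli(p) draws."""
    rng = random.Random(seed)
    return sum(rng.random() < p for _ in range(n)) / n

p = 0.4
errors = [abs(mom_estimate(n, p, seed=7) - p)
          for n in (100, 10_000, 1_000_000)]
# errors should shrink roughly like 1/sqrt(n)
```

A simulation is of course not a proof; the written argument should invoke E[X/n] = p and Var(X/n) = p(1 − p)/n → 0.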
(2) Given two independent variables X1 and X2 having Bernoulli distribution with parameter p = 1/3, let Y1 = 2X1 and Y2 = 2X2. Then: A. E[Y1 · Y2] = 2/9 B. E[Y1 · Y2] = 4/9 C. P[Y1 · Y2 = 0] = 1/9 D. P[Y1 · Y2 = 0] = 2/9 (3) Let X and Y be two independent random variables having Gaussian (normal) distribution with mean 0 and variance 2. Then: A. P[X + Y > 2] > 0.5 B. ...
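Problem (2) involves only four outcomes, so the quantities in the options can be checked by exhaustive enumeration rather than by the intended pencil-and-paper route:

```python
from itertools import product

p = 1 / 3
e, p_zero = 0.0, 0.0
for x1, x2 in product([0, 1], repeat=2):
    prob = (p if x1 else 1 - p) * (p if x2 else 1 - p)  # independence
    y1, y2 = 2 * x1, 2 * x2
    e += prob * y1 * y2          # only x1 = x2 = 1 contributes: 4 * (1/9)
    if y1 * y2 == 0:
        p_zero += prob           # complement of x1 = x2 = 1
# e == 4/9 and p_zero == 1 - 1/9 == 8/9
```

By independence E[Y1 · Y2] = 4 E[X1]E[X2] = 4/9, matching option B; the enumeration computes P[Y1 · Y2 = 0] as well.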
Suppose that X1, X2, ..., Xn is an i.i.d. sample, each with probability p of being distributed as uniform over (−1/2, 1/2) and with probability 1 − p of being distributed as uniform over ... (a) Find the cumulative distribution function (CDF) and the probability density function (PDF) of X1. (b) Find the maximum likelihood estimator (MLE) of p. (c) Find another estimator of p using the method of moments (MOM).
Problem 3. Given X1, ..., Xn ~ Bernoulli(p) and Y1, ..., Ym ~ Bernoulli(q), find the plug-in estimator and estimated standard error for p, and the plug-in estimator for p − q. Conclude by finding the estimated standard error SE for p − q.
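A small helper capturing the plug-in formulas (a sketch: by independence of the two samples, SE(p̂ − q̂) = sqrt(p̂(1 − p̂)/n + q̂(1 − q̂)/m)):

```python
import math

def plugin_diff_se(xs, ys):
    """Plug-in estimate of p - q and its estimated standard error.
    Variances of independent sample means add:
    SE = sqrt(p_hat(1-p_hat)/n + q_hat(1-q_hat)/m)."""
    n, m = len(xs), len(ys)
    p_hat = sum(xs) / n
    q_hat = sum(ys) / m
    se = math.sqrt(p_hat * (1 - p_hat) / n + q_hat * (1 - q_hat) / m)
    return p_hat - q_hat, se

# Example: p_hat = 0.5, q_hat = 0.25, so diff = 0.25
diff, se = plugin_diff_se([1, 1, 0, 0], [1, 0, 0, 0])
```

The single-sample SE for p asked first in the problem is the special case sqrt(p̂(1 − p̂)/n).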