3. Consider n i.i.d. r.v.s. X1, ..., Xn, where Xi ∼ Bin(p). Show that the conditional PMF of...
6.4-2. Let X1, X2, ..., Xn be i.i.d. ∼ N(µ, σ²). Assume µ is known; show that θ̂ = (1/n) Σ_{i=1}^n (Xi − µ)² is the MLE for σ², and show that it is unbiased.
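Not part of the original exercise, but the unbiasedness claim in 6.4-2 is easy to sanity-check by Monte Carlo. All parameters below (µ = 2, σ² = 4, n = 50, the number of replications) are arbitrary choices for the simulation:

```python
import random

# Monte Carlo check that theta_hat = (1/n) * sum((Xi - mu)^2) averages to sigma^2.
# mu, sigma2, n, reps are arbitrary simulation choices, not from the problem.
random.seed(0)
mu, sigma2 = 2.0, 4.0
n, reps = 50, 20_000

total = 0.0
for _ in range(reps):
    xs = [random.gauss(mu, sigma2 ** 0.5) for _ in range(n)]
    total += sum((x - mu) ** 2 for x in xs) / n   # theta_hat for this sample

mean_theta_hat = total / reps   # should land close to sigma2 = 4.0
```

Averaging θ̂ over many independent samples approximates E(θ̂); an unbiased estimator should put this average near σ².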
a) Consider a random sample {X1, X2, ..., Xn} of X from a uniform distribution over [0, θ], where 0 < θ < ∞ and θ is unknown. Is X̄ = (1/n) Σ_{i=1}^n Xi an unbiased estimator for θ? Please justify your answer.
b) Consider a random sample {X1, X2, ..., Xn} of X from N(µ, σ²), where µ and σ² are unknown. Show that X̄² + S² is an unbiased estimator for µ² + σ², where X̄ = (1/n) Σ_{i=1}^n Xi and S² = (1/n) Σ_{i=1}^n (Xi − X̄)².
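For part b), note that S² here uses divisor n (not n − 1); that is exactly what makes X̄² + S² unbiased for µ² + σ², since E(X̄²) = µ² + σ²/n and E(S²) = (n − 1)σ²/n. A Monte Carlo sanity check (µ, σ, n, and the replication count are arbitrary choices, not from the problem):

```python
import random

# Monte Carlo check that Xbar^2 + S^2 (with S^2 using divisor n) averages
# to mu^2 + sigma^2.  mu, sigma, n, reps are arbitrary simulation choices.
random.seed(1)
mu, sigma = 1.5, 2.0
n, reps = 30, 20_000

total = 0.0
for _ in range(reps):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    s2 = sum((x - xbar) ** 2 for x in xs) / n   # divisor n, as in the problem
    total += xbar ** 2 + s2

mean_est = total / reps
target = mu ** 2 + sigma ** 2   # 1.5^2 + 2.0^2 = 6.25
```

With divisor n − 1 instead, the average would drift up to µ² + σ² + σ²/n.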
Q2 Suppose X1, X2, ..., Xn are i.i.d. Bernoulli random variables with probability of success p. It is known that p̂ = (1/n) Σ_{i=1}^n Xi is an unbiased estimator for p.
1. Find E(p̂²) and show that p̂² is a biased estimator for p². (Hint: make use of the distribution of Σ Xi, and the fact that Var(Y) = E(Y²) − E(Y)².)
2. Suggest an unbiased estimator for p². (Hint: use the fact that the sample variance is unbiased for the variance.)
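A numerical illustration of both parts (not part of the original question; p, n, and the replication count are arbitrary choices). Part 1 gives E(p̂²) = p² + p(1 − p)/n, and following the part-2 hint, subtracting S²/n with S² = n p̂(1 − p̂)/(n − 1) (the unbiased sample variance for Bernoulli data) removes the bias:

```python
import random

# Monte Carlo illustration: E[phat^2] = p^2 + p(1-p)/n (biased for p^2), while
# phat^2 - S^2/n = phat^2 - phat*(1-phat)/(n-1) is unbiased for p^2.
# p, n, reps are arbitrary simulation choices.
random.seed(2)
p, n, reps = 0.3, 10, 50_000

sum_naive = sum_corr = 0.0
for _ in range(reps):
    phat = sum(1 if random.random() < p else 0 for _ in range(n)) / n
    sum_naive += phat ** 2
    sum_corr += phat ** 2 - phat * (1 - phat) / (n - 1)

mean_naive = sum_naive / reps   # near p^2 + p(1-p)/n = 0.09 + 0.021 = 0.111
mean_corr = sum_corr / reps     # near p^2 = 0.09
```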
The two questions that follow concern the following variant of the Bernoulli process: Fix k ≥ 1. At each (integer) time n ≥ 1 the process takes the value Xn, where the Xn are i.i.d. random variables, each with the uniform distribution on {1, 2, ..., k}.
(a) What is the PMF of the random variable N defined as the smallest N ≥ 2 such that XN = X1?
(b) Is N a stopping time?
(c) What is the probability that XN+1...
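A hint for part (a) that can be checked by simulation (not part of the original question): conditional on X1, each later draw matches X1 with probability 1/k, which suggests N − 1 is geometric with success probability 1/k and hence E(N) = k + 1. The sketch below checks that implied mean; k and the replication count are arbitrary choices:

```python
import random

# Simulate N = min{n >= 2 : Xn = X1} for i.i.d. uniform draws on {1, ..., k}.
# If N - 1 is geometric(1/k), then E[N] = k + 1.
# k and reps are arbitrary simulation choices.
random.seed(3)
k, reps = 6, 20_000

total = 0
for _ in range(reps):
    x1 = random.randint(1, k)
    n = 2
    while random.randint(1, k) != x1:
        n += 1
    total += n

mean_N = total / reps   # should be near k + 1 = 7
```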
Let X1, X2, ..., Xn be an i.i.d. sample from Bernoulli(p) and let Yn = .... Show that Yn converges to a degenerate distribution at 0 as n → ∞.
Q3 Suppose X1, X2, ..., Xn are i.i.d. Poisson random variables with expected value λ. It is well known that X̄ is an unbiased estimator for λ because λ = E(X).
1. Show that (X1 + Xn)/2 is also an unbiased estimator for λ.
2. Show that S² = (1/(n−1)) Σ_{i=1}^n (Xi − X̄)² is also an unbiased estimator for λ.
3. Find MSE(S²). (We will need two facts.)
Fact 1: Var(S²) = (1/n)(µ₄ − ((n−3)/(n−1)) σ⁴). (See ...com/questions/2476527/variance-of-sample-variance.)
Fact 2: For the Poisson distribution, E[(X − µ)⁴] = 3λ² + λ.
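Since S² is unbiased (part 2), MSE(S²) = Var(S²), which Facts 1 and 2 let you evaluate in closed form. A Monte Carlo sanity check of both facts combined (not part of the original question; λ, n, and the replication count are arbitrary choices, and the Poisson sampler is Knuth's classic method):

```python
import math
import random

# Monte Carlo check that S^2 is unbiased for lambda and that Var(S^2) matches
# Fact 1 with mu4 taken from Fact 2.  lam, n, reps are arbitrary choices.
random.seed(4)

def poisson(lam):
    # Knuth's multiplication method; fine for small lam
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

lam, n, reps = 2.0, 20, 40_000
s2s = []
for _ in range(reps):
    xs = [poisson(lam) for _ in range(n)]
    xbar = sum(xs) / n
    s2s.append(sum((x - xbar) ** 2 for x in xs) / (n - 1))

mean_s2 = sum(s2s) / reps                                # near lam = 2.0
var_s2 = sum((s - mean_s2) ** 2 for s in s2s) / reps
mu4 = 3 * lam ** 2 + lam                                 # Fact 2
var_formula = (mu4 - (n - 3) / (n - 1) * lam ** 2) / n   # Fact 1 (sigma^2 = lam)
```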
3. Assume X1, X2, ... are a sequence of i.i.d. random variables having finite first moment, that is, ν = E(|Xi|) < ∞. Let Yn = (|X1| + ... + |Xn|)/n.
(a) Show that Yn → ν in probability.
(b) Show that E(Yn) → ν.
(c) Show that E(|X̄n − µ|) → 0, where µ = E(X1).
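Part (a) is the weak law of large numbers applied to the i.i.d. variables |Xi|. A quick illustration (not part of the original question): taking Xi ∼ N(0, 1), for which ν = E|X1| = √(2/π) ≈ 0.798, the error |Yn − ν| should be small for large n. The choice of distribution and sample sizes is arbitrary:

```python
import math
import random

# Illustrate Yn = (|X1| + ... + |Xn|)/n -> nu = E|X1| for Xi ~ N(0, 1),
# where nu = sqrt(2/pi).  The N(0,1) choice is an arbitrary example.
random.seed(5)
nu = math.sqrt(2 / math.pi)

errors = []
for n in (100, 10_000):
    yn = sum(abs(random.gauss(0, 1)) for _ in range(n)) / n
    errors.append(abs(yn - nu))   # typically shrinks as n grows
```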
4. Let X1, ..., Xn be i.i.d. discrete uniform random variables with common pmf f(x; θ) = 1/θ for x ∈ {1, 2, ..., θ}, with the parameter space for θ being {1, 2, ...}. Let T = max(X1, ..., Xn).
(a) Derive the distribution of T. (Hint: use the cdf approach.)
(b) Give the conditional distribution of X1, ..., Xn given T.
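Following the cdf hint in (a): P(T ≤ t) = (t/θ)ⁿ by independence, so P(T = t) = (tⁿ − (t−1)ⁿ)/θⁿ for t ∈ {1, ..., θ}. A simulation check of that pmf (not part of the original question; θ, n, and the replication count are arbitrary choices):

```python
import random

# Empirically check P(T = t) = (t^n - (t-1)^n) / theta^n for
# T = max of n i.i.d. uniforms on {1, ..., theta}.
# theta, n, reps are arbitrary simulation choices.
random.seed(6)
theta, n, reps = 5, 3, 50_000

counts = {t: 0 for t in range(1, theta + 1)}
for _ in range(reps):
    counts[max(random.randint(1, theta) for _ in range(n))] += 1

# largest gap between the empirical and the derived pmf
max_gap = max(
    abs(counts[t] / reps - (t ** n - (t - 1) ** n) / theta ** n)
    for t in range(1, theta + 1)
)
```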
Suppose X1, X2, ..., Xn (n ≥ 5) are i.i.d. Exp(µ) with the density f(x) = (1/µ) e^(−x/µ) for x > 0.
(a) Let µ̂1 = X̄. Show µ̂1 is a minimum variance unbiased estimator.
(b) Let µ̂2 = (X1 + X2)/2. Show µ̂2 is unbiased. Calculate Var(µ̂2). Confirm Var(µ̂1) < Var(µ̂2). Calculate the efficiency of µ̂2 relative to µ̂1.
(c) Show X̄ is consistent and sufficient.
(d) Show µ̂2 is not consistent...
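For part (b): since Var(Xi) = µ² here, Var(µ̂1) = µ²/n and Var(µ̂2) = µ²/2, so the efficiency of µ̂2 relative to µ̂1 is (µ²/n)/(µ²/2) = 2/n. A simulation check (not part of the original question; µ, n, and the replication count are arbitrary choices):

```python
import random

# Compare Var(mu1_hat) = mu^2/n with Var(mu2_hat) = mu^2/2 by simulation.
# mu, n, reps are arbitrary choices; expovariate takes the rate 1/mu.
random.seed(7)
mu, n, reps = 3.0, 10, 40_000

e1, e2 = [], []
for _ in range(reps):
    xs = [random.expovariate(1 / mu) for _ in range(n)]
    e1.append(sum(xs) / n)          # mu1_hat = sample mean
    e2.append((xs[0] + xs[1]) / 2)  # mu2_hat

def var(vals):
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)

var1, var2 = var(e1), var(e2)   # theory: mu^2/n = 0.9 and mu^2/2 = 4.5
```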
7. Let X1, X2, ... be i.i.d. random variables.
(a) Show that max(X1, ..., Xn)/n → 0 in probability if nP(X1 > n) → 0.
(b) Find a random variable Y satisfying nP(Y > n) → 0 and E(Y) = ∞.