For each n, let Xn be a binomial random variable with n trials and probability of success p, and let Yn = Xn/n. (a) Use the Weak Law of Large Numbers to show that Yn is a consistent estimator of p. (b) Explain why it follows from (a) that Yn(1 − Yn) is a consistent estimator of p(1 − p), and that √(Yn(1 − Yn)/n) is a consistent estimator of √(p(1 − p)/n).
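A quick way to see what consistency means here is to simulate: as n grows, the probability that Yn = Xn/n lands within a fixed ε of p should approach 1. A minimal sketch, where p = 0.3, ε = 0.02, and the two sample sizes are illustrative assumptions rather than part of the exercise:

```python
# Monte Carlo illustration of WLLN-style consistency for Yn = Xn / n.
# p, eps, and the sample sizes are arbitrary demo choices.
import numpy as np

rng = np.random.default_rng(0)
p = 0.3

def freq_within_eps(n, eps=0.02, reps=2000):
    """Estimate P(|Yn - p| < eps) by Monte Carlo."""
    yn = rng.binomial(n, p, size=reps) / n
    return np.mean(np.abs(yn - p) < eps)

small, large = freq_within_eps(50), freq_within_eps(5000)
print(small, large)  # the coverage probability rises toward 1 as n grows
```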
3. Let X1, X2, ..., Xn be independent samples of a random variable with the probability density function (PDF) fX(x) = θ(x − 1/2) + 1 for 0 ≤ x ≤ 1, and 0 otherwise, where θ ∈ [−2, 2] is an unknown parameter. We define the estimator θ̂n = 12X̄ − 6 (where X̄ is the sample mean) to estimate θ. (a) Is θ̂n an unbiased estimator of θ? (b) Is θ̂n a consistent estimator of θ? (c) Find the mean squared...
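The unbiasedness claim in (a) can be checked numerically: since E[X] = θ/12 + 1/2 under this density, θ̂n = 12X̄ − 6 should center on θ. The sketch below draws from f by rejection sampling; the value θ = 1 and the sample size are demo assumptions, not part of the problem:

```python
# Monte Carlo check that theta_hat = 12 * Xbar - 6 centers on theta,
# for f(x) = theta * (x - 1/2) + 1 on [0, 1]. theta = 1 is a demo choice.
import numpy as np

rng = np.random.default_rng(1)
theta = 1.0
n = 200_000

def sample(n):
    """Rejection sampler: f is bounded above by M = 1 + |theta|/2 on [0, 1]."""
    M = 1 + abs(theta) / 2
    out = np.empty(0)
    while out.size < n:
        x = rng.uniform(0, 1, size=2 * n)
        u = rng.uniform(0, M, size=2 * n)
        out = np.concatenate([out, x[u < theta * (x - 0.5) + 1]])
    return out[:n]

x = sample(n)
theta_hat = 12 * x.mean() - 6
print(theta_hat)  # close to theta = 1
```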
Please give detailed steps. Thank you. 5. Let {X1, X2, ..., Xn} denote a random sample of size n from a population described by a random variable X. Denote the population mean of X by E(X) = μ and its variance by σ². Consider the following four estimators of the population mean μ: ... (this is an example of an average using only part of the sample: the last 3 observations) ... (this is an example of a weighted average)...
QUESTION 2. Let X1, ..., Xn be a random sample from a N(μ, σ²) distribution, and let S² = Σ(Xi − X̄)²/(n − 1) and S̃² = ((n − 1)/n)·S² be two estimators of σ². Given: E(S²) = σ² and V(S²) = 2σ⁴/(n − 1). (a) Determine: (i) E(S̃²); (ii) V(S̃²); and (iii) MSE(S̃²). (b) Which of S² and S̃² has the larger mean squared error? (c) Suppose that θ̂n is an estimator of θ based on a random sample of size n. Another equivalent definition of...
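Part (b) has a well-known punchline: dividing by n instead of n − 1 introduces bias but reduces variance enough to lower the MSE. A simulation sketch, where σ² = 4, n = 10, and the replication count are assumed demo values:

```python
# Compare MSE of S^2 (divisor n-1, ddof=1) and S~^2 = ((n-1)/n) S^2
# (divisor n, ddof=0) on normal data. sigma^2 = 4, n = 10 are demo choices.
import numpy as np

rng = np.random.default_rng(2)
n, sigma2, reps = 10, 4.0, 200_000

x = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
s2 = x.var(axis=1, ddof=1)        # S^2, unbiased for sigma^2
s2_tilde = x.var(axis=1, ddof=0)  # S~^2, biased downward

mse_s2 = np.mean((s2 - sigma2) ** 2)
mse_tilde = np.mean((s2_tilde - sigma2) ** 2)
print(mse_s2, mse_tilde)  # the biased S~^2 attains the smaller MSE
```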
Q2. Suppose X1, X2, ..., Xn are i.i.d. Bernoulli random variables with probability of success p. 1. It is known that p̂ = (ΣXi)/n is an unbiased estimator for p. 2. Suggest an unbiased estimator for p². (Hint: use the fact that the sample variance is unbiased for the variance.) 3. Show that p̃ = (ΣXi + 2)/(n + 4) is a biased estimator for p. 4. For what values of p is MSE(p̃) smaller than MSE(p̂)?
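For part 4, a simulation makes the trade-off visible: p̃ shrinks toward 1/2, so it beats p̂ near p = 1/2 and loses near the boundary. A sketch in which n = 20 and the two comparison points are demo assumptions:

```python
# MSE comparison of p_hat = S/n and p_tilde = (S + 2)/(n + 4), S = sum(X_i).
import numpy as np

rng = np.random.default_rng(3)
n, reps = 20, 200_000

def mses(p):
    s = rng.binomial(n, p, size=reps)           # S = sum of the Bernoullis
    p_hat, p_tilde = s / n, (s + 2) / (n + 4)
    return np.mean((p_hat - p) ** 2), np.mean((p_tilde - p) ** 2)

mid = mses(0.5)    # near p = 1/2 the shrunken p_tilde wins
edge = mses(0.95)  # near the boundary the unbiased p_hat wins
print(mid, edge)
```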
HOMEWORK 1. Exercise 20. (Rossi 4.1.1-2) (a) Let X1, ..., Xn be a sample of iid U(0, θ) random variables, and let T = 2X̄ be an estimator of θ. Determine each of the following: (i) ...; (ii) Bias(T; θ); (iii) MSE(T; θ); (iv) whether T is an MSE-consistent estimator of θ. (b) Let X1, ..., Xn be a sample of iid Gamma(4, θ) random variables, and let T = X̄ be an estimator of θ. Determine each of the following: (i) ...; (ii) Bias(T; θ); (iii) MSE(T; θ); (iv)...
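For part (a), T = 2X̄ is unbiased under U(0, θ), and its MSE equals its variance θ²/(3n), which vanishes as n → ∞ (MSE-consistency). A simulation sketch, with θ = 2 and the two sample sizes as demo assumptions:

```python
# MSE of T = 2 * Xbar for U(0, theta): should track theta^2 / (3n).
import numpy as np

rng = np.random.default_rng(4)
theta = 2.0

def mse_T(n, reps=40_000):
    x = rng.uniform(0, theta, size=(reps, n))
    T = 2 * x.mean(axis=1)
    return np.mean((T - theta) ** 2)

m_small, m_large = mse_T(10), mse_T(250)
print(m_small, m_large)  # near theta^2/(3n): about 0.133 and 0.0053
```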
Let X be a random variable with probability density function (pdf) given by fX(x | θ) = ..., and 0 elsewhere, where θ > 0 is an unknown parameter. (a) Find the cumulative distribution function (cdf) of the random variable Y = ... and identify the distribution. Let X1, X2, ..., Xn be a random sample of size n > 2 from fX(x | θ). (b) Find the maximum likelihood estimator, θ̂mle, for θ. (c) Find the Uniform Minimum Variance Unbiased Estimator (UMVUE), θ̂umvue, for θ...
8. Let X1, ..., Xn denote a random sample of size n from an exponential distribution with density function fX(x) = (1/θ)e^(−x/θ), x > 0. (a) Show that θ̂1 = nY(1) is an unbiased estimator for θ and find MSE(θ̂1). (Hint: what is the distribution of Y(1) = min{X1, ..., Xn}?) (b) Show that θ̂2 = X̄ is an unbiased estimator for θ and find MSE(θ̂2). (c) Find the efficiency of θ̂1 relative to θ̂2. Which estimator is "better" (i.e., more efficient)?
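Both estimators are unbiased, but their precision differs drastically: nY(1) is exponential with mean θ for every n, so its MSE stays at θ², while MSE(X̄) = θ²/n. A simulation sketch, with θ = 3 and n = 25 as demo assumptions:

```python
# Compare the unbiased estimators t1 = n * min(X) and t2 = Xbar
# for exponential data with mean theta. theta = 3, n = 25 are demo choices.
import numpy as np

rng = np.random.default_rng(5)
theta, n, reps = 3.0, 25, 200_000

x = rng.exponential(theta, size=(reps, n))   # numpy parameterizes by the mean
t1 = n * x.min(axis=1)                       # n * Y(1)
t2 = x.mean(axis=1)                          # sample mean

print(t1.mean(), t2.mean())                  # both near theta = 3 (unbiased)
print(np.mean((t1 - theta) ** 2), np.mean((t2 - theta) ** 2))
```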
Please solve 6.
4. Let X1, X2, ..., Xn be a ... Show that (∏ Xi)^(1/n) is a consistent estimator for θ/e and is BAN. [Show that n(θ − X(n)) converges in distribution to Gamma(1, θ).]
5. In Problem 2, show that T(X) = X(n) is asymptotically biased for θ.
6. In Problem 5, consider the class of estimators Tc(X) = cX(n), c > 0. Show that the estimator T0(X) = (n + 2)X(n)/(n + 1) in this class has the least MSE.
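For problem 6, assuming (as the form of the estimators suggests) that the sample is U(0, θ), the moments E[X(n)^k] = nθ^k/(n + k) make MSE(c) an exact quadratic in c, and a grid search recovers the claimed minimizer c* = (n + 2)/(n + 1). The values θ = 1 and n = 10 below are demo assumptions:

```python
# MSE(c) for the estimator c * X(n) under U(0, theta), minimized over c.
import numpy as np

theta, n = 1.0, 10
m1 = n * theta / (n + 1)        # E[X(n)]   for U(0, theta)
m2 = n * theta**2 / (n + 2)     # E[X(n)^2] for U(0, theta)

def mse(c):
    # E[(c * X(n) - theta)^2] expanded via the two moments above
    return c**2 * m2 - 2 * c * theta * m1 + theta**2

cs = np.linspace(0.8, 1.4, 6001)
c_star = cs[np.argmin(mse(cs))]
print(c_star, (n + 2) / (n + 1))  # both approximately 1.0909
```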
Let X1, . . . , Xn be independent Beta(θ, 1) random variables with parameter θ > 0. (1) Find the Bayes estimator of θ for a Gamma(α, β) prior. (2) Find the MSE of the Bayes estimator.
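Under Beta(θ, 1) the density is θx^(θ−1), so −ln Xi ~ Exponential(rate θ), and with a Gamma(α, β) prior in the shape–rate parameterization (an assumption; check your course's convention) the posterior is Gamma(α + n, β + Σ(−ln Xi)). Under squared-error loss the Bayes estimator is the posterior mean, sketched below with demo values α = 2, β = 3, θ = 1.5:

```python
# Posterior-mean Bayes estimator for theta under a Gamma(shape, rate) prior.
# alpha, beta, theta, n are demo choices, not part of the problem.
import numpy as np

rng = np.random.default_rng(6)
alpha, beta, theta, n = 2.0, 3.0, 1.5, 500

x = rng.beta(theta, 1.0, size=n)   # Beta(theta, 1) sample
s = -np.log(x).sum()               # sufficient statistic, ~ Gamma(n, rate theta)
bayes = (alpha + n) / (beta + s)   # posterior mean of Gamma(alpha + n, beta + s)
print(bayes)                       # concentrates near theta as n grows
```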