Note: only Questions 4, 5 and 6. Problem 1. Let X1, ..., Xn be an i.i.d. random...
2. Suppose X1, X2, ..., Xn are i.i.d. random variables taking values in [0, 1] with density f(x) = Γ(2α)/Γ(α)² · x^(α−1)(1−x)^(α−1), where α > 0 is the parameter of the distribution (the symmetric Beta(α, α) density). It is known that E(X²) = (α + 1)/(2(2α + 1)). Compute the method of moments estimator for α.
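Reading the garbled density as the symmetric Beta(α, α) (an assumption; the function names and the choice α = 3 below are mine), a quick numerical check of the second-moment formula and the resulting method-of-moments solution:

```python
import math

def beta_aa_pdf(x, a):
    # Symmetric Beta(a, a) density: Gamma(2a)/Gamma(a)^2 * x^(a-1) * (1-x)^(a-1)
    return math.gamma(2 * a) / math.gamma(a) ** 2 * x ** (a - 1) * (1 - x) ** (a - 1)

def second_moment(a, steps=100_000):
    # Midpoint-rule approximation of E(X^2) = integral of x^2 * f(x) over (0, 1)
    h = 1.0 / steps
    return sum(((i + 0.5) * h) ** 2 * beta_aa_pdf((i + 0.5) * h, a) * h
               for i in range(steps))

def mom_alpha(m2):
    # Solving m2 = (a + 1) / (2(2a + 1)) for a gives the MoM map from the
    # empirical second moment to the estimate of alpha
    return (1 - 2 * m2) / (4 * m2 - 1)
```

For α = 3 the formula gives E(X²) = 4/14 = 2/7, and feeding that value back through `mom_alpha` recovers α = 3; in practice one plugs in the sample second moment (1/n)ΣXi².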
Q2 Suppose X1, X2, ..., Xn are i.i.d. Bernoulli random variables with probability of success p. It is known that p̂ = (1/n) Σ Xi is an unbiased estimator for p.
1. Find E(p̂²) and show that p̂² is a biased estimator for p². (Hint: make use of the distribution of Σ Xi and the fact that Var(Y) = E(Y²) − E(Y)².)
2. Suggest an unbiased estimator for p². (Hint: use the fact that the sample variance is unbiased for the variance.)
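The bias in part 1 and the correction in part 2 can be checked exactly, with no simulation, by summing over the Binomial(n, p) distribution of Σ Xi (function names are mine): the theory says E(p̂²) = p² + p(1−p)/n, and subtracting S²/n = p̂(1−p̂)/(n−1) removes the bias.

```python
from math import comb

def e_phat_sq(n, p):
    # Exact E[p_hat^2]: sum over Binomial(n, p) values k of (k/n)^2 * P(sum = k)
    return sum((k / n) ** 2 * comb(n, k) * p ** k * (1 - p) ** (n - k)
               for k in range(n + 1))

def e_corrected(n, p):
    # Exact E[p_hat^2 - p_hat(1 - p_hat)/(n - 1)]; the subtracted term is S^2/n,
    # since S^2 = n * p_hat * (1 - p_hat) / (n - 1) for 0/1 data
    return sum(((k / n) ** 2 - (k / n) * (1 - k / n) / (n - 1))
               * comb(n, k) * p ** k * (1 - p) ** (n - k)
               for k in range(n + 1))
```

With n = 10 and p = 0.3, `e_phat_sq` returns p² + p(1−p)/n (biased upward by Var(p̂)), while `e_corrected` returns p² exactly.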
Q3 Suppose X1, X2, ..., Xn are i.i.d. Poisson random variables with expected value λ. It is well known that X̄ is an unbiased estimator for λ because λ = E(X).
1. Show that (X1 + Xn)/2 is also an unbiased estimator for λ.
2. Show that S² = Σ (Xi − X̄)² / (n − 1) is also an unbiased estimator for λ.
3. Find MSE(S²). (We will need two facts.)
Fact 1: the variance of the sample variance, Var(S²) = μ₄/n − σ⁴(n − 3)/(n(n − 1)). (See .../questions/2476527/variance-of-sample-variance.)
Fact 2: for the Poisson distribution, E[(X − μ)⁴] = 3λ² + λ. (See for...)
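A simulation sanity check for part 3 (numpy-based sketch; the function names, seed, and example values λ = 2, n = 10 are mine): since S² is unbiased for λ, MSE(S²) = Var(S²), and the two facts combine into a closed form that the Monte Carlo estimate should match.

```python
import numpy as np

def mse_s2_theory(lam, n):
    mu4 = 3 * lam**2 + lam               # Fact 2: Poisson fourth central moment
    sigma4 = lam**2                      # sigma^2 = lam for Poisson, so sigma^4 = lam^2
    return mu4 / n - sigma4 * (n - 3) / (n * (n - 1))  # Fact 1: Var(S^2)

def mse_s2_sim(lam, n, reps=200_000, seed=0):
    # Draw many samples of size n, compute S^2 for each, average (S^2 - lam)^2
    rng = np.random.default_rng(seed)
    x = rng.poisson(lam, size=(reps, n))
    s2 = x.var(axis=1, ddof=1)           # sample variance with n-1 denominator
    return float(np.mean((s2 - lam) ** 2))
```

The simulated MSE should agree with the formula to within Monte Carlo noise (a few percent at 200,000 replications).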
Answer the following questions:
a. Let X1, X2, ..., Xn be i.i.d. random vectors (a random sample) from Np(μ1, Σ), where 1 is the p-vector of ones. Find the distribution of X̄. Note: X̄ = (1/n) Σ Xi.
b. Refer to question (a). Consider the following two random variables: Q1 = 1′X̄ / 1′1 and Q2 = 1′Σ⁻¹X̄ / 1′Σ⁻¹1. Find the mean and variance of Q1 and Q2.
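For part (b), a numerical sketch under my derivation (the example Σ and n are mine): X̄ ~ Np(μ1, Σ/n), both Q1 and Q2 are linear forms a′X̄ with a′1 = 1, so both have mean μ and variance a′Σa/n. That gives Var(Q1) = 1′Σ1/(n(1′1)²) and Var(Q2) = 1/(n · 1′Σ⁻¹1), and since Q2 weights by Σ⁻¹ (GLS-style), Var(Q2) ≤ Var(Q1).

```python
import numpy as np

def q_variances(Sigma, n):
    # Xbar ~ N_p(mu*1, Sigma/n); for a linear form a'Xbar, Var = a' Sigma a / n
    p = Sigma.shape[0]
    one = np.ones(p)
    var_q1 = float(one @ Sigma @ one) / (n * p**2)  # Q1 = 1'Xbar / 1'1, with 1'1 = p
    var_q2 = 1.0 / (n * float(one @ np.linalg.solve(Sigma, one)))  # Q2 uses Sigma^-1 weights
    return var_q1, var_q2
```

For Σ = [[2, 0.5], [0.5, 1]] and n = 5 this gives Var(Q1) = 4/20 = 0.2 and Var(Q2) = 7/40 = 0.175, illustrating the ordering.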
Let X1, X2, ..., Xn be an independent and identically distributed (i.i.d.) random sample from a Beta distribution with parameters α = 2 and β = 1, i.e., with probability density function fX(x) = 2x for x ∈ (0, 1). Find the probability density functions of the first and last order statistics Y1 and Yn.
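A numerical check of the answers (a sketch; function names and the choice n = 4 are mine): with f(x) = 2x and F(x) = x², the standard order-statistic formulas give fY1(y) = n(1 − y²)^(n−1) · 2y and fYn(y) = n(y²)^(n−1) · 2y = 2n·y^(2n−1), and each should integrate to 1 over (0, 1).

```python
def f_y1(y, n):
    # pdf of the minimum: n * (1 - F(y))^(n-1) * f(y), with F(y) = y^2, f(y) = 2y
    return n * (1 - y**2) ** (n - 1) * 2 * y

def f_yn(y, n):
    # pdf of the maximum: n * F(y)^(n-1) * f(y) = 2n * y^(2n - 1)
    return 2 * n * y ** (2 * n - 1)

def integral_01(f, steps=100_000):
    # Midpoint-rule integral over (0, 1), used to confirm each pdf integrates to 1
    h = 1.0 / steps
    return sum(f((i + 0.5) * h) for i in range(steps)) * h
```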
Suppose that the random variables X1, ..., Xn are i.i.d. random variables, each uniform on the interval [0, 1]. Let Y1 = min(X1, ..., Xn) and Yn = max(X1, ..., Xn).
a. Show that FY1(y) = P(Y1 ≤ y) = 1 − (1 − FX(y))^n.
b. Show that FYn(y) = P(Yn ≤ y) = (FX(y))^n.
c. Using the results from (a) and (b) and the fact that FX(y) = y by the properties of the uniform distribution on [0, 1], find E[Y1] and E[Yn].
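A simulation sanity check for part (c) (stdlib only; function names, seed, and the choice n = 4 are mine): the CDFs from (a) and (b) lead to E[Y1] = 1/(n+1) and E[Yn] = n/(n+1), which Monte Carlo averages should reproduce.

```python
import random

def order_stat_means(n, reps=200_000, seed=1):
    # Monte Carlo estimates of E[Y1] and E[Yn] for n i.i.d. Uniform(0, 1) draws
    random.seed(seed)
    s_min = s_max = 0.0
    for _ in range(reps):
        xs = [random.random() for _ in range(n)]
        s_min += min(xs)
        s_max += max(xs)
    return s_min / reps, s_max / reps
```

With n = 4 the estimates should be close to 1/5 and 4/5.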
Let X1, X2, ..., Xn be a random sample from a Gamma(α, θ) distribution. That is, f(x; α, θ) = 1/(Γ(α)θ^α) · x^(α−1) e^(−x/θ), 0 < x < ∞, α > 0, θ > 0. Suppose α is known.
a. Obtain a method of moments estimator of θ, θ̃.
b. Obtain the maximum likelihood estimator of θ, θ̂.
c. Is θ̂ an unbiased estimator for θ? Justify your answer. Hint: E(X̄) = μ.
d. Find Var(θ̂). Hint: Var(X̄) = σ²/n.
e. Find MSE(θ̂).
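With α known, both the MoM and the MLE work out to θ̂ = X̄/α (my derivation: E(X) = αθ for the moment equation, and the likelihood score −nα/θ + ΣXi/θ² = 0 gives the same solution). Parts (c)–(e) can then be sanity-checked by simulation (numpy-based sketch; names, seed, and example values are mine): E(θ̂) = θ and Var(θ̂) = MSE(θ̂) = θ²/(nα).

```python
import numpy as np

def theta_hat_stats(alpha, theta, n, reps=100_000, seed=2):
    # theta_hat = Xbar / alpha; return its Monte Carlo mean and variance
    rng = np.random.default_rng(seed)
    x = rng.gamma(shape=alpha, scale=theta, size=(reps, n))
    th = x.mean(axis=1) / alpha
    return float(th.mean()), float(th.var())
```

For α = 3, θ = 2, n = 20 the simulated mean should be near 2 (unbiased) and the simulated variance near θ²/(nα) = 4/60.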
5. Let X1, X2, ..., Xn be a random sample from a distribution with pdf f(x) = (θ + 1)x^θ, 0 < x < 1.
a. What is the moment estimator for θ using the method of moments technique?
b. What is the MLE for θ?
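A numeric sketch of both answers (function names, seed, and the true θ = 2 are mine): solving E(X) = (θ+1)/(θ+2) = X̄ gives the MoM estimator θ̃ = (2X̄ − 1)/(1 − X̄), and maximizing ℓ(θ) = n ln(θ+1) + θ Σ ln xi gives θ̂ = −n/Σ ln xi − 1; sampling via the inverse CDF X = U^(1/(θ+1)) lets us check that both recover θ on a large sample.

```python
import math
import random

def mom_theta(xs):
    # MoM: E(X) = (theta + 1)/(theta + 2), solved for theta at the sample mean
    xbar = sum(xs) / len(xs)
    return (2 * xbar - 1) / (1 - xbar)

def mle_theta(xs):
    # MLE: l(theta) = n ln(theta + 1) + theta * sum(ln x_i),
    # maximized at theta_hat = -n / sum(ln x_i) - 1
    return -len(xs) / sum(math.log(x) for x in xs) - 1

def sample(theta, n, seed=3):
    # Inverse-CDF sampling: F(x) = x^(theta + 1) on (0, 1), so X = U^(1/(theta + 1))
    random.seed(seed)
    return [random.random() ** (1 / (theta + 1)) for _ in range(n)]
```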
1. Let X1, ..., Xn be a random sample from a distribution with the pdf (1/θ)e^(−x/θ), x > 0, θ ∈ Ω = (0, ∞).
(a) Find the maximum likelihood estimator of θ.
(b) Find the method of moments estimator of θ.
(c) Are the estimators in (a) and (b) unbiased?
(d) What is the variance of the estimators in (a) and (b)?
(e) Suppose the observed sample is 2.26, 0.31, 3.75, 6.92, 9.10, 7.57, 4.79, 1.41, 2.49, 0.59. Find the maximum likelihood...
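For part (e): both the MLE and the MoM estimator here reduce to the sample mean (my derivation, sketched in the comments), so the estimate is just the average of the ten observations.

```python
# Observed sample from part (e)
data = [2.26, 0.31, 3.75, 6.92, 9.10, 7.57, 4.79, 1.41, 2.49, 0.59]

# For f(x; theta) = (1/theta) e^(-x/theta), the log-likelihood is
# -n ln(theta) - sum(x_i)/theta, maximized at theta_hat = xbar;
# the first-moment equation E(X) = theta gives the same estimator.
theta_hat = sum(data) / len(data)
print(theta_hat)  # sample mean, approximately 3.919
```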