a)
E[X] = 0 · P(X = 0) + 1 · P(X = 1) = 1 · θ = θ
E[X²] = 0² · P(X = 0) + 1² · P(X = 1) = 1 · θ = θ
Var[X] = E[X²] − E[X]² = θ − θ² = θ(1 − θ)
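The moment calculations above can be checked numerically. A minimal sketch, with θ = 0.3 chosen purely as an illustrative value (it is not from the problem):

```python
# Numeric check of the part (a) moments for a Bernoulli(theta) variable,
# using the illustrative value theta = 0.3 (my choice, not from the problem).
theta = 0.3

mean = 0 * (1 - theta) + 1 * theta                 # E[X] = theta
second_moment = 0**2 * (1 - theta) + 1**2 * theta  # E[X^2] = theta
variance = second_moment - mean**2                 # Var[X] = theta(1 - theta)

print(mean, second_moment, variance)
```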
b)
Let θ̂₁ = X̄ = (1/n) Σᵢ₌₁ⁿ Xᵢ. Then E[θ̂₁] = (1/n) Σᵢ E[Xᵢ] = (1/n)(nθ) = θ.
(By the central limit theorem, X̄ is approximately N(θ, θ(1−θ)/n) for large n.)
Thus, θ̂₁ is an unbiased estimator of θ.
c)
Let θ̂₂ = (1/(n+2)) Σᵢ₌₁ⁿ Xᵢ = (n/(n+2)) X̄.
As E[θ̂₂] = (n/(n+2)) E[X̄] = nθ/(n+2) ≠ θ, θ̂₂ is a biased estimator of θ.
For all n ≥ 1, (n/(n+2))² < 1,
thus Var[θ̂₂] = (n/(n+2))² Var[X̄] = (n/(n+2))² · θ(1−θ)/n < Var[θ̂₁].
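The variance comparison can be sketched numerically. This assumes the biased estimator is θ̂₂ = Σ Xᵢ/(n+2) = (n/(n+2)) X̄, consistent with the (n/(n+2))² factor above; the values of θ and n below are illustrative, not from the problem:

```python
# Sketch of the part (c) variance comparison, assuming
# theta_hat2 = sum(X_i)/(n+2) = (n/(n+2)) * Xbar.
theta, n = 0.4, 10  # illustrative values, not from the problem

var_hat1 = theta * (1 - theta) / n                     # Var[Xbar]
var_hat2 = (n / (n + 2))**2 * theta * (1 - theta) / n  # shrunk by (n/(n+2))^2

print(var_hat1, var_hat2)
```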
d)
Bias of θ̂₂:
b = E[θ̂₂] − θ = nθ/(n+2) − θ = −2θ/(n+2)
Risk = b² + Var[θ̂₂] = 4θ²/(n+2)² + nθ(1−θ)/(n+2)² = (4θ² + nθ(1−θ))/(n+2)²
Bias of θ̂₁:
b = E[θ̂₁] − θ = θ − θ = 0
Risk = b² + Var[θ̂₁] = θ(1−θ)/n
Comparing the two risks: Risk(θ̂₂) < Risk(θ̂₁) exactly when θ < (n+1)/(2n+1). So for θ close to 1 the risk of θ̂₂ is larger than that of θ̂₁ (the squared-bias term dominates, especially at small n), while for θ below 1/2 the risk of θ̂₂ is smaller than that of θ̂₁ for every n.
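The risk comparison can be illustrated by simulation. A minimal Monte Carlo sketch, assuming θ̂₁ = X̄ and θ̂₂ = Σ Xᵢ/(n+2), with θ and n values chosen for illustration only:

```python
import random

# Monte Carlo sketch of the part (d) risk (MSE) comparison, assuming
# theta_hat1 = Xbar and theta_hat2 = sum(X_i)/(n+2); the theta and n
# values used below are illustrative, not from the problem.
random.seed(0)

def mse(theta, n, shrink, reps=20000):
    """Estimate the risk E[(theta_hat - theta)^2] by simulation."""
    total = 0.0
    for _ in range(reps):
        s = sum(1 for _ in range(n) if random.random() < theta)
        est = s / (n + 2) if shrink else s / n
        total += (est - theta) ** 2
    return total / reps

# Small n with theta near 1: the shrunk estimator has the larger risk.
print(mse(0.9, 5, True), mse(0.9, 5, False))
# Moderate theta: the shrunk estimator has the smaller risk.
print(mse(0.3, 5, True), mse(0.3, 5, False))
```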