Consider a simple random sample X1, ..., Xn from a Geo(p) distribution. (a) What is the exact...
2. Let X1, ..., Xn be a random sample from a Poisson random variable X with parameter λ.
(a) Derive the method of moments estimator of λ. (4 marks)
(b) Derive the maximum likelihood estimator of λ. (4 marks)
(c) Derive the least squares estimator of λ. (4 marks)
(d) Propose an estimator for Var[X]. (4 marks)
(e) Propose an estimator for E[X²]. (4 marks)
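As a numerical illustration (not the requested derivation): for a Poisson sample, both the method of moments and maximum likelihood estimators of λ reduce to the sample mean, and since Var[X] = λ and E[X²] = λ + λ², plug-in estimators follow immediately. The data below are a made-up sample for the sketch.

```python
import numpy as np

# Hypothetical Poisson sample (invented for illustration).
x = np.array([2, 0, 3, 1, 4, 2, 1, 3])

lam_hat = x.mean()               # MoM = MLE = sample mean
var_hat = lam_hat                # Var[X] = lambda, so plug in lam_hat
ex2_hat = lam_hat + lam_hat**2   # E[X^2] = Var[X] + (E[X])^2 = lambda + lambda^2

print(lam_hat, var_hat, ex2_hat)
```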
Question 3: Bernoulli distribution (23/100 points) Consider a random sample X1, ..., Xn from a Bernoulli distribution with unknown parameter p that describes the probability that Xi is equal to 1. That is, Xi ~ Bernoulli(p), i = 1, ..., n. The maximum likelihood (ML) estimator for p is given by p̂_ML = (1/n) Σ_{i=1}^n Xi. It holds that n p̂_ML ~ Bin(n, p). 3.a) (1 point) Give the conservative 100(1 − α)% two-sided equal-tailed confidence interval for p based on p̂_ML for a given...
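A minimal sketch of one standard "conservative" interval, assuming the question intends the usual bound p(1 − p) ≤ 1/4 substituted into the Wald interval (the function name and example numbers are my own):

```python
import math
from statistics import NormalDist

def conservative_ci(p_hat, n, alpha):
    # Since p(1-p) <= 1/4 for all p, replacing it by 1/4 gives an
    # interval that is valid (conservative) whatever the true p is.
    z = NormalDist().inv_cdf(1 - alpha / 2)
    half = z / (2 * math.sqrt(n))
    return (max(0.0, p_hat - half), min(1.0, p_hat + half))

lo, hi = conservative_ci(0.6, 100, 0.05)   # e.g. 60 successes in 100 trials
```

The half-width z_{α/2}/(2√n) depends only on n, not on p̂, which is what makes the interval conservative.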
Consider a random sample X1, ..., Xn from a normal distribution with known mean 0 and unknown variance θ = σ².
(a) Write the likelihood and log-likelihood function.
(b) Derive the maximum likelihood estimator for θ.
(c) Show that the Fisher information is I(θ) = n/(2θ²).
(d) What is the variance of the maximum likelihood estimator for θ? Does it attain the Cramér-Rao lower bound?
(e) Suppose that you are testing θ = 1 versus the alternative θ ≠ ...
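For orientation (the question asks for the derivation itself): with the mean known to be 0, maximizing the log-likelihood gives θ̂ = (1/n) Σ Xi², the mean of the squared observations. A sketch with invented data:

```python
import numpy as np

# Hypothetical observations from N(0, theta); mean is known to be 0,
# so the MLE of theta = sigma^2 is the average of the squares.
x = np.array([0.5, -1.2, 0.3, 2.0, -0.7])
theta_hat = np.mean(x**2)
```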
Let X1, X2, ..., Xn be a random sample of size n from a distribution with probability density function f(x; θ). Obtain the maximum likelihood estimator θ̂ of θ. Use this maximum likelihood estimator to obtain an estimate of P[X > 4] when x1 = 0.50, x2 = 1.50, x3 = 4.00, x4 = 3.00.
a) Consider a random sample {X1, X2, ..., Xn} of X from a uniform distribution over [0, θ], where 0 < θ < ∞ and θ is unknown. Is (n + 1)X(1), where X(1) = min{X1, ..., Xn}, an unbiased estimator for θ? Please justify your answer. b) Consider a random sample {X1, X2, ..., Xn} of X from N(μ, σ²), where μ and σ² are unknown. Show that X̄² + S² is an unbiased estimator for μ² + σ², where X̄ = (1/n) Σ_{i=1}^n Xi and S² = (1/n) Σ_{i=1}^n (Xi − X̄)².
Let X1, X2, ..., Xn be a random sample from a Gamma(α, θ) distribution. That is, f(x; α, θ) = (1/(Γ(α) θ^α)) x^(α−1) e^(−x/θ), 0 < x < ∞, α > 0, θ > 0. Suppose α is known.
a. Obtain a method of moments estimator of θ, θ̃.
b. Obtain the maximum likelihood estimator of θ, θ̂.
c. Is θ̂ an unbiased estimator for θ? Justify your answer. Hint: E(X̄) = μ.
d. Find Var(θ̂). Hint: Var(X̄) = σ²/n.
e. Find MSE(θ̂).
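As a numerical companion (the data and the value of α below are invented): for this scale parametrization the log-likelihood is maximized at θ̂ = X̄/α when α is known.

```python
import numpy as np

# With alpha known, d/dtheta [ -n*alpha*log(theta) - sum(x)/theta ] = 0
# gives theta_hat = xbar / alpha.
alpha = 2.0
x = np.array([1.5, 3.2, 2.1, 4.0, 2.2])   # hypothetical Gamma(alpha, theta) data
theta_hat = x.mean() / alpha
```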
1. Let X1, ..., Xn be a random sample from a distribution with cumulative distribution function
F(x) = 0 for x ≤ 0; F(x) = (x/β)^δ for 0 < x < β; F(x) = 1 for x ≥ β.
(a) For this part, assume that δ is known and β is unknown. Find the method of moments estimator β̃_MOM of β.
(b) For this part, assume that both δ and β are unknown. Find the maximum likelihood estimators of δ and β.
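A sketch of part (a), assuming the power-function cdf F(x) = (x/β)^δ as reconstructed above (data and the value of δ are invented): the first moment is E[X] = δβ/(δ + 1), so matching it to X̄ gives β̃ = (δ + 1)X̄/δ.

```python
import numpy as np

delta = 2.0                        # known shape parameter (part a)
x = np.array([0.8, 1.5, 1.1, 0.6]) # hypothetical sample

# Solve xbar = delta*beta/(delta + 1) for beta.
beta_mom = (delta + 1) * x.mean() / delta
```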
6. Suppose that X1, ..., Xn is a random sample from a population with the probability density function f(x; θ), θ ∈ Ω. In this case, the estimator θ̂_LSE = arg min_{θ ∈ Ω} Σ_{i=1}^n (Xi − θ)² is called the least squares estimator of θ. Now, suppose that X1, ..., Xn is a random sample from N(μ, 1), μ ∈ ℝ. Prove that the least squares estimator of μ is the same as the maximum likelihood estimator of μ.
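A quick numerical illustration of the claim (not a proof; data invented): both the least squares criterion and the N(μ, 1) likelihood are maximized/minimized at the sample mean.

```python
import numpy as np

x = np.array([1.2, 0.7, 2.5, 1.9])

def sse(m):
    # Least squares criterion; for N(mu, 1) the log-likelihood is
    # a constant minus sse(m)/2, so both are optimized at the same m.
    return np.sum((x - m) ** 2)

# Brute-force minimizer over a fine grid lands on the sample mean.
grid = np.linspace(0, 3, 3001)
best = grid[np.argmin([sse(m) for m in grid])]
```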
Let X1, X2, ..., Xn denote a random sample from the Rayleigh distribution given by f(x) = (2x/θ) e^(−x²/θ) for x > 0; 0 elsewhere, with unknown parameter θ > 0. (A) Find the maximum likelihood estimator θ̂ of θ. (B) If we observe the values x1 = 0.5, x2 = 1.3, and x3 = 1.7, find the maximum likelihood estimate of θ.
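Under the density as written, setting the score to zero gives θ̂ = (1/n) Σ Xi², so part (B) is just the mean of the squared observations:

```python
import numpy as np

# log L = sum(log(2 x_i)) - n*log(theta) - sum(x_i^2)/theta;
# dL/dtheta = 0  =>  theta_hat = mean of x_i^2.
x = np.array([0.5, 1.3, 1.7])
theta_hat = np.mean(x**2)   # (0.25 + 1.69 + 2.89) / 3
```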
1. Let X1, X2, ..., Xn be a random sample of size n from a Bernoulli distribution for which p is the probability of success. We know the maximum likelihood estimator for p is p̂ = (1/n) Σ_{i=1}^n Xi. Show that p̂ is an unbiased estimator of p.
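The proof is one line of linearity: E[p̂] = (1/n) Σ E[Xi] = (1/n)(np) = p. A Monte Carlo illustration of the same fact (simulation parameters are my own choices):

```python
import numpy as np

rng = np.random.default_rng(42)
p, n = 0.3, 20

# Each draw is sum(X_i) for one Bernoulli(p) sample of size n,
# so dividing by n gives one realization of p-hat per sample.
estimates = rng.binomial(n, p, size=200_000) / n

# Averaging p-hat over many samples should settle near the true p.
avg = estimates.mean()
```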