5. Let $X_i$ ($i = 1, 2, 3$) be independent gamma random variables with $\alpha_i$...
Let $X_1$ and $X_2$ be independent random variables with $X_1 \sim N(\mu, 1)$ and $X_2 \sim N(\mu, 4)$, where $\mu \in \mathbb{R}$.
a) Show that the likelihood for $\mu$, given that $X_1 = x_1$ and $X_2 = x_2$, is
$$L(\mu) = \frac{1}{4\pi} \exp\left\{ -\frac{(x_1 - \mu)^2}{2} - \frac{(x_2 - \mu)^2}{8} \right\}.$$
b) Show that the maximum likelihood estimate of $\mu$ is
$$\hat{\mu}(x_1, x_2) = \frac{4x_1 + x_2}{5}.$$
c) Show that $\hat{\mu}(X_1, X_2) \sim N\!\left(\mu, \tfrac{4}{5}\right)$.
d) Hence give a formula for the 95% confidence interval for $\mu$.
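A quick Monte Carlo sanity check of parts (b)–(d) above (not part of the original problem; the true value $\mu = 2$ and the simulation size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
mu = 2.0  # arbitrary "true" mean for the check
n_sim = 200_000

# Draw (X1, X2) pairs: X1 ~ N(mu, 1), X2 ~ N(mu, 4)
x1 = rng.normal(mu, 1.0, n_sim)
x2 = rng.normal(mu, 2.0, n_sim)  # sd = sqrt(4) = 2

# Precision-weighted MLE from part (b)
mu_hat = (4 * x1 + x2) / 5

print(mu_hat.mean())  # close to mu = 2.0 (unbiased)
print(mu_hat.var())   # close to 4/5 = 0.8, as claimed in part (c)

# Part (d): known-variance interval mu_hat +/- 1.96 * sqrt(4/5)
half_width = 1.96 * np.sqrt(4 / 5)
covered = (mu_hat - half_width <= mu) & (mu <= mu_hat + half_width)
print(covered.mean())  # close to 0.95
```

The estimator weights $X_1$ four times as heavily as $X_2$ because $X_1$ has one quarter the variance.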
Let $X_1, X_2, \ldots, X_n$ be a random sample from a Gamma$(\alpha, \theta)$ distribution. That is,
$$f(x; \alpha, \theta) = \frac{1}{\Gamma(\alpha)\,\theta^{\alpha}}\, x^{\alpha - 1} e^{-x/\theta}, \quad 0 < x < \infty,\ \alpha > 0,\ \theta > 0.$$
Suppose $\alpha$ is known.
a. Obtain a method of moments estimator of $\theta$, $\tilde{\theta}$.
b. Obtain the maximum likelihood estimator of $\theta$, $\hat{\theta}$.
c. Is $\hat{\theta}$ an unbiased estimator of $\theta$? Justify your answer. Hint: $E(\bar{X}) = \mu$.
d. Find $\mathrm{Var}(\hat{\theta})$. Hint: $\mathrm{Var}(\bar{X}) = \sigma^2/n$.
e. Find $\mathrm{MSE}(\hat{\theta})$.
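A simulation sketch for this problem (an addition, not part of the original; parameter values are arbitrary). With $\alpha$ known and $E(X) = \alpha\theta$, both the method of moments and maximum likelihood lead to $\hat{\theta} = \bar{X}/\alpha$:

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, theta = 3.0, 2.0   # shape (known) and scale (to be estimated)
n, n_sim = 50, 20_000

# Each row is one sample of size n from Gamma(alpha, theta)
x = rng.gamma(alpha, theta, size=(n_sim, n))

# MoM and MLE both reduce to theta_hat = xbar / alpha
theta_hat = x.mean(axis=1) / alpha

print(theta_hat.mean())        # close to theta = 2.0 -> unbiased (part c)
print(theta_hat.var())         # close to theta^2/(alpha*n) (part d)
print(theta**2 / (alpha * n))  # = 4/150, the analytic variance
```

Since $\hat{\theta}$ is unbiased, the MSE in part (e) equals this variance.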
2. Let $X_1, X_2, \ldots, X_n$ be independent continuous random variables from the following distribution:
$$f(x) = \alpha x^{-(\alpha + 1)}, \quad x \geq 1,\ \alpha > 1.$$
You may use the fact: $E[X] = \frac{\alpha}{\alpha - 1}$.
2.1 Show that the maximum likelihood estimator of $\alpha$ is $\hat{\alpha}_{\mathrm{MLE}} = \frac{n}{\sum_i \log X_i}$.
2.2 Show that the method of moments estimator of $\alpha$ is $\hat{\alpha}_{\mathrm{MOM}} = \frac{\bar{X}}{\bar{X} - 1}$.
2.3 Derive a sufficient statistic for $\alpha$. What theorem are you using to determine sufficiency?
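A numerical check of 2.1 and 2.2 (an addition, with $\alpha = 3$ chosen arbitrarily), using inverse-CDF sampling for this Pareto-type density:

```python
import numpy as np

rng = np.random.default_rng(2)
alpha, n = 3.0, 100_000

# F(x) = 1 - x^(-alpha) on [1, inf), so X = U^(-1/alpha) for U ~ Uniform(0,1)
x = rng.uniform(size=n) ** (-1 / alpha)

# 2.1: MLE alpha_hat = n / sum(log X_i)
alpha_mle = n / np.log(x).sum()

# 2.2: MoM alpha_hat = xbar / (xbar - 1), from E[X] = alpha/(alpha - 1)
xbar = x.mean()
alpha_mom = xbar / (xbar - 1)

print(alpha_mle)  # close to 3.0
print(alpha_mom)  # close to 3.0 (noisier: X is heavy-tailed)
```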
Let $X_1$ and $X_2$ be independent gamma-distributed random variables, $X_1 \sim \mathrm{Gamma}(\alpha_1, 1)$ and $X_2 \sim \mathrm{Gamma}(\alpha_2, 1)$. Find the marginal distributions of $X_1/(X_1 + X_2)$ and $X_2/(X_1 + X_2)$.
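A simulation hint for this problem (an addition; parameter values are arbitrary): the ratio $X_1/(X_1 + X_2)$ should match the first two moments of a Beta$(\alpha_1, \alpha_2)$ distribution.

```python
import numpy as np

rng = np.random.default_rng(3)
a1, a2, n = 2.0, 5.0, 200_000

x1 = rng.gamma(a1, 1.0, n)
x2 = rng.gamma(a2, 1.0, n)
w = x1 / (x1 + x2)  # claim: W ~ Beta(a1, a2)

print(w.mean())  # ~ a1/(a1+a2) = 2/7
print(w.var())   # ~ a1*a2 / ((a1+a2)^2 * (a1+a2+1)) = 10/392
```

By symmetry, $X_2/(X_1 + X_2) \sim \mathrm{Beta}(\alpha_2, \alpha_1)$.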
2. Let $X_1, X_2, \ldots, X_n$ be independent continuous random variables from the following distribution:
$$f(x) = \alpha x^{-(\alpha + 1)}, \quad x \geq 1,\ \alpha > 1.$$
You may use the fact: $E[X] = \frac{\alpha}{\alpha - 1}$.
2.1 Show that the maximum likelihood estimator of $\alpha$ is $\hat{\alpha}_{\mathrm{MLE}} = \frac{n}{\sum_i \log X_i}$.
2.3 Derive a sufficient statistic for $\alpha$. What theorem are you using to determine sufficiency?
2.4 Show that the Fisher information in the whole sample is $I(\alpha) = \frac{n}{\alpha^2}$.
2.5 What Cramér–Rao lower bound...
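A sketch of the calculation behind 2.4, using the density as written above (one observation first, then multiply by $n$ by independence):

```latex
\log f(x;\alpha) = \log\alpha - (\alpha + 1)\log x,
\qquad
\frac{\partial^2}{\partial\alpha^2}\log f(x;\alpha) = -\frac{1}{\alpha^2},
\]
\[
I(\alpha) = -n\,E\!\left[\frac{\partial^2}{\partial\alpha^2}\log f(X;\alpha)\right]
          = \frac{n}{\alpha^2},
\qquad
\text{so the Cramér–Rao bound for unbiased estimators is } \frac{\alpha^2}{n}.
```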
Question 5 [15 marks] Let $X$ be a random variable with pdf
$$f_X(x) = \begin{cases} \theta x^{\theta - 1}, & 0 < x < 1 \\ 0, & \text{otherwise}, \end{cases} \tag{1}$$
where $\theta > 0$. Let $X_1, X_2, \ldots, X_n$, $n \geq 2$, be i.i.d. random variables with pdf given by (1).
(a) Let $Y = -\log X$, where $X$ has pdf given by (1). Show that the pdf of $Y$ is
$$f_Y(y) = \begin{cases} \theta e^{-\theta y}, & y > 0 \\ 0, & \text{otherwise}. \end{cases}$$
(b) Show that the log-likelihood given the $X_i$ is
$$\ell(\theta) = n \log \theta + (\theta - 1) \sum_{i=1}^{n} \log X_i.$$
Hence show that the maximum likelihood...
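A simulation sketch for parts (a) and (b) (an addition; $\theta = 2.5$ is an arbitrary choice). Maximising the log-likelihood above gives $\hat{\theta} = -n/\sum_i \log X_i$, which is presumably where part (b) is heading:

```python
import numpy as np

rng = np.random.default_rng(4)
theta, n = 2.5, 100_000

# F(x) = x^theta on (0,1), so inverse-CDF sampling gives X = U^(1/theta)
x = rng.uniform(size=n) ** (1 / theta)

# Part (a): Y = -log X should be Exponential(theta), i.e. mean 1/theta
y = -np.log(x)
print(y.mean())  # close to 1/theta = 0.4

# Part (b): setting d/dtheta [n*log(theta) + (theta-1)*sum(log X)] = 0
# gives theta_hat = -n / sum(log X_i)
theta_hat = -n / np.log(x).sum()
print(theta_hat)  # close to 2.5
```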
Let $X_1$ and $X_2$ be independent random variables with mean $\mu$ and variance $\sigma^2$. Suppose that we have two estimators of $\mu$:
$$\hat{\theta}_1 = \frac{X_1 + X_2}{2} \quad \text{and} \quad \hat{\theta}_2 = \frac{X_1 + 3X_2}{2}.$$
(a) Are both estimators unbiased estimators of $\mu$?
(b) What is the variance of each estimator?
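A simulation check of (a) and (b), taking the estimators exactly as written, with denominators of 2 (an addition; the problem only assumes a common mean and variance, so the normal draws, $\mu = 3$, and $\sigma = 2$ below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(5)
mu, sigma, n_sim = 3.0, 2.0, 400_000

# Any distribution with mean mu and variance sigma^2 would do here
x1 = rng.normal(mu, sigma, n_sim)
x2 = rng.normal(mu, sigma, n_sim)

t1 = (x1 + x2) / 2
t2 = (x1 + 3 * x2) / 2

print(t1.mean(), t2.mean())  # t1 ~ mu, but t2 ~ 2*mu: t2 is biased
print(t1.var())              # ~ sigma^2/2 = 2.0
print(t2.var())              # ~ (1 + 9)*sigma^2/4 = 10.0
```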
Let $X_1, X_2, \ldots, X_n$ be iid exponential random variables with unknown mean β. (b) Find the maximum likelihood estimator of β. (c) Determine whether the maximum likelihood estimator is unbiased for β. (d) Find the mean squared error of the maximum likelihood estimator of β. (e) Find the Cramér–Rao lower bound for the variances of unbiased estimators of β. (f) What is the UMVUE (uniformly minimum variance unbiased estimator) of β? What is your reason? (g) Determine the asymptotic distribution of the...
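A simulation sketch for parts (b)–(e) (an addition; β = 2 and the sample sizes are arbitrary). For the exponential with mean β, the MLE is the sample mean $\hat{\beta} = \bar{X}$:

```python
import numpy as np

rng = np.random.default_rng(6)
beta, n, n_sim = 2.0, 40, 50_000

# Each row: one sample of size n from an exponential with mean beta
x = rng.exponential(beta, size=(n_sim, n))
beta_hat = x.mean(axis=1)  # the MLE of the mean is the sample mean (b)

print(beta_hat.mean())  # close to beta -> unbiased (c)
print(beta_hat.var())   # close to beta^2/n = 0.1, which here equals
                        # both the MSE (d) and the Cramer-Rao bound (e)
```

That the MLE is unbiased and attains the bound is what makes it the UMVUE in part (f).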
3. Let $Y_1, \ldots, Y_n$ be independent normally distributed random variables with $E(Y_i) = \beta a_i$ and $V(Y_i) = 1$. Recall that the normal density with mean $\mu$ and variance $\sigma^2$ is given by
$$f(y) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left\{ -\frac{(y - \mu)^2}{2\sigma^2} \right\}.$$
(a) Find the maximum likelihood estimator $\hat{\beta}$ of $\beta$.
(b) Show that $\hat{\beta}$ is unbiased.
(c) Determine the distribution of $\hat{\beta}$.
(d) Recall that the likelihood ratio test of $H_0: \theta = \theta_0$ against $H_1: \theta \neq \theta_0$ is to reject $H_0$ if $L(\theta_0)$...
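A simulation sketch of parts (a)–(c), assuming the reading $E(Y_i) = \beta a_i$ with known constants $a_i$ (an addition; the values of β and the $a_i$ below are arbitrary). Maximising the likelihood gives $\hat{\beta} = \sum a_i Y_i / \sum a_i^2$:

```python
import numpy as np

rng = np.random.default_rng(7)
beta = 1.5
a = np.array([0.5, 1.0, 1.5, 2.0, 2.5])  # arbitrary known constants a_i
n_sim = 300_000

# Y_i ~ N(beta * a_i, 1), replicated n_sim times
y = rng.normal(beta * a, 1.0, size=(n_sim, a.size))

# Candidate MLE from part (a): beta_hat = sum(a_i * Y_i) / sum(a_i^2)
beta_hat = (y * a).sum(axis=1) / (a**2).sum()

print(beta_hat.mean())   # close to beta = 1.5 (part b: unbiased)
print(beta_hat.var())    # close to 1/sum(a_i^2) (part c: N(beta, 1/sum a_i^2))
print(1 / (a**2).sum())  # = 1/13.75, the analytic variance
```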