1. Let X1, ..., Xn be a random sample from a distribution with p.d.f. f(x; θ) = θx^(θ−1), 0 < x < 1, ...
2. Let X1, ..., Xn be a random sample from a distribution with p.d.f. f(x; θ) = ... for 0 < x < θ, and 0 elsewhere.
(a) Find an estimator for θ using the method of moments.
(b) Find the variance of your estimator in (a).
Question 5 [15 marks] Let X be a random variable with pdf
f_X(x) = θx^(θ−1) for 0 < x < 1, and 0 otherwise,   (1)
where θ > 0. Let X1, X2, ..., Xn, n ≥ 2, be i.i.d. random variables with pdf given by (1).
(a) Let Y = −log X, where X has pdf given by (1). Show that the pdf of Y is θe^(−θy) for y > 0, and 0 otherwise.
(b) Show that the log-likelihood given the X_i is ℓ(θ) = n log θ + (θ − 1) Σ_i log X_i. Hence show that the maximum likelihood ...
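Not part of the question, but both claims are easy to check numerically. The sketch below (Python with NumPy; the choices θ = 2 and n = 200000 are arbitrary) draws X by inverse-CDF sampling from (1), confirms that Y = −log X has mean 1/θ as an Exponential(θ) variable should, and confirms that the MLE −n / Σ log X_i lands near the true θ.

```python
import numpy as np

# Sanity check: if X has pdf theta * x**(theta - 1) on (0, 1), then
# Y = -log X should be Exponential(theta), and the MLE of theta from
# part (b) should be -n / sum(log X_i) = n / sum(Y_i).
rng = np.random.default_rng(0)
theta, n = 2.0, 200_000          # arbitrary illustrative choices

u = rng.uniform(size=n)
x = u ** (1.0 / theta)           # inverse-CDF sampling: F(x) = x**theta
y = -np.log(x)

print(round(y.mean(), 2))        # should be close to 1/theta = 0.5

theta_hat = n / y.sum()          # the MLE from part (b)
print(round(theta_hat, 2))       # should be close to theta = 2.0
```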
1. Let X1, ..., Xn be a random sample from the density
f(x; β) = ..., and 0 otherwise.
You may suppose that E(X) = ... and Var(X) = ...
(a) Find a sufficient statistic Y for β. [2]
(b) Find the maximum likelihood estimator β̂ of β and show that it is a function of Y. [2]
(c) Determine the Rao-Cramér lower bound (RCLB) for the variance of unbiased estimators of β. [3]
(d) Use the following data and the maximum likelihood estimator to give an approximate ...: 2.66, 2.02, 2.02, 0.76, 1.70, ...
Let X1, ..., Xn be a random sample from a population X with p.d.f.
f_θ(x) = θx^(θ−1) for 0 < x < 1, and 0 otherwise,
where θ > 1 is a parameter. Find the MLE of 1/θ. If it is an unbiased estimator of 1/θ, compare its variance with the Cramér-Rao lower bound.
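A Monte Carlo sketch of the comparison asked for here (Python/NumPy; θ = 3, n = 50, and the replication count are arbitrary choices). The MLE of 1/θ works out to T = −(1/n) Σ log X_i; since −log X_i ~ Exponential(θ), T is unbiased with variance 1/(nθ²), which equals the Cramér-Rao lower bound for estimating 1/θ (the Fisher information per observation is 1/θ²).

```python
import numpy as np

# Check that T = -(1/n) * sum(log X_i) is unbiased for 1/theta and that
# its variance matches the CRLB 1/(n * theta**2).
rng = np.random.default_rng(1)
theta, n, reps = 3.0, 50, 50_000   # arbitrary illustrative choices

u = rng.uniform(size=(reps, n))
x = u ** (1.0 / theta)             # inverse-CDF draw from theta * x**(theta-1)
t = -np.log(x).mean(axis=1)        # the MLE of 1/theta, one per replication

crlb = 1.0 / (n * theta**2)
print(round(t.mean(), 3))          # should be close to 1/theta ≈ 0.333
print(round(t.var() / crlb, 1))    # ratio should be close to 1.0
```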
Advanced Statistics, I need help with (c) and (d).
2. Let X1, X2, ..., Xn be a random sample from a Bernoulli(θ) distribution with probability function ... Note that, for a random variable X with a Bernoulli(θ) distribution, E[X] = θ and var[X] = θ(1 − θ).
(a) Obtain the log-likelihood function ℓ(θ), and hence show that the maximum likelihood estimator of θ is θ̂ = (1/n) Σ_{i=1}^n X_i.
(b) Show that dℓ(θ)/dθ ...
(c) Calculate the expected information I(θ) = E[...].
(d) Show ...
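For part (c), the expected information comes out to I(θ) = n / (θ(1 − θ)), and a quick simulation (Python/NumPy; θ = 0.3 and n = 40 are arbitrary choices, not from the question) can confirm that the variance of the MLE X̄ equals 1/I(θ), i.e. Var(X̄) · I(θ) ≈ 1.

```python
import numpy as np

# Check numerically that Var(MLE) * I(theta) is close to 1 for the
# Bernoulli model, where I(theta) = n / (theta * (1 - theta)).
rng = np.random.default_rng(2)
theta, n, reps = 0.3, 40, 200_000   # arbitrary illustrative choices

x = rng.binomial(1, theta, size=(reps, n))
mle = x.mean(axis=1)                # the MLE from part (a), one per replication

info = n / (theta * (1.0 - theta))  # expected information from part (c)
print(round(mle.var() * info, 1))   # should be close to 1.0
```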
7. Let X1, ..., Xn be a random sample from a Bernoulli distribution with parameter p. A random variable X with a Bernoulli distribution has probability mass function (pmf) ... with E(X) = p and Var(X) = p(1 − p).
(a) Find the method of moments (MOM) estimator of p.
(b) Find a sufficient statistic for p. (Hint: Be careful when you write the joint pmf. Don't forget to sum the whole power of each term; that is, for the second term you will have (1 ...
3. (20 points) If Y1, Y2, ..., Yn is a random sample from a distribution with pdf f_Y(y; θ) = 2y/θ², 0 ≤ y ≤ θ,
a. find cȲ, where c is a constant, that is an unbiased estimator of θ;
b. show that the variance of cȲ is less than the Cramér-Rao lower bound for f_Y(y; θ);
c. why isn't this a violation of the Cramér-Rao inequality?
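Assuming the density really is f_Y(y; θ) = 2y/θ² on [0, θ] (my reading of the garbled formula), E(Y) = 2θ/3, so c = 3/2 makes (3/2)Ȳ unbiased; its variance is θ²/(8n), while the formally computed CRLB is θ²/(4n). The bound does not apply because the support depends on θ, so the regularity conditions fail. The sketch below (Python/NumPy; θ = 2 and n = 25 are arbitrary) checks all three facts.

```python
import numpy as np

# Assumed model: f(y; theta) = 2*y/theta**2 on [0, theta].
# Then F(y) = y**2/theta**2, so Y = theta * sqrt(U) with U ~ Uniform(0, 1).
rng = np.random.default_rng(3)
theta, n, reps = 2.0, 25, 200_000      # arbitrary illustrative choices

y = theta * np.sqrt(rng.uniform(size=(reps, n)))
est = 1.5 * y.mean(axis=1)             # candidate unbiased estimator (c = 3/2)

print(round(est.mean(), 2))            # should be close to theta = 2.0
print(round(est.var(), 3))             # should be close to theta**2/(8n) = 0.02
print(theta**2 / (4 * n))              # formal "CRLB" = 0.04, which est beats
```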
Let X1, X2, ..., Xn be a random sample of size n from a normal distribution with parameters μ and σ².
a. Derive the Cramér-Rao lower bound matrix for an unbiased estimator of the vector of parameters (μ, σ²).
b. Using the Cramér-Rao lower bound, prove that the sample mean X̄ is the minimum variance unbiased estimator of μ. Is the maximum likelihood estimator of σ², σ̂² = (1/n) Σ_{i=1}^n (X_i − X̄)², unbiased?
c. Let X1, X2, ..., Xn be a random sample of size n from a normal distribution with ...
Let X1, X2, ..., Xn be a random sample from the distribution with pdf
f(x; θ) = √(...) exp(...),
for some parameter θ > 0.
(a) Find the MLE for θ.
(b) Find the Cramér-Rao lower bound for the variance of all unbiased estimators of θ.
(c) Find the asymptotic distribution of your MLE from part (a).
Let X be a random variable with p.d.f. f(x) = θx^(θ−1), for 0 < x < 1. Let X1, ..., Xn denote a random sample of size n from this distribution.
(a) Find E(X). [2]
(b) Find the method of moments estimator of θ. [2]
(c) Find the maximum likelihood estimator of θ. [3]
(d) Use the following set of observations to obtain estimates of the method of moments and maximum likelihood estimators of θ. [1 each] 0.0256, 0.3051, ...
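For (d), once the estimators from (b) and (c) are in hand, each computation is one line: since E(X) = θ/(θ + 1), the method-of-moments estimator is θ̂ = X̄/(1 − X̄), and the MLE is θ̂ = −n / Σ log X_i. The question's own observations are truncated in the source, so the sketch below (Python/NumPy) applies both formulas to a simulated sample with θ = 1.5 (an arbitrary choice).

```python
import numpy as np

# Apply the two estimators to simulated data (the question's data are
# truncated in the source); theta = 1.5 is an arbitrary true value.
rng = np.random.default_rng(4)
theta, n = 1.5, 50_000

x = rng.uniform(size=n) ** (1.0 / theta)   # inverse-CDF draw from theta * x**(theta-1)

theta_mom = x.mean() / (1.0 - x.mean())    # method of moments: Xbar/(1 - Xbar)
theta_mle = -n / np.log(x).sum()           # MLE: -n / sum(log X_i)
print(round(theta_mom, 1), round(theta_mle, 1))   # both close to 1.5
```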