(a)
The pdf is \( f(x;\theta) = \theta x^{-(\theta+1)} \) for \( x > 1 \) (and 0 otherwise), where \( \theta > 0 \).
The likelihood function for independent samples X1, X2, ..., Xn is
\[ L(\theta) = \prod_{i=1}^{n} \theta x_i^{-(\theta+1)} = \theta^{n} \Big( \prod_{i=1}^{n} x_i \Big)^{-(\theta+1)}, \]
for xi > 1.
The log-likelihood function is
\[ \ell(\theta) = n \ln\theta - (\theta+1) \sum_{i=1}^{n} \ln x_i. \]
For the maximum likelihood estimate, set the derivative to zero:
\[ \frac{d\ell}{d\theta} = \frac{n}{\theta} - \sum_{i=1}^{n} \ln x_i = 0. \]
Since \( \ell''(\theta) = -n/\theta^2 < 0 \), this critical point is a maximum. Thus, the maximum likelihood estimate of \( \theta \) is
\[ \hat{\theta} = \frac{n}{\sum_{i=1}^{n} \ln x_i}. \]
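As a quick numerical sanity check (a sketch, not part of the original solution; the helper names are illustrative), the closed form \( \hat{\theta} = n / \sum \ln x_i \) can be verified on simulated data. The sampler below draws from the pdf by inverse-CDF transformation, using \( F(x) = 1 - x^{-\theta} \) for \( x > 1 \):

```python
import math
import random

def sample_pareto(theta, n, seed=0):
    # Inverse-CDF sampling: F(x) = 1 - x^(-theta) for x > 1 gives
    # F^{-1}(u) = (1 - u)^(-1/theta); since u and 1 - u are both
    # Uniform(0, 1), u**(-1/theta) is an equivalent draw.
    rng = random.Random(seed)
    return [rng.random() ** (-1.0 / theta) for _ in range(n)]

def mle(xs):
    # theta_hat = n / sum(ln x_i), the estimator derived in part (a).
    return len(xs) / sum(math.log(x) for x in xs)

xs = sample_pareto(theta=3.0, n=50_000)
theta_hat = mle(xs)
```

With 50,000 samples the estimate should land within a few hundredths of the true value 3, consistent with the asymptotic standard error \( \theta/\sqrt{n} \) implied by the Fisher information \( I(\theta) = 1/\theta^2 \).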
(b)
The first moment (mean) of f(x) is
\[ E[X] = \int_{1}^{\infty} x \cdot \theta x^{-(\theta+1)}\,dx = \theta \int_{1}^{\infty} x^{-\theta}\,dx = \theta \left[ \frac{x^{-\theta+1}}{-\theta+1} \right]_{1}^{\infty}. \tag{1} \]
For \( \theta > 1 \): \( -\theta + 1 < 0 \Rightarrow x^{-\theta+1} \to 0 \) as \( x \to \infty \).
Thus,
\[ E[X] = \theta \cdot \frac{0 - 1}{-\theta+1} = \frac{\theta}{\theta-1}. \]
Equating this to the sample mean \( \bar{X} \) and solving for \( \theta \), the method of moments estimator is
\[ \tilde{\theta} = \frac{\bar{X}}{\bar{X}-1}, \]
where \( \bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i \).
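The estimator \( \tilde{\theta} = \bar{X}/(\bar{X}-1) \) can likewise be checked by simulation (an illustrative sketch; the helper name is hypothetical, and the draws use the inverse CDF \( u \mapsto u^{-1/\theta} \)):

```python
import random

def mom_estimate(xs):
    # theta_tilde = xbar / (xbar - 1), from equating E[X] = theta/(theta-1)
    # with the sample mean; only meaningful when xbar > 1.
    xbar = sum(xs) / len(xs)
    return xbar / (xbar - 1.0)

rng = random.Random(42)
theta = 3.0
# Inverse-CDF draws from f(x) = theta * x^(-(theta+1)), x > 1.
xs = [rng.random() ** (-1.0 / theta) for _ in range(50_000)]
theta_tilde = mom_estimate(xs)
```

For \( \theta = 3 \) the moments needed are finite (the variance exists since \( \theta > 2 \)), so \( \tilde{\theta} \) should fall close to 3.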
(c)
For \( \theta = 1 \), the first moment of \( f(x;\theta) \) is \( \int_{1}^{\infty} x^{-1}\,dx \), which diverges, so the mean is undefined.
For \( 0 < \theta < 1 \):
\( -\theta + 1 > 0 \Rightarrow x^{-\theta+1} \to \infty \) as \( x \to \infty \).
So, by equation (1), the first moment (mean) of f(x) is undefined for \( 0 < \theta \le 1 \). Hence we cannot find a method of moments estimator for \( 0 < \theta \le 1 \); \( \tilde{\theta} \) exists only when \( \theta > 1 \).
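The failure for \( \theta \le 1 \) can be seen concretely in simulation (an illustrative sketch, not part of the derivation; the helper name is hypothetical). With \( \theta = 0.5 \) the sample mean keeps growing with \( n \) instead of converging, because \( E[X] \) is infinite, so equating \( \bar{X} \) to a population mean has no target:

```python
import random

def sample_mean_pareto(theta, n, seed):
    # Mean of n inverse-CDF draws from f(x) = theta * x^(-(theta+1)), x > 1.
    rng = random.Random(seed)
    return sum(rng.random() ** (-1.0 / theta) for _ in range(n)) / n

# For theta = 0.5 the population mean is infinite, so sample means blow up
# as n grows; for theta = 3 they settle near E[X] = 3/2.
means_heavy = [sample_mean_pareto(0.5, n, seed=7) for n in (10**2, 10**4, 10**6)]
mean_light = sample_mean_pareto(3.0, 10**4, seed=7)
```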