4. Let X_1, …, X_n be a random sample from a random variable X with probability density function
   f(x; θ) = (1/(2θ^3)) x^2 e^{−x/θ}, 0 < x < ∞, 0 < θ < ∞.
(a) Find the likelihood function, L(θ), and the log-likelihood function, ℓ(θ).
(b) Find the maximum likelihood estimator of θ, θ̂.
(c) Is θ̂ unbiased?
(d) What is the distribution of X? Find the moment estimator of θ, θ̃.
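The pdf above is that of a Gamma distribution with shape 3 and scale θ, so maximizing the log-likelihood should give θ̂ = X̄/3. The following is a hedged simulation sketch (not part of the problem) that sanity-checks that candidate answer numerically; the true θ = 2.0 used here is an arbitrary assumption for the check.

```python
import numpy as np

# f(x; theta) = x^2 e^{-x/theta} / (2 theta^3) is the Gamma(shape=3, scale=theta) pdf.
# Candidate MLE (a sketch, to be verified analytically): theta_hat = X_bar / 3.
rng = np.random.default_rng(0)
theta = 2.0          # assumed true value for the simulation
n = 100_000
x = rng.gamma(shape=3, scale=theta, size=n)
theta_hat = x.mean() / 3
print(theta_hat)     # should land close to theta = 2.0
```

If the candidate estimator were wrong, the printed value would drift systematically away from the true θ as n grows.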
Let X_1, X_2, …, X_n be a random sample from a Gamma(α, θ) distribution. That is,
   f(x; α, θ) = (1/(Γ(α) θ^α)) x^{α−1} e^{−x/θ}, 0 < x < ∞, α > 0, θ > 0.
Suppose α is known.
a. Obtain a method of moments estimator of θ, θ̃.
b. Obtain the maximum likelihood estimator of θ, θ̂.
c. Is θ̂ an unbiased estimator for θ? Justify your answer. Hint: E(X̄) = μ.
d. Find Var(θ̂). Hint: Var(X̄) = σ²/n.
e. Find MSE(θ̂).
Let X_1, …, X_n be a random sample from a normal random variable X with E(X) = 0 and Var(X) = θ, i.e., X ~ N(0, θ).
(a) What is the pdf of X?
(b) Find the likelihood function, L(θ), and the log-likelihood function, ℓ(θ).
(c) Find the maximum likelihood estimator of θ, θ̂.
(d) Is θ̂ unbiased?
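Since θ here plays the role of the variance of a mean-zero normal, the MLE should come out as θ̂ = (1/n) Σ X_i². A small simulation sketch (an informal check, with θ = 4.0 chosen arbitrarily) to accompany the derivation:

```python
import numpy as np

# For X ~ N(0, theta) with theta the variance, the candidate MLE is
# theta_hat = (1/n) * sum(X_i^2).  Simulation sanity check only.
rng = np.random.default_rng(1)
theta = 4.0          # assumed true variance for the check
n = 100_000
x = rng.normal(loc=0.0, scale=np.sqrt(theta), size=n)
theta_hat = np.mean(x ** 2)
print(theta_hat)     # should land close to theta = 4.0
```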
(5) Let X_i, i = 1, …, n, be an iid sample from the density
   f_X(x) = k e^{−x²/(2θ)} 1(x > 0), θ > 0.
(a) Find k.
(b) Find E(X).
(c) Find Var(X).
(d) Find the MLE for θ.
(e) Find the MOM estimator for θ.
(f) Find the bias of the MLE.
(g) Find the MSE of the MLE.
(h) Let Y = X̄; find the probability density function of Y.
(i) Let Y = X²; find the cumulative distribution function of Y.
Let X_1, …, X_n be a random sample from a distribution with the pdf
   f(x; θ) = θ(1 + θ) x^{θ−1} (1 − x), 0 < x < 1, θ > 0.
The estimator T = 2X̄/(1 − X̄) is a method of moments estimator for θ. It can be shown that the asymptotic distribution of T is normal with E(T) ≈ θ and Var(T) ≈ θ(θ + 2)²/(2n(θ + 3)).
Apply the integral transform method (provide an equation that should be solved to obtain random observations from the distribution) to generate a sample of...
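For the integral transform step, integrating the pdf gives the CDF F(x) = (1 + θ)x^θ − θx^{θ+1}, and a draw is obtained by solving F(x) = u for u ~ Uniform(0, 1). A hedged sketch of that recipe, using plain bisection and an arbitrarily assumed θ = 3 (for which E(X) = θ/(θ+2) = 0.6 serves as a sanity check):

```python
import numpy as np

# CDF of f(x; theta) = theta*(1+theta)*x^(theta-1)*(1-x) on (0, 1):
#   F(x) = (1+theta)*x^theta - theta*x^(theta+1)
def cdf(x, theta):
    return (1 + theta) * x ** theta - theta * x ** (theta + 1)

# Integral-transform method: solve F(x) = u by bisection (F is increasing on [0,1]).
def inv_cdf(u, theta, tol=1e-10):
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if cdf(mid, theta) < u:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

rng = np.random.default_rng(2)
theta = 3.0          # assumed parameter value for the demonstration
sample = np.array([inv_cdf(u, theta) for u in rng.uniform(size=20_000)])
print(sample.mean())  # should be near E(X) = theta/(theta+2) = 0.6
```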
Suppose X_1, X_2, …, X_n is an iid sample from f_X(x; θ) = θ(1 − x)^{θ−1} 1(0 < x < 1), where θ > 0.
(a) Find the method of moments (MOM) estimator of θ.
(b) Find the maximum likelihood estimator (MLE) of θ.
(c) Find the MLE of P_θ(X ≤ 1/2).
(d) Is there a function of θ, say τ(θ), for which there exists an unbiased estimator whose variance attains the Cramér-Rao Lower Bound? If so, find it and identify the corresponding estimator. If not, show why not.
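For part (b), setting the score n/θ + Σ log(1 − x_i) to zero suggests θ̂ = −n / Σ log(1 − X_i). A quick simulation sketch to check that candidate (θ = 2.5 is an assumed value; draws use the inversion X = 1 − U^{1/θ}, since F(x) = 1 − (1 − x)^θ):

```python
import numpy as np

# Candidate MLE for f(x; theta) = theta*(1-x)^(theta-1) on (0,1):
#   theta_hat = -n / sum(log(1 - x_i))   (to be verified analytically)
rng = np.random.default_rng(3)
theta = 2.5          # assumed true value for the check
n = 100_000
x = 1 - rng.uniform(size=n) ** (1 / theta)   # inverse-CDF draws
theta_hat = -n / np.log(1 - x).sum()
print(theta_hat)     # should land close to theta = 2.5
```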
4. Let X be a continuous random variable whose PDF is given by
   f(x; θ) = θ/x^{θ+1} for x > 1, and 0 otherwise,
where θ > 0 is an unknown scalar parameter.
(a) Find the maximum likelihood estimator of θ.
(b) Find a method-of-moments estimator of θ for the case when θ > 1.
(c) Why can we not find a method-of-moments estimator when θ < 1?
Let X_1, X_2, …, X_n be a random sample of size n from a distribution with probability density function f(x; θ). Obtain the maximum likelihood estimator of θ, θ̂. Use this maximum likelihood estimator to obtain an estimate of P[X > 4] when x_1 = 0.50, x_2 = 1.50, x_3 = 4.00, x_4 = 3.00.
2. Suppose X_1, X_2, …, X_n are a random sample from
   f_X(x; θ) = (1/θ) e^{−x/θ}, x > 0, θ > 0; and 0 otherwise.
Note: If X ~ f_X(x; θ), then X ~ Exp(θ).
(a) Find the CRLB of any unbiased estimator of θ.
(b) Is the MLE for θ the MVUE?
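For the exponential family with mean θ, the CRLB for unbiased estimators of θ works out to θ²/n, and the MLE X̄ is unbiased with exactly that variance. A hedged empirical check of that claim (θ = 2, n = 50 are assumed values for the simulation):

```python
import numpy as np

# For f(x; theta) = (1/theta) e^{-x/theta}, the MLE is X_bar, and the
# candidate CRLB is theta^2 / n.  Compare Var(X_bar) across many replications.
rng = np.random.default_rng(4)
theta, n, reps = 2.0, 50, 40_000
xbar = rng.exponential(scale=theta, size=(reps, n)).mean(axis=1)
print(xbar.var())      # empirical variance of the MLE
print(theta**2 / n)    # candidate CRLB = 0.08
```

Since the empirical variance matches the bound, the unbiased MLE attains the CRLB here, which is the key fact behind part (b).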
1. Let X_1, …, X_n be a random sample from a distribution with p.d.f.
   f(x; θ) = θ x^{θ−1}, 0 < x < 1,
where θ > 0.
(a) Find a sufficient statistic Y for θ.
(b) Show that the maximum likelihood estimator θ̂ is a function of Y.
(c) Determine the Rao-Cramér lower bound for the variance of unbiased estimators of θ.
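Here the log-likelihood is n log θ + (θ − 1) Σ log x_i, which suggests θ̂ = −n / Σ log X_i, a function of the sufficient statistic Σ log X_i (equivalently Π X_i). A simulation sketch of that candidate answer (θ = 4 assumed; draws use the inversion X = U^{1/θ}):

```python
import numpy as np

# Candidate MLE for f(x; theta) = theta * x^(theta-1) on (0, 1):
#   theta_hat = -n / sum(log x_i),  a function of sum(log X_i).
rng = np.random.default_rng(5)
theta = 4.0          # assumed true value for the check
n = 100_000
x = rng.uniform(size=n) ** (1 / theta)   # inverse-CDF draws, F(x) = x^theta
theta_hat = -n / np.log(x).sum()
print(theta_hat)     # should land close to theta = 4.0
```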