Please give detailed steps. Thank you. 5. Let {X_i : i = 1, ..., n} denote a random sample of size n from a population described by a random variable X following a Poisson(θ) distribution, with pmf P(X = x) = θ^x e^(−θ)/x!, x = 0, 1, 2, .... You may take it as given that E(X) = θ and Var(X) = θ (i.e. you do not need to show these). a. Recall that an estimator is efficient if it satisfies 2 conditions: 1) it is unbiased, and 2) it achieves the Cramér-Rao Lower Bound (CRLB) for unbiased estimators. Show that...
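The truncated part of (a) presumably asks to show that the sample mean attains the CRLB; the following is a minimal simulation sketch under that assumption, comparing Var(X̄) with the bound 1/(n·I(θ)) = θ/n for the Poisson model (the numbers are illustrative, not from the question).

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 4.0, 50, 20_000          # hypothetical true mean, sample size, replications

# Fisher information for one Poisson(theta) observation is I(theta) = 1/theta,
# so the CRLB for an unbiased estimator based on n observations is theta / n.
crlb = theta / n

# Monte Carlo variance of the sample mean (which is unbiased for theta)
xbar = rng.poisson(theta, size=(reps, n)).mean(axis=1)
print("Var(X_bar) ~", xbar.var(ddof=1), " CRLB =", crlb)   # the two should agree closely
```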
Question 3 [17 marks] The random variable X is distributed exponentially with parameter λ, i.e. X ~ Exp(λ), so that the probability density function (pdf) of X is f_X(x) = (1/λ) e^(−x/λ) for x ≥ 0, and f_X(x) = 0 otherwise. (a) Let Y = −log(X). When λ = 1, (i) show that the pdf of Y is f_Y(y) = e^(−(y + e^(−y))); (ii) derive the moment generating function of Y, M_Y(t), and give the values of t for which M_Y(t) is well defined. (b) Suppose that Xi, i...
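For part (a)(ii), a standard route is E[e^(tY)] = E[X^(−t)] = ∫_0^∞ x^(−t) e^(−x) dx = Γ(1 − t), which converges only for t < 1. Below is a quick numerical check of that claim; it is a sketch, not part of the original question, and the chosen t values are mine.

```python
import numpy as np
from scipy.special import gamma as gamma_fn

rng = np.random.default_rng(1)
x = rng.exponential(scale=1.0, size=200_000)    # X ~ Exp(1)
y = -np.log(x)                                  # Y = -log(X), Gumbel-distributed

for t in (-1.0, -0.5, 0.0, 0.4):                # MGF exists only for t < 1
    mc = np.exp(t * y).mean()                   # Monte Carlo estimate of E[e^{tY}]
    print(f"t={t:5}: simulated {mc:.3f}  vs  Gamma(1-t) = {gamma_fn(1 - t):.3f}")
```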
3. Let X_1, ..., X_n be a random sample from a population with pdf f(x; θ) = ... for x in its support, and 0 otherwise, where θ > 0. (a) Find the method of moments estimator of θ. (b) Find the MLE θ̂ of θ. (c) Find the pdf of θ̂ in (b).
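Since the explicit pdf is lost here, the sketch below only illustrates the two recipes the question asks for, using a stand-in Uniform(0, θ) model chosen purely for illustration (it is not the distribution the question intends).

```python
import numpy as np

rng = np.random.default_rng(2)
theta_true, n = 3.0, 200
x = rng.uniform(0.0, theta_true, size=n)   # stand-in model: X ~ Uniform(0, theta)

# Method of moments: E(X) = theta/2, so set X_bar = theta/2 and solve for theta.
theta_mom = 2.0 * x.mean()

# MLE: the likelihood theta^(-n) * 1{max x_i <= theta} is maximised at the sample maximum.
theta_mle = x.max()

print("MoM:", theta_mom, " MLE:", theta_mle)
```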
Consider the pdf f(x) = 3x^2/θ^3 for 0 < x < θ and zero otherwise, with parameter θ > 0. We wish to estimate θ using the sample maximum from a random sample (iid) of size n, θ̂_n = max_i X_i. (Hint: first find the CDF and PDF of the estimator.) a. Show this estimator is consistent. b. Show this estimator is biased. c. Suggest a better estimator and show that it is UC. d. Show that n(θ − θ̂_n) converges (using the original estimator,...
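Under this density the CDF of the maximum is P(θ̂_n ≤ t) = (t/θ)^(3n) on (0, θ), which gives E(θ̂_n) = 3nθ/(3n + 1), so ((3n + 1)/(3n))·max_i X_i is a natural bias-corrected candidate for part (c). The simulation sketch below checks that reasoning; the correction factor is my own derivation, not quoted from the question.

```python
import numpy as np

rng = np.random.default_rng(3)
theta, n, reps = 2.0, 20, 50_000

# Draw from f(x) = 3x^2/theta^3 on (0, theta) via inverse CDF: F(x) = (x/theta)^3.
u = rng.uniform(size=(reps, n))
x = theta * u ** (1.0 / 3.0)

max_hat = x.max(axis=1)                          # biased estimator: sample maximum
corrected = (3 * n + 1) / (3 * n) * max_hat      # candidate bias-corrected estimator

print("E[max]       ~", max_hat.mean(), " (theory: 3n*theta/(3n+1) =", 3 * n * theta / (3 * n + 1), ")")
print("E[corrected] ~", corrected.mean(), " (should be close to theta =", theta, ")")
```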
4. Let X_1, . . . , X_n be a random sample from the inverse Gaussian distribution, IG(μ, λ), whose pdf is f(x; μ, λ) = sqrt(λ/(2π x^3)) exp(−λ(x − μ)^2/(2μ^2 x)), x > 0. (a) Show that the MLEs of μ and λ are μ̂ = X̄ and λ̂ = n / ∑_{i=1}^n (1/X_i − 1/X̄). (b) It is known that nλ/λ̂ ~ χ²_{n−1}. Use this to derive a 100·(1 − α)% CI for λ.
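Taking the pivot nλ/λ̂ ~ χ²_{n−1} at face value, inverting P(χ²_{α/2, n−1} ≤ nλ/λ̂ ≤ χ²_{1−α/2, n−1}) = 1 − α gives the interval [λ̂·χ²_{α/2, n−1}/n, λ̂·χ²_{1−α/2, n−1}/n]. A sketch of that computation follows; the function name and example numbers are mine.

```python
import numpy as np
from scipy import stats

def ci_for_lambda(x, alpha=0.05):
    """CI for lambda from the pivot n*lambda/lambda_hat ~ chi2(n-1)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    lam_hat = n / np.sum(1.0 / x - 1.0 / x.mean())          # MLE of lambda
    lo = lam_hat * stats.chi2.ppf(alpha / 2, df=n - 1) / n
    hi = lam_hat * stats.chi2.ppf(1 - alpha / 2, df=n - 1) / n
    return lo, hi

# quick check on simulated IG(mu=2, lambda=5) data (scipy parameterises IG as invgauss(mu/lambda, scale=lambda))
sample = stats.invgauss.rvs(2.0 / 5.0, scale=5.0, size=40, random_state=np.random.default_rng(4))
print(ci_for_lambda(sample))
```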
Let X_1, . . . , X_n be a sample taken from the Gamma distribution Γ(2, θ^−1) with pdf f(x, θ) = θ^2 x exp(−θx) if x ≥ 0, θ ∈ (0, ∞), and 0 otherwise. (A) Show that Y = ∑_{i=1}^n X_i is a complete and sufficient statistic. (B) Find E(1/Y). Hint: If W ∼ χ²(k) then E(W^m) = 2^m Γ(k/2 + m)/Γ(k/2) for m > −k/2. Note also that Γ(n) = (n − 1)!, n ∈ N*. Facts from 1(C) are useful:...
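For (B): Y ~ Γ(2n, θ^−1) (rate θ), so 2θY ∼ χ²(4n), and the hint with m = −1 gives E((2θY)^−1) = 1/(4n − 2), hence E(1/Y) = θ/(2n − 1). A quick Monte Carlo check of that value (a sketch; the numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
theta, n, reps = 1.5, 10, 200_000

# Each X_i ~ Gamma(shape=2, rate=theta); numpy's gamma takes scale = 1/rate.
x = rng.gamma(shape=2.0, scale=1.0 / theta, size=(reps, n))
y = x.sum(axis=1)                                 # Y ~ Gamma(2n, rate=theta)

print("simulated E(1/Y):", (1.0 / y).mean())
print("claimed value theta/(2n-1):", theta / (2 * n - 1))
```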
Let X_1, ..., X_n be a random sample from a normal random variable X with E(X) = 0 and Var(X) = θ, i.e., X ~ N(0, θ). (a) What is the pdf of X? (b) Find the likelihood function, L(θ), and the log-likelihood function, ℓ(θ). (c) Find the maximum likelihood estimator of θ, θ̂. (d) Is θ̂ unbiased?
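For reference on (c)-(d): differentiating ℓ(θ) = −(n/2)log(2πθ) − ∑x_i²/(2θ) gives θ̂ = (1/n)∑X_i², and E(θ̂) = θ since E(X_i²) = θ. A small sketch checking unbiasedness numerically (the specific numbers are mine):

```python
import numpy as np

rng = np.random.default_rng(6)
theta, n, reps = 2.5, 30, 50_000

x = rng.normal(loc=0.0, scale=np.sqrt(theta), size=(reps, n))
theta_hat = (x ** 2).mean(axis=1)       # MLE: (1/n) * sum(X_i^2), since the mean is known to be 0

print("mean of theta_hat:", theta_hat.mean(), " true theta:", theta)   # unbiasedness check
```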
1. Let X_1, X_2, X_3, ... be random variables, where each X_i is uniformly distributed on [0, 1]. The random variables X_1, X_2, X_3, ... are independent. The random variable N is the first integer n ≥ 1 such that X_n ≥ c, where 0 < c < 1 is a constant. That is, N = min{n : X_n ≥ c}. What is E[N]?
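N is geometric with success probability p = P(X_n ≥ c) = 1 − c, so E[N] = 1/(1 − c). A simulation sketch of that answer (c = 0.7 is just an example value):

```python
import numpy as np

rng = np.random.default_rng(7)
c, reps = 0.7, 50_000

def first_hit(rng, c):
    """Draw Uniform[0,1] variables until one is >= c; return how many draws were needed."""
    n = 1
    while rng.uniform() < c:
        n += 1
    return n

samples = [first_hit(rng, c) for _ in range(reps)]
print("simulated E[N]:", np.mean(samples), " theory 1/(1-c):", 1.0 / (1.0 - c))
```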
1. A certain continuous distribution has cumulative distribution function (CDF) given by F(x) = 0 for x < 0 and F(x) = ... otherwise, where θ is an unknown parameter, θ > 0. Let X̄_n be the sample mean and X_(n) = max(X_1, X_2, ..., X_n). (i) Show that θ̂_{1n} = (1 + ...)X̄_n is an unbiased estimator of θ. Find its mean square error and check whether θ̂_{1n} is consistent for θ. (ii) Show that nX_(n) is a consistent estimator of θ. (iii) Assume n > 1, find the MSE of θ̂_{2n}, and compare...
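Because the CDF did not survive extraction, the sketch below is only a generic illustration of the bias/MSE comparison that parts (i)-(iii) call for, using a stand-in Uniform(0, θ) model in which 2X̄_n and ((n+1)/n)X_(n) play the roles of the mean-based and maximum-based unbiased estimators; none of it is quoted from the question.

```python
import numpy as np

rng = np.random.default_rng(8)
theta, n, reps = 4.0, 25, 100_000

x = rng.uniform(0.0, theta, size=(reps, n))      # stand-in model: Uniform(0, theta)

est_mean = 2.0 * x.mean(axis=1)                  # unbiased estimator built from the sample mean
est_max = (n + 1) / n * x.max(axis=1)            # unbiased estimator built from the sample maximum

for name, est in [("2*Xbar", est_mean), ("((n+1)/n)*max", est_max)]:
    mse = np.mean((est - theta) ** 2)
    print(f"{name:15s} bias ~ {est.mean() - theta:+.4f}   MSE ~ {mse:.4f}")
```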