Let X1, X2,...,Xn denote a random sample from a distribution that is N(0, θ).
a) Show that Y = ∑_{i=1}^n Xi^2 is a complete sufficient statistic for θ. (solved)
b) Find the UMVUE of θ2. (need help with this one)
Note: In particular, I am having trouble finding out what distribution Y = ∑ Xi^2 has. The professor advised us to find the second moment of Y from its moment generating function, but I am not sure how to do that. Please show your work along with an explanation.
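In case it helps with part (b): since each Xi ~ N(0, θ) with θ the variance, Xi^2/θ ~ χ^2(1), hence Y/θ ~ χ^2(n), so E[Y] = nθ and E[Y^2] = θ^2·n(n+2). A quick NumPy simulation (parameter values are just illustrative) checking these moments and the candidate UMVUE Y^2/(n(n+2)) for θ^2:

```python
import numpy as np

rng = np.random.default_rng(0)
n, theta = 5, 2.0                  # theta is the VARIANCE of each Xi (illustrative values)
reps = 200_000

X = rng.normal(0.0, np.sqrt(theta), size=(reps, n))
Y = (X ** 2).sum(axis=1)           # Y / theta ~ chi-square(n)

# chi-square moments: E[W] = n, E[W^2] = n(n + 2) for W ~ chi2(n)
print(Y.mean())                    # ~ n * theta
print((Y ** 2).mean())             # ~ theta^2 * n * (n + 2)

# Y is complete sufficient, so an unbiased function of Y is the UMVUE
umvue = Y ** 2 / (n * (n + 2))
print(umvue.mean())                # ~ theta^2
```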
Solution outline: use the sampling distribution of Y together with its moment generating function to obtain the UMVUE of θ^2.
Let X1, X2, ..., Xn denote a random sample of size n > 1 from a distribution with pdf f(x; θ) = θ^2·x·e^(−θx), x > 0 and θ > 0. a. Find the MLE for θ. b. Is the MLE unbiased? Show your steps. c. Find a complete sufficient statistic for θ. d. Find the UMVUE for θ. Make sure you indicate how you know it is the UMVUE.
Let X1, ..., Xn be a sample from a U(0, θ) distribution, where θ > 0 is a constant parameter. a) Find the density function of X(n), the largest order statistic of X1, ..., Xn. b) Find the mean and variance of X(n). c) Show that Yn = sqrt(n)·(θ − X(n)) converges to 0 in probability. d) What is the distribution of n(θ − X(n))?
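A rough simulation sketch of what the answers should look like (the parameter values are illustrative, not from the problem): the standard order-statistic formulas give E[X(n)] = nθ/(n+1) and Var[X(n)] = nθ^2/((n+1)^2(n+2)), and for (d) the limit of n(θ − X(n)) is exponential with mean θ:

```python
import numpy as np

rng = np.random.default_rng(1)
n, theta = 100, 3.0                # illustrative values
reps = 50_000

Xmax = rng.uniform(0.0, theta, size=(reps, n)).max(axis=1)

print(Xmax.mean())                 # ~ n*theta/(n+1)
print(Xmax.var())                  # ~ n*theta^2 / ((n+1)^2 * (n+2))

T = n * (theta - Xmax)             # d) converges to Exponential with mean theta
print(T.mean(), T.std())           # both ~ theta for large n
```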
Let X1, ..., Xn be iid from Uniform(−θ, θ), where θ > 0. Let X(1) < X(2) < ... < X(n) denote the order statistics. (a) Find a minimal sufficient statistic for θ. (d) Find the UMVUE for θ. (e) Find the UMVUE for τ(θ) = P(X1 > k).
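For part (d), a sketch: with |Xi| ~ U(0, θ), M = max|Xi| is complete sufficient, and (n+1)M/n is unbiased for θ, which makes it the UMVUE by Lehmann–Scheffé. A small simulation (illustrative parameter values) to sanity-check the unbiasedness:

```python
import numpy as np

rng = np.random.default_rng(2)
n, theta = 8, 2.5                  # illustrative values
reps = 300_000

X = rng.uniform(-theta, theta, size=(reps, n))
M = np.abs(X).max(axis=1)          # max|Xi|, with |Xi| ~ U(0, theta); complete sufficient

umvue = (n + 1) / n * M            # E[M] = n*theta/(n+1), so this is unbiased for theta
print(umvue.mean())                # ~ theta
```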
Suppose that X1, X2, ..., Xn is an iid sample from f_X(x | θ), where θ ∈ Θ. In each case below, find (i) the method of moments estimator of θ, (ii) the maximum likelihood estimator of θ, and (iii) the uniformly minimum variance unbiased estimator (UMVUE) of τ(θ). ... f_X(x | θ) = (1/(2θ))·1(0 < x < 2θ), Θ = {θ : θ > 0}, τ(θ) arbitrary and differentiable ... (d) n = 1 (sample size of n = 1 only). In part (d), comment on whether the UMVUE...
Let X1, ..., Xn be a sample taken from the Gamma distribution Γ(2, θ^(−1)) with pdf f(x; θ) = θ^2·x·exp(−θx) if x ≥ 0, θ ∈ (0, ∞), and 0 otherwise. (A) Show that Y = ∑_{i=1}^n Xi is a complete and sufficient statistic. (B) Find E(1/Y). Hint: if W ∼ χ^2(k) then E(W^m) = 2^m·Γ(k/2 + m)/Γ(k/2) for m > −k/2. Note also that Γ(n) = (n − 1)!, n ∈ N*. Facts from 1(C) are useful:...
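For (B), applying the hint with k = 4n and m = −1 (using 2θY ~ χ^2(4n)) gives E(1/Y) = θ/(2n − 1). A simulation check with illustrative parameter values:

```python
import numpy as np

rng = np.random.default_rng(3)
n, theta = 4, 1.5                  # illustrative values
reps = 500_000

# Gamma(2, theta^-1) means shape 2 and rate theta; numpy takes scale = 1/rate
X = rng.gamma(shape=2.0, scale=1.0 / theta, size=(reps, n))
Y = X.sum(axis=1)                  # Y ~ Gamma(2n, rate theta), i.e. 2*theta*Y ~ chi2(4n)

# Hint with m = -1: E[W^-1] = Gamma(2n - 1) / (2 * Gamma(2n)) = 1 / (2(2n - 1)),
# and 1/Y = 2*theta/W, so E[1/Y] = theta / (2n - 1)
print((1.0 / Y).mean())            # ~ theta / (2n - 1)
```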
Let X1, ..., Xn be n iid random variables with distribution N(θ, θ) for some unknown θ > 0. In the last homework, you computed the maximum likelihood estimator θ̂ of θ in terms of the sample averages of the linear and quadratic terms, i.e., (1/n)∑Xi and (1/n)∑Xi^2, and applied the CLT and the delta method to find its asymptotic variance. In this problem, you will compute the asymptotic variance of θ̂ via the Fisher information....
Let X1, X2, ..., Xn be a random sample of size n from the exponential distribution whose pdf is f(x; θ) = (1/θ)e^(−x/θ), 0 < x < ∞, 0 < θ < ∞. Find the MVUE for θ. Let X1, X2, ..., Xn be a random sample of size n from the exponential distribution whose pdf is f(x; θ) = θe^(−θx), 0 < x < ∞, 0 < θ < ∞. Find the MVUE for θ.
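A sanity-check sketch for both parameterizations. The estimators below (X̄ for the mean form, (n−1)/∑Xi for the rate form) are the standard answers; the parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
n, theta = 6, 2.0                  # illustrative values
reps = 300_000

# Parameterization f(x) = (1/theta) e^(-x/theta): theta is the mean,
# and the sample mean is the MVUE of theta.
X = rng.exponential(scale=theta, size=(reps, n))
mvue_mean = X.mean(axis=1)
print(mvue_mean.mean())            # ~ theta

# Parameterization f(x) = theta e^(-theta*x): theta is the rate, and
# (n - 1) / sum(Xi) is unbiased for theta since sum(Xi) ~ Gamma(n, rate theta).
X2 = rng.exponential(scale=1.0 / theta, size=(reps, n))
mvue_rate = (n - 1) / X2.sum(axis=1)
print(mvue_rate.mean())            # ~ theta
```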
[4] (15 pts) Let X1, ..., Xn (n > 2) be a random sample from a Poisson distribution with unknown mean θ > 0. Find the UMVUE of η = P(X1 > 1) = 1 − ... (5) (30 pts; 15 pts each) (a) Let X1, ..., Xn be a random sample from a Pareto distribution, Pareto(α, 1), with pdf f(x; α) = α·x^(−(α+1))·I_(1,∞)(x), where α > 0 is unknown. Find the UMVUE of η = P_α(X1 > c) = ...
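For the Poisson part, a sketch under an assumption: the truncated target reads like the common version η = P(X1 ≥ 1) = 1 − e^(−θ), whose UMVUE is 1 − ((n−1)/n)^Y with Y = ∑Xi (this follows from E[a^Y] = exp(nθ(a − 1)) for Y ~ Poisson(nθ)). An illustrative simulation check:

```python
import numpy as np

rng = np.random.default_rng(5)
n, theta = 7, 1.2                  # illustrative values
reps = 400_000

X = rng.poisson(theta, size=(reps, n))
Y = X.sum(axis=1)                  # Y ~ Poisson(n*theta), complete sufficient

# With a = (n-1)/n, E[a^Y] = exp(n*theta*(a - 1)) = exp(-theta), so the
# estimator below is unbiased for 1 - exp(-theta), hence the UMVUE.
umvue = 1 - ((n - 1) / n) ** Y
print(umvue.mean())                # ~ 1 - exp(-theta)
```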
Let X1, X2, ..., Xn be iid with pdf f(x|θ) = θ·x^(θ−1), 0 < x < 1. a) Find the Maximum Likelihood Estimator of θ, and b) show that its variance converges to 0 as n approaches infinity. I have no problem with part a, finding the MLE of θ. However, I'm having some trouble finding the variance. The professor walked us through part b generally, but I need help with the univariate transformation for Y = ∑(−ln Xi) (see picture below; the professor used Y = ∑(−ln Xi), and...
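On that transformation: each −ln Xi is Exponential with rate θ, so Y = ∑(−ln Xi) ~ Gamma(n, rate θ) and the MLE is n/Y; its variance works out to n^2·θ^2/((n−1)^2·(n−2)), which goes to 0 as n grows. A quick check with illustrative values:

```python
import numpy as np

rng = np.random.default_rng(6)
theta = 2.0                        # illustrative value
reps = 5_000

variances = {}
for n in (10, 100, 1000):
    # Inverse-CDF sampling for f(x) = theta*x^(theta-1) on (0,1): X = U^(1/theta)
    U = rng.uniform(size=(reps, n))
    X = U ** (1.0 / theta)
    Y = (-np.log(X)).sum(axis=1)   # -ln(Xi) ~ Exponential(rate theta), Y ~ Gamma(n, rate theta)
    mle = n / Y                    # solving the score equation gives theta_hat = n / Y
    variances[n] = mle.var()
    print(n, variances[n])         # shrinks roughly like theta^2 / n
```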
Let X1, X2, ..., Xn denote a random sample from the Rayleigh distribution given by f(x) = (2x/θ)·exp(−x^2/θ), x > 0; 0, elsewhere, with unknown parameter θ > 0. (A) Find the maximum likelihood estimator θ̂ of θ. (B) If we observe the values x1 = 0.5, x2 = 1.3, and x3 = 1.7, find the maximum likelihood estimate of θ.
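For reference, setting the derivative of the log-likelihood to zero gives θ̂ = ∑xi^2 / n, so with the three observed values the estimate is about 1.61:

```python
import numpy as np

# Maximizing log L(theta) = sum(log(2*x_i)) - n*log(theta) - sum(x_i^2)/theta
# gives theta_hat = sum(x_i^2) / n.
x = np.array([0.5, 1.3, 1.7])
theta_hat = (x ** 2).sum() / len(x)
print(theta_hat)                   # (0.25 + 1.69 + 2.89) / 3 ~ 1.61
```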