With this in mind:
(i) Probability model: {f(x; μ) = μ exp{−xμ}, μ > 0, x > 0}.
(ii) Sampling model: {X1,X2, . . . ,Xn} is a random sample.
Show that no unbiased estimator of μ exists. (Argue by contradiction: suppose
that there is an unbiased estimator of μ, then deduce a contradiction.)
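Before tackling the proof, a quick Monte Carlo sketch (my own illustration, not part of the problem) of why the "obvious" estimator of the rate μ misses the mark: the MLE 1/X̄ systematically overshoots, since gamma-distribution algebra gives E[1/X̄] = nμ/(n − 1).

```python
import numpy as np

rng = np.random.default_rng(0)
mu, n, reps = 2.0, 5, 200_000  # assumed example values

# Rate mu corresponds to scale 1/mu in numpy's parameterization.
X = rng.exponential(scale=1.0 / mu, size=(reps, n))
mle = 1.0 / X.mean(axis=1)  # MLE of the rate from each sample

# E[1/X-bar] = n*mu/(n-1): the estimator is biased upward.
print(mle.mean(), n * mu / (n - 1))  # both close to 2.5, not 2.0
```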
Let X be a random variable with cdf F_X(x; θ), expected value E[X] = μ, and variance V[X] = σ². Let X1, X2, ..., Xn be an i.i.d. sample drawn according to F_X(x; θ), where F_X(x; θ) = x/θ for all x ∈ (0, θ). Let max(X1, X2, ..., Xn) be an estimator of θ, suggested by pure common sense. Remember that if Y = max(X1, X2, ..., Xn), then it can be shown that the cdf F_Y(y) of Y is given by F_Y(y) = (F_X(y))^n, where...
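The max-of-sample cdf formula can be checked numerically. A small sketch, assuming the Uniform(0, θ) model suggested by the cdf x/θ on (0, θ): the simulation compares the empirical cdf of Y = max(X1, ..., Xn) with (F_X(y))^n at one point.

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 1.0, 5, 200_000  # assumed example values

# Each row is a sample of size n from Uniform(0, theta); Y is its max.
Y = rng.uniform(0.0, theta, size=(reps, n)).max(axis=1)

# F_Y(y) = P(all n draws <= y) = (F_X(y))**n = (y/theta)**n.
y = 0.8
emp = (Y <= y).mean()
exact = (y / theta) ** n
print(emp, exact)  # both close to 0.8**5 = 0.32768
```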
1. (40) Suppose that X1, X2, ..., Xn forms an independent and identically distributed sample from a normal distribution with mean μ and variance σ², both unknown. (a) Derive the sample variance, S², for this random sample. (b) Derive the maximum likelihood estimators (MLEs) of μ and σ², denoted μ̂ and σ̂², respectively. (c) Find the MLE of μ³. (d) Derive the method of moments estimators of μ and σ², denoted μ̂_MOM and σ̂²_MOM, respectively. (e) Show that μ̂ and...
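For parts (a)–(c), the key numerical facts are easy to sanity-check: the MLE σ̂² divides by n while S² divides by n − 1, and by the invariance property the MLE of μ³ is just μ̂³. A sketch with simulated data (my own example values):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(loc=3.0, scale=2.0, size=50)  # assumed mu=3, sigma=2
n = x.size

mu_hat = x.mean()                          # MLE of mu: the sample mean
sigma2_hat = ((x - mu_hat) ** 2).mean()    # MLE of sigma^2: divide by n
s2 = ((x - mu_hat) ** 2).sum() / (n - 1)   # sample variance S^2: divide by n-1
mu3_hat = mu_hat ** 3                      # MLE of mu^3 by invariance

print(sigma2_hat, s2)  # sigma2_hat = (n-1)/n * s2, so sigma2_hat < s2
```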
Let X1, X2, ..., Xn be a random sample of size n from a population that can be modeled by the following probability model: f_X(x) = a x^(a−1)/θ^a, 0 < x < θ, a > 0. a) Find the probability density function of X(n) = max(X1, X2, ..., Xn). b) Is X(n) an unbiased estimator for θ? If not, suggest a function of X(n) that is an unbiased estimator for θ.
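A numerical check of part (b): under this density, E[X(n)] = naθ/(na + 1) (from the pdf of the max), so X(n) is biased low and ((na + 1)/(na)) X(n) removes the bias. A sketch with assumed example values, sampling via the inverse cdf F(x) = (x/θ)^a:

```python
import numpy as np

rng = np.random.default_rng(3)
theta, a, n, reps = 2.0, 3.0, 4, 200_000  # assumed example values

# Inverse-cdf sampling: F(x) = (x/theta)**a, so X = theta * U**(1/a).
U = rng.uniform(size=(reps, n))
Xn = (theta * U ** (1.0 / a)).max(axis=1)

# E[X_(n)] = n*a*theta/(n*a + 1) < theta; rescaling removes the bias.
corrected = (n * a + 1) / (n * a) * Xn
print(Xn.mean(), n * a * theta / (n * a + 1))  # ~1.846
print(corrected.mean(), theta)                 # ~2.0
```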
Suppose X1, X2, ..., Xn (n ≥ 5) are i.i.d. Exp(µ) with the density f(x) = (1/µ) e^(−x/µ) for x > 0. (a) Let µ̂1 = X̄. Show µ̂1 is a minimum variance unbiased estimator. (b) Let µ̂2 = (X1 + X2)/2. Show µ̂2 is unbiased. Calculate Var(µ̂2). Confirm Var(µ̂1) < Var(µ̂2). Calculate the efficiency of µ̂2 relative to µ̂1. (c) Show X̄ is consistent and sufficient. (d) Show µ̂2 is not consistent...
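For part (b), the variance comparison is easy to verify by simulation: Var(µ̂1) = µ²/n and Var(µ̂2) = µ²/2, so the relative efficiency of µ̂2 is 2/n. A sketch with assumed values µ = 1.5, n = 10:

```python
import numpy as np

rng = np.random.default_rng(4)
mu, n, reps = 1.5, 10, 200_000  # assumed example values

X = rng.exponential(scale=mu, size=(reps, n))
mu1 = X.mean(axis=1)         # uses all n observations
mu2 = X[:, :2].mean(axis=1)  # uses only the first two

# Both unbiased, but mu1 has the smaller variance:
# Var(mu1) = mu^2/n, Var(mu2) = mu^2/2, efficiency = 2/n.
print(mu1.var(), mu ** 2 / n)  # ~0.225
print(mu2.var(), mu ** 2 / 2)  # ~1.125
```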
Suppose X1, X2, ..., Xn are i.i.d. Exp(µ) with the density f(x) = (1/µ) e^(−x/µ) for x > 0. (a) Use the method of moments to find estimators for µ and µ². (b) What is the log likelihood as a function of µ after observing X1 = x1, ..., Xn = xn? (c) Find the MLEs for µ and µ². Are they the same as those you found in part (a)? (d) According to the Central Limit...
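For parts (a) and (c), both approaches give X̄ for µ; for µ² the MLE is X̄² by invariance, while the second-moment MoM estimator uses E[X²] = 2µ². A sketch comparing them on simulated data (assumed true value µ = 2):

```python
import numpy as np

rng = np.random.default_rng(5)
mu = 2.0  # assumed example value
x = rng.exponential(scale=mu, size=1000)

mle_mu = x.mean()               # MLE of mu; also the first-moment MoM estimator
mle_mu2 = mle_mu ** 2           # MLE of mu^2 by invariance
mom_mu2 = (x ** 2).mean() / 2   # MoM for mu^2 from E[X^2] = 2*mu^2

print(mle_mu, mle_mu2, mom_mu2)  # the two mu^2 estimates generally differ
```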
Problem III. (12 points) Consider the following probability distribution:
x: 0, …, …, 6
P(X = x): 1/4, 1/4, 1/4, 1/4
1. (2 points) Find E(X). 2. (5 points) Find the sampling distribution of the sample mean X̄ for samples of size n = 2. 3. (5 points) Suppose we draw n random samples (X1, ..., Xn), and an estimator θ̂(X1, ..., Xn) is proposed as θ̂(X1, ..., Xn) = (1/n) Σ_{i=1}^n X_i I(X_i ≠ 0 and X_i ≠ 6), where...
Let X1, X2, ..., Xn be a random sample with probability density function f(x; θ) = ... for 0 < x < θ. a) Find the expected value of X. b) Find the method of moments estimator θ̃. c) Is θ̃ unbiased for θ? Explain. d) Is θ̃ consistent for θ? Explain. e) Find the limiting distribution of √n(θ̃ − θ). (Need only c, d, and e.)
5. Let X ∼ Exp(λ) with λ unknown, and suppose X1, X2 is a random sample of size 2. Show that M = sqrt( X1 · X2 ) is a biased estimator of 1/λ and modify it to create an unbiased estimator. (Hint: During your journey, you’ll need the help of the gamma distribution, the gamma function, and the knowledge that Γ(1/2) = √ π.)
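The bias factor in problem 5 can be checked numerically: E[√X] = Γ(3/2)/√λ = √π/(2√λ) for X ∼ Exp(λ), so by independence E[M] = π/(4λ), and (4/π)M is unbiased for 1/λ. A Monte Carlo sketch with an assumed λ = 2:

```python
import numpy as np

rng = np.random.default_rng(6)
lam, reps = 2.0, 400_000  # assumed example values

# numpy uses scale = 1/lambda for the exponential.
X1 = rng.exponential(scale=1.0 / lam, size=reps)
X2 = rng.exponential(scale=1.0 / lam, size=reps)
M = np.sqrt(X1 * X2)

# E[M] = (Gamma(3/2)/sqrt(lam))**2 = pi/(4*lam), not 1/lam: M is biased.
print(M.mean(), np.pi / (4 * lam))      # ~0.3927
print((4 / np.pi * M).mean(), 1 / lam)  # corrected ~0.5
```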