Let X1, ..., Xn be iid from Uniform(−θ, θ), where θ > 0. Let X(1) < X(2) < ... < X(n) denote the order statistics.
(a) Find a minimal sufficient statistic for θ.
(d) Find the UMVUE for θ.
(e) Find the UMVUE for τ(θ) = P(X1 > k).
To find a minimal sufficient statistic for θ, start from the definitions. A statistic T(X) is sufficient for θ if the conditional distribution of the sample X given T(X) does not depend on θ; it is minimal sufficient if, in addition, it is a function of every other sufficient statistic.
(a) Minimal sufficient statistic for θ: The joint probability density function (PDF) of a sample X1, X2, ..., Xn from Uniform(−θ, θ) is
f(x1, x2, ..., xn | θ) = (1 / (2θ)^n) · 1{−θ ≤ xi ≤ θ for all i}.
The key step is to rewrite the support condition "−θ ≤ xi ≤ θ for all i" in terms of a single quantity: it holds if and only if max_i |xi| ≤ θ. Hence
f(x1, x2, ..., xn | θ) = (1 / (2θ)^n) · 1{max_i |xi| ≤ θ}.
By the Factorization Theorem, T(X) is sufficient if the joint PDF factors as f(x | θ) = g(T(x), θ) · h(x), where g depends on the data only through T(x) and h does not depend on θ. Taking
T(x) = max_i |xi| = max(x(n), −x(1)), g(t, θ) = (2θ)^(−n) · 1{t ≤ θ}, h(x) = 1,
shows that T(X) = max_i |Xi| is sufficient for θ.
For minimality, use the likelihood-ratio criterion: T is minimal sufficient if the ratio f(x | θ) / f(y | θ) is constant in θ exactly when T(x) = T(y). Here the ratio equals 1{max_i |xi| ≤ θ} / 1{max_i |yi| ≤ θ}, which is free of θ precisely when max_i |xi| = max_i |yi|. Therefore T(X) = max_i |Xi| = max(X(n), −X(1)) is a minimal sufficient statistic for θ.
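For the Uniform(−θ, θ) model, the statistic T = max_i |Xi| = max(X(n), −X(1)) is easy to compute from a sample. A minimal sketch in Python (the value θ = 2, the sample size, and the seed are illustrative choices, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0                        # illustrative parameter value
x = rng.uniform(-theta, theta, size=10)

# T = max_i |X_i|, the largest observation in absolute value
T = np.max(np.abs(x))

# Equivalent form via the extreme order statistics: max(X_(n), -X_(1))
assert np.isclose(T, max(x.max(), -x.min()))
print(T)                           # always satisfies T <= theta
```

Every sample with the same value of T produces the same likelihood function in θ, which is exactly what minimal sufficiency captures here.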
(d) UMVUE for θ: Since each |Xi| ~ Uniform(0, θ), the statistic T = max_i |Xi| is distributed as the maximum of n iid Uniform(0, θ) variables, with density
f_T(t) = n t^(n−1) / θ^n, for 0 < t < θ.
Hence E[T] = ∫_0^θ t · n t^(n−1) / θ^n dt = nθ/(n+1), so T systematically underestimates θ, and the bias-corrected estimator
θ̂ = ((n+1)/n) · max_i |Xi|
is unbiased for θ. Moreover, T is complete (the standard argument for the maximum of a Uniform(0, θ) sample applies: if E_θ[g(T)] = 0 for all θ > 0, differentiating ∫_0^θ g(t) n t^(n−1) dt = 0 in θ forces g = 0 a.e.). By the Lehmann–Scheffé theorem, an unbiased estimator that is a function of a complete sufficient statistic is the UMVUE, so θ̂ = ((n+1)/n) · max_i |Xi| is the UMVUE for θ.
Note that the sample mean X̄ is of no use here: its expectation is 0, the mean of Uniform(−θ, θ), not θ, so it is not even unbiased for θ.
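The bias calculation can be sketched with a quick Monte Carlo check (θ = 3, n = 5, and the replication count are illustrative choices): T = max|Xi| should average about nθ/(n+1), while the corrected estimator ((n+1)/n)T should average about θ.

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 3.0, 5, 200_000   # illustrative values

x = rng.uniform(-theta, theta, size=(reps, n))
T = np.abs(x).max(axis=1)          # T = max_i |X_i| in each replication

corrected = (n + 1) / n * T        # candidate UMVUE of theta
print(T.mean())                    # close to n*theta/(n+1) = 2.5
print(corrected.mean())            # close to theta = 3.0
```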
(e) UMVUE for τ(θ) = P(X1 > k): Assume 0 ≤ k < θ (for k ≥ θ the probability is 0, and negative k is handled symmetrically). Then
τ(θ) = P(X1 > k) = (θ − k) / (2θ).
The indicator I(X1 > k), which equals 1 when X1 > k and 0 otherwise, is an unbiased estimator of τ(θ), since E[I(X1 > k)] = P(X1 > k). Rao–Blackwellizing it with respect to the complete sufficient statistic T = max_i |Xi| from parts (a) and (d) gives, by Lehmann–Scheffé, the UMVUE
δ(T) = E[I(X1 > k) | T].
To compute this conditional expectation, note that given T = t: by exchangeability, X1 is the observation achieving |X1| = t with probability 1/n, and in that case X1 = t or X1 = −t with probability 1/2 each by symmetry; otherwise (probability (n−1)/n), X1 | T = t ~ Uniform(−t, t). For t > k ≥ 0 this gives
δ(t) = (1/n)(1/2) + ((n−1)/n) · (t − k)/(2t),
and δ(t) = 0 for t ≤ k (then X1 ≤ t ≤ k). A direct integration against f_T(t) = n t^(n−1)/θ^n confirms E[δ(T)] = (θ − k)/(2θ). Therefore the UMVUE for τ(θ) = P(X1 > k) is
δ(T) = 1{T > k} · [ 1/(2n) + ((n−1)/n) · (T − k)/(2T) ].
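The conditional-expectation formula can be checked by simulation. A sketch with illustrative values θ = 3, n = 5, k = 1 (so 0 ≤ k < θ): the average of δ(T) across replications should match τ(θ) = (θ − k)/(2θ).

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, k, reps = 3.0, 5, 1.0, 200_000   # illustrative values, 0 <= k < theta

x = rng.uniform(-theta, theta, size=(reps, n))
T = np.abs(x).max(axis=1)                  # T > 0 almost surely

# delta(T) = 1{T > k} * [1/(2n) + ((n-1)/n) * (T - k)/(2T)]
delta = np.where(T > k, 1 / (2 * n) + (n - 1) / n * (T - k) / (2 * T), 0.0)

tau = (theta - k) / (2 * theta)            # true P(X1 > k) = 1/3 here
print(delta.mean(), tau)                   # agree up to Monte Carlo error
```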