6.1.12 Suppose that (x1,..., xn) is a sample from a Geometric(θ) distribution, where θ ∈ [0, 1] is unknown. Determine the likelihood function and a minimal sufficient statistic for this model. (Hint: Use the factorization theorem and maximize the logarithm of the likelihood.)
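A sketch of the likelihood, assuming the convention that Geometric(θ) counts failures before the first success (support x ∈ {0, 1, 2, ...}):

```latex
L(\theta) = \prod_{i=1}^{n} \theta(1-\theta)^{x_i}
          = \theta^{n}(1-\theta)^{\sum_{i=1}^{n} x_i},
\qquad
\ell(\theta) = n\log\theta + \Bigl(\sum_{i=1}^{n} x_i\Bigr)\log(1-\theta).
```

Since L(θ) depends on the data only through T = Σ xi, the factorization theorem gives T as sufficient; the ratio L(θ; x)/L(θ; y) is free of θ exactly when Σxi = Σyi, so T is minimal sufficient. (Under the other convention, support {1, 2, ...}, the exponent becomes Σxi − n and the same statistic T results.)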
Let X1, ..., Xn be a sample from a U(0, θ) distribution, where θ > 0 is a constant parameter. (a) Find the density function of X(n), the largest order statistic of X1, ..., Xn. (b) Find the mean and variance of X(n). (c) Show that Yn = √n (θ − X(n)) converges to 0 in probability. (d) What is the distribution of n(θ − X(n))?
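The standard computations for parts (a), (b), and (d) run roughly as follows:

```latex
P\bigl(X_{(n)} \le x\bigr) = \Bigl(\frac{x}{\theta}\Bigr)^{n}
\;\Rightarrow\;
f_{X_{(n)}}(x) = \frac{n x^{n-1}}{\theta^{n}}, \quad 0 < x < \theta;
\qquad
E[X_{(n)}] = \frac{n\theta}{n+1}, \quad
\operatorname{Var}\bigl(X_{(n)}\bigr) = \frac{n\theta^{2}}{(n+1)^{2}(n+2)};
\qquad
P\bigl(n(\theta - X_{(n)}) \le t\bigr)
 = 1 - \Bigl(1 - \frac{t}{n\theta}\Bigr)^{n}
 \;\longrightarrow\; 1 - e^{-t/\theta}.
```

So n(θ − X(n)) converges in distribution to an Exponential distribution with mean θ, which also makes the convergence in (c) plausible: θ − X(n) shrinks at rate 1/n, faster than the 1/√n that Yn would need.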
2. (a) Suppose that x1, ..., xn are a random sample from a gamma distribution with shape parameter α and rate parameter λ, where α > 0 and λ > 0. Let θ = (α, λ). Determine the log-likelihood ℓ(θ) and a 2-dimensional sufficient statistic for the data. (b) Suppose that x1, ..., xn are a random sample from a U(−θ, θ) distribution, with f(x; θ) = 1/(2θ) for −θ < x < θ and f(x; θ) = 0 otherwise. Here θ > 0. Determine the likelihood L(θ) and a one-dimensional sufficient statistic. Note that the likelihood should...
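For part (a), writing the Gamma(α, λ) density as f(x) = λ^α x^(α−1) e^(−λx) / Γ(α), a sketch of the log-likelihood:

```latex
\ell(\alpha,\lambda)
 = n\alpha\log\lambda - n\log\Gamma(\alpha)
 + (\alpha-1)\sum_{i=1}^{n}\log x_i - \lambda\sum_{i=1}^{n} x_i,
```

so (Σ log xi, Σ xi) is a two-dimensional sufficient statistic by factorization. For part (b), under the U(−θ, θ) reading of the garbled source, L(θ) = (2θ)^(−n) 1{max_i |x_i| < θ}, making max_i |x_i| a one-dimensional sufficient statistic.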
Let X1, X2, ..., Xn be a random sample from a probability density function f(x; θ). (a) Use the factorization theorem to show that X(1) = min(X1, ..., Xn) is sufficient for θ. (b) Is X(1) minimal sufficient for θ?
Let X1, ..., Xn be a random sample from a population with density f(x; θ) = 2x/θ² if 0 < x < θ, and 0 otherwise. (a) Find the maximum likelihood estimator (MLE) of θ. (b) Find a sufficient statistic for θ. (c) Is the above MLE a minimal sufficient statistic? Explain fully.
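If the density reconstructs as f(x; θ) = 2x/θ² on 0 < x < θ (my reading of the garbled source), a sketch of the likelihood:

```latex
L(\theta) = \prod_{i=1}^{n} \frac{2x_i}{\theta^{2}}\,
            \mathbf{1}\{0 < x_i < \theta\}
          = \frac{2^{n}\prod_{i} x_i}{\theta^{2n}}\,
            \mathbf{1}\{x_{(n)} \le \theta\},
```

which is decreasing in θ on [x(n), ∞), so the MLE is θ̂ = X(n). The factorization above also exhibits X(n) as sufficient, and since the likelihood ratio for two samples is free of θ exactly when their maxima agree, X(n) is minimal sufficient.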
Let (X1, ..., Xn) be a sample from U[0, θ], where θ ∈ (0, 1) is unknown and θ has a prior distribution U[0, 1].
Suppose that x = (x1, ..., xn) is a sample from a gamma distribution. Determine a minimal sufficient statistic for this model.
Suppose X1, X2, ..., Xn are a random sample from a Uniform(0, θ) distribution, where θ > 0. Consider two different estimators of θ: R1 = 2X̄ and R2 = ((n + 1)/n) max(X1, ..., Xn). (a) For each of the estimators R1 and R2, assess whether it is an unbiased estimator of θ. (b) Compute the variances of R1 and R2. Under what conditions will R2 have a smaller variance than R1?
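A sketch of the comparison, using E[X(n)] = nθ/(n+1) and Var(X(n)) = nθ²/((n+1)²(n+2)) for the Uniform(0, θ) maximum:

```latex
E[R_1] = 2E[\bar X] = 2\cdot\frac{\theta}{2} = \theta, \qquad
E[R_2] = \frac{n+1}{n}\cdot\frac{n\theta}{n+1} = \theta;
\qquad
\operatorname{Var}(R_1) = \frac{4}{n}\cdot\frac{\theta^{2}}{12} = \frac{\theta^{2}}{3n}, \qquad
\operatorname{Var}(R_2) = \Bigl(\frac{n+1}{n}\Bigr)^{2}\frac{n\theta^{2}}{(n+1)^{2}(n+2)} = \frac{\theta^{2}}{n(n+2)}.
```

Both estimators are unbiased, and Var(R2) < Var(R1) exactly when n + 2 > 3, i.e., for every n > 1 (the two variances coincide at n = 1).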
1. Let X1, ..., Xn be a random sample from a distribution with p.d.f. f(x; θ) = θx^(θ−1), 0 < x < 1, where θ > 0. (a) Find a sufficient statistic Y for θ. (b) Show that the maximum likelihood estimator θ̂ is a function of Y. (c) Determine the Rao-Cramér lower bound for the variance of unbiased estimators of θ.
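For part (c), a sketch of the Fisher information calculation:

```latex
\log f(x;\theta) = \log\theta + (\theta-1)\log x, \qquad
\frac{\partial^{2}}{\partial\theta^{2}}\log f(x;\theta) = -\frac{1}{\theta^{2}},
\qquad
I(\theta) = \frac{1}{\theta^{2}},
```

so the Rao-Cramér lower bound is 1/(nI(θ)) = θ²/n. For (a) and (b), the factorization theorem gives Y = Σ log Xi (equivalently Π Xi) as sufficient, and the MLE θ̂ = −n/Σ log Xi is a function of Y.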
Exercise 7.5: Suppose X1, X2, ..., Xn are a random sample from the uniform distribution U(θ − 1/2, θ + 1/2), where θ ∈ (−∞, ∞). (a) Show that the smallest and largest of X1, ..., Xn are jointly sufficient for θ. (b) If p(θ) = constant, θ ∈ (−∞, ∞), is the (improper) prior distribution of θ, find its posterior distribution.
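Assuming the U(θ − 1/2, θ + 1/2) reading of the garbled interval, a sketch for part (b): with a flat improper prior the posterior is proportional to the likelihood,

```latex
\pi(\theta \mid x) \;\propto\;
\mathbf{1}\bigl\{x_{(n)} - \tfrac{1}{2} \le \theta \le x_{(1)} + \tfrac{1}{2}\bigr\},
```

i.e., the posterior is uniform on [x(n) − 1/2, x(1) + 1/2], which also shows why (X(1), X(n)) are jointly sufficient for θ.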
Suppose X1, X2, ..., Xn is an iid sample from a uniform distribution over (θ, θ + 1), where θ is unknown. (a) Find the method of moments estimator of θ. (b) Find the maximum likelihood estimator (MLE) of θ. (c) Is the MLE of θ a consistent estimator of θ? Explain.
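Assuming the support reconstructs as (θ, θ + 1), a sketch of parts (a) and (b):

```latex
E[X_1] = \theta + \tfrac{1}{2}
\;\Rightarrow\;
\hat\theta_{\mathrm{MM}} = \bar X - \tfrac{1}{2};
\qquad
L(\theta) = \mathbf{1}\bigl\{x_{(n)} - 1 \le \theta \le x_{(1)}\bigr\},
```

so every θ in [x(n) − 1, x(1)] maximizes the likelihood (the MLE is not unique; x(n) − 1 is a common choice). For (c): since X(1) → θ and X(n) → θ + 1 in probability, any MLE in that interval is consistent.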