Let X1, . . . , Xn ~ iid Gamma(α, θ), where α is known and interest is in the rate parameter θ. We choose the prior θ ~ Gamma(3, 1).
Find the posterior distribution of θ.
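A sketch of the conjugate update, assuming θ is the rate (each Xi has density proportional to θ^α x^{α−1} e^{−θx}) and that Gamma(3, 1) is in shape–rate form:

```latex
\pi(\theta \mid x) \;\propto\; \underbrace{\theta^{n\alpha} e^{-\theta \sum_i x_i}}_{\text{likelihood}} \times \underbrace{\theta^{2} e^{-\theta}}_{\text{Gamma}(3,1)\ \text{prior}}
\;=\; \theta^{(3 + n\alpha) - 1}\, e^{-\theta\left(1 + \sum_i x_i\right)},
```

so θ | x ~ Gamma(3 + nα, 1 + Σ xi) in shape–rate form.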
Let X1, . . . , Xn ∼ iid N(θ, σ^2), where σ^2 is known. We wish to estimate φ = θ^2. Find the MLE for φ and the UMVUE for φ. Then compare the bias and mean squared errors of the two estimators.
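A Monte Carlo sketch of the comparison, assuming the MLE is X̄² and the UMVUE is X̄² − σ²/n (the values of θ, σ, n, and the replication count below are illustrative choices):

```python
import numpy as np

# Compare the MLE  phi_hat = Xbar^2  with the UMVUE  phi_tilde = Xbar^2 - sigma^2/n
# for phi = theta^2, by simulating Xbar ~ N(theta, sigma^2/n) directly.
rng = np.random.default_rng(0)
theta, sigma, n, reps = 2.0, 1.0, 10, 200_000

xbar = rng.normal(theta, sigma / np.sqrt(n), size=reps)
mle = xbar ** 2
umvue = xbar ** 2 - sigma ** 2 / n

phi = theta ** 2
print("MLE   bias:", mle.mean() - phi)    # theory: E[Xbar^2] - theta^2 = sigma^2/n = 0.1
print("UMVUE bias:", umvue.mean() - phi)  # theory: 0
print("MLE   MSE :", ((mle - phi) ** 2).mean())
print("UMVUE MSE :", ((umvue - phi) ** 2).mean())
```

Here the UMVUE also has the smaller MSE: the two estimators differ by the constant σ²/n, so their variances are equal and the MLE's squared bias is pure excess.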
7.60 Let X1, . . . , Xn be iid gamma(α, β) with α known. Find the best unbiased estimator of 1/β.
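A sketch under the scale parameterization (density f(x) = x^{α−1} e^{−x/β} / (Γ(α) β^α), as in Casella and Berger): T = Σ Xi is complete sufficient with T ~ gamma(nα, β), so

```latex
E\!\left[\frac{1}{T}\right] \;=\; \int_0^\infty \frac{1}{t}\,\frac{t^{n\alpha-1} e^{-t/\beta}}{\Gamma(n\alpha)\,\beta^{n\alpha}}\,dt
\;=\; \frac{\Gamma(n\alpha-1)\,\beta^{n\alpha-1}}{\Gamma(n\alpha)\,\beta^{n\alpha}}
\;=\; \frac{1}{(n\alpha-1)\beta},
```

so (nα − 1)/T is unbiased for 1/β and, being a function of the complete sufficient statistic, is the best unbiased estimator (this requires nα > 1).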
Let X1, . . . , Xn be independent Poisson(θ) random variables with parameter θ > 0. (1) Find the Bayes estimator of θ for a Gamma(α, β) prior. (2) Find the MSE of the Bayes estimator.
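A sketch assuming the shape–rate parameterization of the Gamma prior: the posterior is Gamma(α + Σxi, β + n), so the Bayes estimator in (1) (the posterior mean under squared error loss) is δ = (α + Σxi)/(β + n), and for (2) the MSE works out to (nθ + (α − βθ)²)/(n + β)². The Monte Carlo check below uses illustrative values of α, β, n, θ:

```python
import numpy as np

# Check the closed-form MSE of the posterior-mean estimator
#   delta = (alpha + sum(x)) / (beta + n)
# against simulation at one value of theta.
rng = np.random.default_rng(1)
alpha, beta, n, theta, reps = 2.0, 3.0, 20, 1.5, 100_000

x = rng.poisson(theta, size=(reps, n))
delta = (alpha + x.sum(axis=1)) / (beta + n)

mse_mc = ((delta - theta) ** 2).mean()
mse_exact = (n * theta + (alpha - beta * theta) ** 2) / (n + beta) ** 2
print(mse_mc, mse_exact)  # the two values should be close
```

The closed form follows from bias (α − βθ)/(n + β) and variance nθ/(n + β)² of δ.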
Let X1, . . . , Xn be independent Beta(θ, 1) random variables with parameter θ > 0. (1) Find the Bayes estimator of θ for a Gamma(α, β) prior. (2) Find the MSE of the Bayes estimator.
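A sketch of the posterior for (1), assuming Gamma(α, β) is in shape–rate form. Each Xi has density θ x^{θ−1} on (0, 1), so the likelihood is θ^n exp{−θ Σi(−log xi)} · Πi xi^{−1}, hence

```latex
\pi(\theta \mid x) \;\propto\; \theta^{\alpha-1} e^{-\beta\theta} \cdot \theta^{n} e^{-\theta \sum_i (-\log x_i)}
\;\propto\; \theta^{(\alpha+n)-1}\, e^{-\theta\left(\beta - \sum_i \log x_i\right)},
```

i.e. θ | x ~ Gamma(α + n, β − Σ log xi), and the Bayes estimator under squared error loss is the posterior mean (α + n)/(β − Σ log xi). The MSE in (2) can then be computed from the distribution of −Σ log Xi ~ Gamma(n, θ).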
Let X1, ..., Xn be IID observations from Uniform(0, θ). T(X) = max(X1, . . . Xn) is a sufficient statistic (additionally, T is the MLE for θ). Find a (1 − α)-level confidence interval for θ. [Note: The support of this distribution changes depending on the value of θ, so we cannot use Fisher’s approximation for the MLE because not all of the regularity assumptions hold.]
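A sketch of the pivotal-quantity construction: T/θ has cdf u^n on (0, 1), so P(α^{1/n} ≤ T/θ ≤ 1) = 1 − α, giving the interval [T, T/α^{1/n}]. The Monte Carlo check below uses illustrative values of θ, n, and α:

```python
import numpy as np

# Verify the coverage of the interval [T, T / alpha**(1/n)], T = max(X_i),
# for Uniform(0, theta) data.
rng = np.random.default_rng(2)
theta, n, alpha, reps = 5.0, 10, 0.05, 100_000

t = rng.uniform(0, theta, size=(reps, n)).max(axis=1)
lower, upper = t, t / alpha ** (1 / n)
coverage = ((lower <= theta) & (theta <= upper)).mean()
print(coverage)  # should be close to 1 - alpha = 0.95
```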
Conditional on θ, the random variables X1, X2, . . . , Xn are iid from ... In turn, the parameter θ is best regarded as random with prior density aθ..., where a > 0 is known. (a) Find the posterior mean of θ. (b) Discuss how you would formulate the Bayesian test of ... versus ...
Let X1, · · · , Xn be iid from Uniform(−θ, θ), where θ > 0. Let X(1) < X(2) < ... < X(n) denote the order statistics. (a) Find a minimal sufficient statistic for θ. (d) Find the UMVUE for θ. (e) Find the UMVUE for τ(θ) = P(X1 > k).
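A sketch for part (d): M = max_i |Xi| is complete sufficient for Uniform(−θ, θ), and M/θ has cdf u^n on (0, 1), so E[M] = nθ/(n + 1) and the UMVUE is (n + 1)M/n. The Monte Carlo check below uses illustrative values of θ and n:

```python
import numpy as np

# Check unbiasedness of the candidate UMVUE  (n + 1)/n * max|X_i|
# for Uniform(-theta, theta) data.
rng = np.random.default_rng(3)
theta, n, reps = 2.0, 5, 200_000

m = np.abs(rng.uniform(-theta, theta, size=(reps, n))).max(axis=1)
umvue = (n + 1) / n * m
print(umvue.mean())  # should be close to theta = 2.0
```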
Suppose that X1, X2, . . . , Xn is an iid sample from a U(0, θ) distribution, where θ > 0. In turn, the parameter θ is best regarded as a random variable with a Pareto(a, b) distribution, that is,
π(θ) = b a^b / θ^{b+1} for θ ≥ a, and 0 otherwise,
where a > 0 and b > 0 are known. (a) Turn the "Bayesian crank" to find the posterior distribution of θ. (I would probably start by working with a sufficient statistic.) (b) Find the posterior mean and use this as...
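A sketch of part (a), writing t = max xi (sufficient for θ): the likelihood is θ^{−n} 1{θ ≥ t}, so

```latex
\pi(\theta \mid x) \;\propto\; \frac{b\,a^b}{\theta^{b+1}}\,\mathbf{1}\{\theta \ge a\} \cdot \theta^{-n}\,\mathbf{1}\{\theta \ge t\}
\;\propto\; \theta^{-(n+b+1)}\,\mathbf{1}\{\theta \ge \max(a, t)\},
```

i.e. θ | x ~ Pareto(max(a, t), b + n), so for part (b) the posterior mean is (b + n) max(a, t)/(b + n − 1).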
Let X1, . . . , Xn be a random sample following Gamma(2, β) for some unknown parameter β > 0. (i) Now let’s think like a Bayesian. Consider a prior distribution of β ∼ Gamma(a, b) for some a, b > 0. Derive the posterior distribution of β given (X1, . . . , Xn) = (x1,...,xn). (j) What is the posterior Bayes estimator of β assuming squared error loss?
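A sketch of (i) and (j), assuming β is the rate (each Xi has density ∝ β² x e^{−βx}) and that Gamma(a, b) is in shape–rate form:

```latex
\pi(\beta \mid x) \;\propto\; \beta^{a-1} e^{-b\beta} \cdot \beta^{2n} e^{-\beta \sum_i x_i}
\;\propto\; \beta^{(a+2n)-1}\, e^{-\beta\left(b + \sum_i x_i\right)},
```

so β | x ~ Gamma(a + 2n, b + Σ xi), and the posterior Bayes estimator under squared error loss is the posterior mean (a + 2n)/(b + Σ xi).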
Let X1, . . . , Xn be a random sample from gamma(α, β); assume α is known. Consider testing H0 : β = A for a given value A. Derive the score test for testing H0.
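A sketch under the rate parameterization, writing β0 for the hypothesized null value: the log-likelihood is ℓ(β) = nα log β − β Σ xi + const, so

```latex
U(\beta) = \frac{\partial \ell}{\partial \beta} = \frac{n\alpha}{\beta} - \sum_i x_i, \qquad
I(\beta) = -E\!\left[\frac{\partial^2 \ell}{\partial \beta^2}\right] = \frac{n\alpha}{\beta^2},
```

and the score statistic is Z = U(β0)/√I(β0) = (nα − β0 Σ xi)/√(nα); reject H0 when |Z| exceeds the appropriate standard normal critical value.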