Suppose that X1, X2, ..., Xn is an iid sample from a U(0, θ) distribution, where θ > 0. In turn, the parameter θ is best regarded as a random variable with a Pareto(a, b) distribution, that is, π(θ) = a b^a / θ^(a+1) for θ > b, and 0 otherwise, where a > 0 and b > 0 are known. (a) Turn the "Bayesian crank" to find the posterior distribution of θ. I would probably start by working with a sufficient statistic. (b) Find the posterior mean and use this as...
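For part (a), the likelihood θ^(-n) 1(θ > max_i x_i) times the Pareto(a, b) prior is proportional to θ^(-(n+a+1)) on θ > max(max_i x_i, b), i.e. the posterior is Pareto(a + n, max(max_i x_i, b)), with posterior mean (a + n)c/(a + n − 1) for c the posterior scale. A minimal sketch that computes these quantities and sanity-checks the mean by inverse-CDF sampling (the data values and hyperparameters below are made-up illustrations, not from the exercise):

```python
import random

def uniform_pareto_posterior(xs, a, b):
    """Posterior of theta for X_i | theta ~ U(0, theta) with a
    Pareto(a, b) prior: pi(theta | x) is proportional to
    theta^-(n+a+1) on theta > max(max_i x_i, b),
    i.e. Pareto(a + n, c) with c = max(max_i x_i, b)."""
    n = len(xs)
    c = max(max(xs), b)          # sufficient statistic folded into the scale
    shape = a + n
    post_mean = shape * c / (shape - 1)  # requires shape > 1
    return shape, c, post_mean

def pareto_sample(shape, scale, size, rng):
    # Inverse-CDF sampler for Pareto(shape, scale): theta = scale / U^(1/shape)
    return [scale / rng.random() ** (1.0 / shape) for _ in range(size)]

rng = random.Random(0)
xs = [0.9, 2.3, 1.7, 0.4]                       # hypothetical data
shape, scale, mean = uniform_pareto_posterior(xs, a=3.0, b=1.0)
draws = pareto_sample(shape, scale, 100_000, rng)
print(shape, scale, mean, sum(draws) / len(draws))
```

The Monte Carlo average of the posterior draws should agree with the closed-form mean to a couple of decimal places, confirming the conjugacy calculation.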
Suppose that X1, X2, ..., Xn are independent random variables (not iid) with densities f_{X_i}(x | θ_i) = θ_i x^(-2) e^(-θ_i/x) I(x > 0), where θ_i > 0, for i = 1, 2, ..., n. (a) Derive the form of the likelihood ratio test (LRT) statistic for testing H0: θ1 = θ2 = ... = θn versus H1: not H0. You do not have to find the distribution of the LRT statistic under H0; just find the form of the statistic. (b) From your result in part (a),...
Suppose that X1, X2, ..., Xn are independent random variables (not iid) with densities f_{X_i}(x | θ_i) = θ_i x^(-2) e^(-θ_i/x) I(x > 0), where θ_i > 0, for i = 1, 2, ..., n, and consider testing H0: θ1 = θ2 = ... = θn versus H1: not H0. (c) Suppose H0 is true, so that the common distribution of X1, X2, ..., Xn, now viewed as being conditional on θ, is described by f(x | θ) = θ x^(-2) e^(-θ/x) I(x > 0), where θ > 0. Identify a conjugate prior for θ. Specify any hyperparameters in your prior (pick values for fun if you want). Show how to carry out...
Let X1, . . . , Xn be independent Poisson(θ) random variables with parameter θ > 0. (1) Find the Bayes estimator of θ for a Gamma(α, β) prior. (2) Find the MSE of the Bayes estimator.
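Assuming the Gamma(α, β) prior uses the rate parameterization (prior mean α/β; the algebra shifts slightly under a scale parameterization), the posterior is Gamma(α + Σx_i, β + n), the Bayes estimator is (α + Σx_i)/(β + n), and its MSE at a fixed θ is Var + bias² = (nθ + (α − βθ)²)/(β + n)². A sketch that checks the MSE formula by simulation (the specific θ, n, α, β values are made up for illustration):

```python
import math, random

def poisson_gamma_bayes(xs, alpha, beta):
    """Posterior mean of theta under a Gamma(alpha, beta) prior
    (rate parameterization): theta | x ~ Gamma(alpha + sum(x), beta + n)."""
    n = len(xs)
    return (alpha + sum(xs)) / (beta + n)

def mse(theta, n, alpha, beta):
    """MSE of the Bayes estimator at a fixed theta:
    Var + bias^2 = (n*theta + (alpha - beta*theta)^2) / (beta + n)^2."""
    return (n * theta + (alpha - beta * theta) ** 2) / (beta + n) ** 2

def poisson_draw(theta, rng):
    # Knuth's product-of-uniforms Poisson sampler (fine for small theta)
    L, k, p = math.exp(-theta), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(1)
theta, n, alpha, beta = 2.0, 10, 3.0, 1.5
sq_errs = []
for _ in range(20_000):
    xs = [poisson_draw(theta, rng) for _ in range(n)]
    sq_errs.append((poisson_gamma_bayes(xs, alpha, beta) - theta) ** 2)
print(mse(theta, n, alpha, beta), sum(sq_errs) / len(sq_errs))
```

The empirical average squared error over the replications should match the closed-form MSE to within Monte Carlo noise.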
Let X1, . . . , Xn be independent Beta(θ, 1) random variables with parameter θ > 0. (1) Find the Bayes estimator of θ for a Gamma(α, β) prior. (2) Find the MSE of the Bayes estimator.
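Since the Beta(θ, 1) density is θ x^(θ−1) on (0, 1), the likelihood is θ^n exp((θ − 1) Σ log x_i), and −log X_i ~ Exp(θ); assuming the Gamma(α, β) prior uses the rate parameterization, conjugacy gives the posterior Gamma(α + n, β + Σ(−log x_i)) and Bayes estimator (α + n)/(β + Σ(−log x_i)). A sketch with simulated Beta(θ, 1) data (the θ, n, α, β values below are made-up illustrations):

```python
import math, random

def beta_gamma_bayes(xs, alpha, beta):
    """Bayes estimator (posterior mean) of theta for X_i ~ Beta(theta, 1),
    density theta * x^(theta - 1) on (0, 1), under a Gamma(alpha, beta)
    prior (rate parameterization assumed). Since -log X_i ~ Exp(theta),
    the posterior is Gamma(alpha + n, beta + sum_i(-log x_i))."""
    n = len(xs)
    t = sum(-math.log(x) for x in xs)
    return (alpha + n) / (beta + t)

# Beta(theta, 1) has CDF F(x) = x^theta, so X = U^(1/theta) by inversion.
rng = random.Random(2)
theta, n = 3.0, 2000
xs = [rng.random() ** (1.0 / theta) for _ in range(n)]
est = beta_gamma_bayes(xs, alpha=2.0, beta=1.0)
print(est)
```

With a large sample the estimator should land close to the true θ, since the prior's pull vanishes as n grows.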
Exercise 7.5: Suppose X1, X2, ..., Xn are a random sample from the uniform distribution U(θ − 1/2, θ + 1/2), where θ ∈ (−∞, ∞). (a) Show that the smallest and largest of X1, ..., Xn are jointly sufficient for θ. (b) If p(θ) = constant, θ ∈ (−∞, ∞), is the prior distribution of θ, find its posterior distribution.
Problem 3.1 Suppose that X1, X2, ..., Xn is a random sample of size n from a Bernoulli distribution for which the value of the parameter θ is unknown, and the prior distribution of θ is a Beta(α, β) distribution. Represent the mean of this prior distribution as μ0 = α/(α + β). The posterior distribution of θ is Beta(α_n = α + ΣX_i, β_n = β + n − ΣX_i). (a) Show that the mean of the posterior distribution is a weighted average of the form γ_n X̄ + (1 − γ_n) μ0, where γ_n and...
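The identity in part (a) follows because (α + ΣX_i)/(α + β + n) = γ_n X̄ + (1 − γ_n) μ0 with γ_n = n/(α + β + n). A minimal numerical check of the decomposition (the data and hyperparameters are made up for illustration):

```python
def posterior_mean_decomposition(xs, alpha, beta):
    """Beta(alpha, beta) prior + Bernoulli data: the posterior mean
    (alpha + sum(x)) / (alpha + beta + n) rewritten as the weighted
    average gamma_n * xbar + (1 - gamma_n) * mu0 with
    gamma_n = n / (alpha + beta + n) and mu0 = alpha / (alpha + beta)."""
    n = len(xs)
    post_mean = (alpha + sum(xs)) / (alpha + beta + n)
    mu0 = alpha / (alpha + beta)
    gamma_n = n / (alpha + beta + n)
    xbar = sum(xs) / n
    return post_mean, gamma_n * xbar + (1 - gamma_n) * mu0

direct, weighted = posterior_mean_decomposition([1, 0, 1, 1, 0, 1], alpha=2.0, beta=3.0)
print(direct, weighted)
```

Both expressions evaluate to the same number, and γ_n → 1 as n → ∞: the posterior mean shrinks the sample proportion toward the prior mean, with less shrinkage for larger samples.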
Let X1, ..., Xn ~ iid Gamma(α, θ), where α is known and we are interested in the rate parameter θ. We choose a prior θ ~ Gamma(3, 1). Find the posterior distribution.
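With θ a rate parameter, the likelihood is proportional to θ^(nα) e^(−θ Σx_i), so the Gamma(3, 1) prior (rate parameterization assumed) is conjugate and the posterior is Gamma(3 + nα, 1 + Σx_i). A sketch of the bookkeeping (the data values and known shape below are made-up illustrations):

```python
def gamma_rate_posterior(xs, alpha_known, a0=3.0, b0=1.0):
    """X_i ~ Gamma(alpha, theta) with known shape alpha and unknown rate
    theta, prior theta ~ Gamma(a0, b0) in the rate parameterization
    (a0 = 3, b0 = 1 as in the exercise). Conjugacy gives
    theta | x ~ Gamma(a0 + n * alpha, b0 + sum(x))."""
    n = len(xs)
    return a0 + n * alpha_known, b0 + sum(xs)

shape, rate = gamma_rate_posterior([1.2, 0.7, 2.1], alpha_known=2.0)
print(shape, rate)   # Gamma(3 + 3*2, 1 + 4.0) = Gamma(9, 5)
```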
Suppose X1 and X2 are iid Poisson(θ) random variables and let T = X1 + 2X2. (a) Find the conditional distribution of (X1,X2) given T = 7. (b) For θ = 1 and θ = 2, respectively, calculate all probabilities in the above conditional distribution and present the two conditional distributions numerically.
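For part (b), the support of the conditional distribution is the set of pairs with x1 + 2x2 = 7, i.e. x2 ∈ {0, 1, 2, 3} with x1 = 7 − 2x2, and the conditional probabilities are the joint Poisson pmf values renormalized over that set; because the θ-powers do not cancel, the answer genuinely depends on θ. A sketch that tabulates both distributions:

```python
from math import factorial

def cond_dist_given_T(theta, t=7):
    """Conditional pmf of (X1, X2) given T = X1 + 2*X2 = t for iid
    Poisson(theta) X1, X2. Support: x2 = 0, ..., t // 2 with
    x1 = t - 2*x2. The theta-dependence shows T is not sufficient."""
    weights = {}
    for x2 in range(t // 2 + 1):
        x1 = t - 2 * x2
        # joint Poisson pmf; the common e^(-2*theta) factor cancels on normalizing
        weights[(x1, x2)] = theta ** (x1 + x2) / (factorial(x1) * factorial(x2))
    total = sum(weights.values())
    return {pt: w / total for pt, w in weights.items()}

for theta in (1.0, 2.0):
    print(theta, cond_dist_given_T(theta))
```

Comparing the two printed tables shows the probabilities shift as θ changes, which is exactly the point of the exercise.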
Let X1, . . . , Xn be a random sample following Gamma(2, β) for some unknown parameter β > 0. (i) Now let’s think like a Bayesian. Consider a prior distribution of β ∼ Gamma(a, b) for some a, b > 0. Derive the posterior distribution of β given (X1, . . . , Xn) = (x1,...,xn). (j) What is the posterior Bayes estimator of β assuming squared error loss?
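Taking β as a rate parameter (an assumption; under a scale parameterization the algebra changes), the likelihood is proportional to β^(2n) e^(−β Σx_i), so the Gamma(a, b) prior gives the posterior Gamma(a + 2n, b + Σx_i) for part (i), and the Bayes estimator under squared error loss in part (j) is the posterior mean (a + 2n)/(b + Σx_i). A sketch with made-up data and hyperparameters:

```python
def gamma2_posterior(xs, a, b):
    """X_i ~ Gamma(2, beta) with beta a rate parameter (assumption),
    prior beta ~ Gamma(a, b). Conjugacy gives
    beta | x ~ Gamma(a + 2n, b + sum(x)); the Bayes estimator under
    squared error loss is the posterior mean."""
    n = len(xs)
    shape, rate = a + 2 * n, b + sum(xs)
    return shape, rate, shape / rate

shape, rate, est = gamma2_posterior([0.5, 1.5, 1.0], a=1.0, b=2.0)
print(shape, rate, est)   # Gamma(7, 5), Bayes estimate 1.4
```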