a) In this case we first need to calculate the posterior probabilities of the two hypotheses using the formula
One way to decide between H0 and H1 is to compare the two posterior probabilities and accept the hypothesis with the higher one. This is the maximum a posteriori (MAP) test, so according to MAP we reject H0 iff
In this case the
Rearranging the above inequality, we get
b) To calculate the power of the test, we use the rejection condition derived above, where T follows an exponential distribution with λ as its parameter. The power will then be
Hence for λ = 0.2, 0.4, 0.6, 0.8 and n = 50, the t values corresponding to these λ are calculated as 36.04365, 35.06282, 34.25189, and 33.27106. Substituting these values of t, we get a power of 1 for every value of λ, mostly due to the large sample size.
In the other case as well, the power comes out to be 1, owing to the large sample size.
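The exact formulas above did not survive transcription, but the numerical claim can still be checked. A minimal sketch, assuming the test statistic T is the sum of the n iid Exponential(λ) observations (so T ~ Gamma(n, rate λ)) and the test rejects for T > t; the pairing of rates with critical values below follows the lists quoted above:

```python
import math

def gamma_survival(n, rate, t):
    """P(T > t) for T ~ Gamma(shape=n, rate=rate), n a positive integer.
    Closed form: P(T > t) = exp(-rate*t) * sum_{k=0}^{n-1} (rate*t)^k / k!."""
    lam_t = rate * t
    term, total = 1.0, 0.0
    for k in range(n):
        total += term
        term *= lam_t / (k + 1)
    return math.exp(-lam_t) * total

n = 50
for rate, t in [(0.2, 36.04365), (0.4, 35.06282), (0.6, 34.25189), (0.8, 33.27106)]:
    print(f"rate={rate}: power ~ {gamma_survival(n, rate, t):.6f}")
```

With n = 50, each of these survival probabilities is indistinguishable from 1, consistent with the statement that the power is 1 for all four rates.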
Let X1, …, Xn be a sample of iid random variables with pmf f(x; θ) = 1/2^(x−θ+1) on S = {θ, θ + 1, θ + 2, …}, with Θ = ℕ. Determine a) a sufficient statistic for θ; b) F_(1)(x); c) f_(1)(x); d) E[X_(1)].
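For part (d), a quick numerical check is possible: with P(X ≥ x) = 2^(−(x−θ)) for integer x ≥ θ, the minimum satisfies P(X_(1) ≥ x) = 2^(−n(x−θ)), so E[X_(1)] = θ + Σ_{j≥1} 2^(−nj) = θ + 1/(2^n − 1). A sketch (the values of θ and n are illustrative):

```python
def expected_min(theta, n, terms=200):
    """E[X_(1)] via the tail-sum formula E[X_(1)] = theta + sum_{x > theta} P(X_(1) >= x),
    where P(X_(1) >= x) = 2^{-n(x - theta)} for integer x > theta."""
    return theta + sum(2.0 ** (-n * j) for j in range(1, terms + 1))

theta, n = 3, 4
closed_form = theta + 1.0 / (2 ** n - 1)
print(expected_min(theta, n), closed_form)  # the two should agree
```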
1. Let X1, ..., Xn be iid with pdf f(x; θ) = (1/θ)e^(−x/θ), x > 0. (a) Determine the likelihood ratio test of H0: θ = θ0 versus H1: θ ≠ θ0. (b) Determine the Wald-type test of H0: θ = θ0 versus H1: θ ≠ θ0. (c) Determine Rao's score statistic for testing H0: θ = θ0 versus H1: θ ≠ θ0.
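Assuming the garbled pdf is the Exponential(mean θ) density (1/θ)e^(−x/θ), the MLE is θ̂ = x̄, the score is U(θ) = −n/θ + Σxᵢ/θ², and the Fisher information is I_n(θ) = n/θ². The three statistics can then be sketched as follows (the function name is mine):

```python
import math

def exp_test_stats(xs, theta0):
    """LRT, Wald, and Rao score statistics for H0: theta = theta0
    in the Exponential(mean theta) model; each is ~ chi^2_1 under H0."""
    n = len(xs)
    xbar = sum(xs) / n                                         # MLE of theta
    lrt = 2 * n * (math.log(theta0 / xbar) + xbar / theta0 - 1)  # -2 log Lambda
    wald = n * (xbar - theta0) ** 2 / xbar ** 2                  # (theta_hat - theta0)^2 * I_n(theta_hat)
    score = n * (xbar / theta0 - 1) ** 2                         # U(theta0)^2 / I_n(theta0)
    return lrt, wald, score

print(exp_test_stats([1.2, 0.7, 2.5, 1.9], theta0=1.5))
```

All three vanish when x̄ = θ0 and are nonnegative otherwise, as the theory requires.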
Let (X1, ..., Xn) be an iid sample from the exponentially distributed X with pdf given by f(x; θ) = (1/θ)e^(−x/θ), x > 0, θ > 0. Use the Neyman–Pearson Lemma to find the α-level most powerful test (MPT) of H0: θ = 2 vs H1: θ = 3.
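A sketch of the Neyman–Pearson reduction, assuming the garbled pdf is f(x; θ) = (1/θ)e^(−x/θ):

\[
\frac{L(3;\mathbf{x})}{L(2;\mathbf{x})}
  = \frac{3^{-n}\exp\!\left(-\tfrac{1}{3}\sum_i x_i\right)}
         {2^{-n}\exp\!\left(-\tfrac{1}{2}\sum_i x_i\right)}
  = \left(\tfrac{2}{3}\right)^{n}\exp\!\left(\tfrac{1}{6}\sum_i x_i\right),
\]

which is increasing in Σᵢ xᵢ, so the MPT rejects when Σᵢ xᵢ > c, with c chosen so that P(Σᵢ Xᵢ > c | θ = 2) = α; under H0, Σᵢ Xᵢ ~ Gamma(n, scale 2).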
Let X1, ..., Xn be iid observations from Uniform(0, θ). T(X) = max(X1, ..., Xn) is a sufficient statistic (additionally, T is the MLE for θ). Find a (1 − α)-level confidence interval for θ. [Note: The support of this distribution changes depending on the value of θ, so we cannot use Fisher's approximation for the MLE because not all of the regularity assumptions hold.]
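Since T/θ is a pivot with P(T/θ ≤ t) = t^n on (0, 1), one (1 − α) interval is [T, T·α^(−1/n)], from P(α^(1/n) ≤ T/θ ≤ 1) = 1 − α. A simulation sketch of the coverage (θ = 2 and n = 10 are illustrative choices):

```python
import random

random.seed(0)
theta, n, alpha = 2.0, 10, 0.05
trials, covered = 20000, 0
for _ in range(trials):
    t = max(random.uniform(0, theta) for _ in range(n))
    lo, hi = t, t * alpha ** (-1 / n)   # interval [T, T * alpha^{-1/n}]
    if lo <= theta <= hi:
        covered += 1
print(covered / trials)  # should be close to 1 - alpha = 0.95
```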
Suppose X1, X2, ..., Xn are iid with pdf f(x|θ) = θx^(θ−1) I(0 ≤ x ≤ 1), θ > 0. (a) Is −log(X1) unbiased for θ^(−1)? (b) Find a better estimator than −log(X1), in the sense of having smaller MSE. (c) Is your estimator in part (b) the UMVUE? Explain.
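For intuition on (a)–(b): if X ~ θx^(θ−1) on (0, 1), then −log X ~ Exponential(rate θ), so E[−log X1] = 1/θ (unbiased), and averaging as (1/n)Σᵢ(−log Xᵢ) keeps the mean while shrinking the variance by a factor of n. A simulation sketch (θ = 2 and n = 5 are illustrative):

```python
import math
import random

random.seed(1)
theta, n, reps = 2.0, 5, 50000
mse_single = mse_mean = 0.0
for _ in range(reps):
    xs = [random.random() ** (1 / theta) for _ in range(n)]  # inverse CDF: F^{-1}(u) = u^{1/theta}
    single = -math.log(xs[0])
    avg = -sum(math.log(x) for x in xs) / n
    mse_single += (single - 1 / theta) ** 2
    mse_mean += (avg - 1 / theta) ** 2
print(mse_single / reps, mse_mean / reps)  # the averaged estimator should win
```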
Suppose that X1, X2, ..., Xn is an iid sample from the probability density function (pdf) f(x|β) = x^(m−1)e^(−x/β)/(Γ(m)β^m), x > 0, where β > 0 is unknown and m is a known constant larger than 1. (a) Show that T = T(X) = Σ_i X_i is a complete and sufficient statistic for {f(x|β) : β > 0}. (b) Show that T ∼ Gamma(mn, β). (c) For t > 0, show that the conditional density of X1, given T = t, is f(x|t) = Γ(mn)/(Γ(m)Γ(m(n−1))) · x^(m−1)(t − x)^(m(n−1)−1)/t^(mn−1), 0 < x < t. (d) Show that … [the rest of the statement, involving m − 1 and mn, was lost in transcription].
Let X1, X2, ..., Xn be a r.s. from f(x) = θx^(θ−1), for 0 < x < 1, 0 < θ < ∞. (a) Find the MLE of θ. (b) Let T = −log X. Find the pdf of T. (c) Find the pdf of Y = Σ_i T_i (i.e., the distribution of Y = −Σ_i log X_i). (d) Find E(…). (e) Find E(…). (f) Show that the variance of the MLE of θ → 0 as n → ∞. (g) Find the MME of θ.
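For reference while checking (a) and (g): the MLE is θ̂ = n/(−Σᵢ log Xᵢ), and since E[X] = θ/(θ + 1), the method-of-moments estimator is θ̃ = X̄/(1 − X̄). A quick sketch (the sample size and true θ are illustrative):

```python
import math
import random

def mle(xs):
    """MLE for f(x) = theta * x^{theta-1}: theta_hat = n / (-sum log x_i)."""
    return len(xs) / (-sum(math.log(x) for x in xs))

def mme(xs):
    """MME from E[X] = theta / (theta + 1): theta_tilde = xbar / (1 - xbar)."""
    xbar = sum(xs) / len(xs)
    return xbar / (1 - xbar)

random.seed(2)
theta = 3.0
xs = [random.random() ** (1 / theta) for _ in range(5000)]  # inverse-CDF sampling
print(mle(xs), mme(xs))  # both should be near theta = 3
```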
Let X1, ..., Xn be a random sample from the pdf f(x; θ) = θx^(−2), 0 < θ ≤ x < ∞. (a) Find the method of moments estimator of θ. (b) Find the maximum likelihood estimator of θ.
6.4.3. Let X1, X2, ..., Xn be iid, each with the distribution having pdf f(x; θ1, θ2) = (1/θ2)e^(−(x−θ1)/θ2), θ1 < x < ∞, −∞ < θ1 < ∞, 0 < θ2 < ∞, zero elsewhere. Find the maximum likelihood estimators of θ1 and θ2.
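The standard answer here is θ̂1 = min Xᵢ and θ̂2 = X̄ − min Xᵢ: the likelihood grows as θ1 increases toward the smallest observation, after which the usual exponential MLE applies to the shifted data. A simulation sketch (true values and sample size are illustrative):

```python
import random

random.seed(3)
theta1, theta2, n = 5.0, 2.0, 10000
xs = [theta1 + random.expovariate(1 / theta2) for _ in range(n)]  # shifted exponential draws

theta1_hat = min(xs)                      # MLE of the location theta1
theta2_hat = sum(xs) / n - theta1_hat     # MLE of the scale theta2
print(theta1_hat, theta2_hat)             # should be near 5 and 2
```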
Please answer the following question and show every step. Thank you. Let X1, ..., Xn be a random sample from a population with pdf f(x|θ) = θe^(−θx) for x ≥ 0 and f(x|θ) = 0 for x < 0, where θ > 0 is unknown. (a) Show that the Gamma(a, b) prior, with pdf π(θ) ∝ θ^(a−1)e^(−bθ) for θ > 0 and π(θ) = 0 for θ ≤ 0, is a conjugate prior for θ (a > 0 and b > 0 are known constants). (b) Find the Bayes estimator of θ under squared error loss. (c) Find the Bayes estimator of (2π^(−1)θ)^(1/2) under squared error...
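Assuming the garbled pdf is the Exponential(θ) density θe^(−θx) (which is what makes the Gamma prior conjugate, as the problem asserts), the posterior is Gamma(a + n, b + Σᵢxᵢ) with rate parameterization, and the squared-error Bayes estimator of θ is the posterior mean (a + n)/(b + Σᵢxᵢ). A sketch under that assumption:

```python
def posterior_params(xs, a, b):
    """Gamma(a, b) prior (shape a, rate b) + Exponential(theta) likelihood
    -> Gamma(a + n, b + sum(xs)) posterior."""
    return a + len(xs), b + sum(xs)

def bayes_estimate(xs, a, b):
    """Posterior mean = Bayes estimator of theta under squared error loss."""
    shape, rate = posterior_params(xs, a, b)
    return shape / rate

print(bayes_estimate([0.5, 1.0, 0.3], a=2.0, b=1.0))
```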