Let X1, ..., Xn be a sample from a U(0, θ) distribution, where θ > 0 is a constant parameter.
a) Find the density function of X(n), the largest order statistic of X1, ..., Xn.
b) Find the mean and variance of X(n).
c) Show that Yn = sqrt(n)·(θ − X(n)) converges to 0 in probability.
d) What is the distribution of n(θ − X(n))?
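For part (d) of the exercise above, a quick Monte Carlo sketch suggests the limit is Exponential with mean θ: P(n(θ − X(n)) > t) = (1 − t/(nθ))^n → e^(−t/θ). All numeric values below (theta, n, reps) are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 200, 20_000   # arbitrary illustration values

# Simulate reps independent samples of size n from U(0, theta)
x = rng.uniform(0.0, theta, size=(reps, n))
yn = n * (theta - x.max(axis=1))    # n * (theta - X_(n))

# An Exponential(mean = theta) limit would give a sample mean of yn close
# to theta, and P(yn <= theta) close to 1 - exp(-1) ≈ 0.632.
print(yn.mean())             # ≈ 2.0
print((yn <= theta).mean())  # ≈ 0.632
```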
1. Let X1, ..., Xn be a random sample of size n from a normal distribution, Xi ~ N(μ, σ²), and define U = Σ_{i=1}^n Xi and W = Σ_{i=1}^n Xi². (a) Find a statistic that is a function of U and W and unbiased for the parameter θ = 2μ − 5σ². (b) Find a statistic that is unbiased for σ² + μ². (c) Let c be a constant, and define Yi = 1 if Xi < c and...
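The natural candidates for (a) and (b) can be checked by simulation: S² = (W − U²/n)/(n − 1) is the sample variance with E[S²] = σ², E[U/n] = μ, and E[W/n] = σ² + μ². A sketch with arbitrary illustration values for mu, sigma, n, and reps:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, reps = 1.5, 0.8, 20, 100_000   # arbitrary illustration values

x = rng.normal(mu, sigma, size=(reps, n))
u = x.sum(axis=1)            # U = sum of the X_i
w = (x ** 2).sum(axis=1)     # W = sum of the X_i^2

# Sample variance S^2 = (W - U^2/n)/(n-1) satisfies E[S^2] = sigma^2, and
# E[U/n] = mu, so 2*(U/n) - 5*S^2 targets (a): 2*mu - 5*sigma^2.
t_a = 2 * (u / n) - 5 * (w - u**2 / n) / (n - 1)
# E[W/n] = E[X^2] = sigma^2 + mu^2, so W/n targets (b).
t_b = w / n

print(t_a.mean(), 2 * mu - 5 * sigma**2)   # both ≈ -0.2
print(t_b.mean(), sigma**2 + mu**2)        # both ≈ 2.89
```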
Exercise 7.5: Suppose X1, X2, ..., Xn are a random sample from the uniform distribution U(θ − 1/2, θ + 1/2), where θ ∈ (−∞, ∞). (a) Show that the smallest and largest of X1, ..., Xn are jointly sufficient for θ. (b) If p(θ) = constant, θ ∈ (−∞, ∞), is the prior distribution of θ, find its posterior distribution.
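Assuming the sampling interval is U(θ − 1/2, θ + 1/2) (the usual form of this exercise), part (b) can be sketched as follows:

```latex
% Likelihood: each observation requires \theta - 1/2 \le x_i \le \theta + 1/2,
% i.e. x_{(n)} - 1/2 \le \theta \le x_{(1)} + 1/2:
L(\theta \mid x) = \prod_{i=1}^n \mathbf{1}\{\theta - \tfrac12 \le x_i \le \theta + \tfrac12\}
                 = \mathbf{1}\{x_{(n)} - \tfrac12 \le \theta \le x_{(1)} + \tfrac12\}.
% With the flat prior p(\theta) \propto 1, the posterior is uniform:
\pi(\theta \mid x) = \mathrm{U}\bigl(x_{(n)} - \tfrac12,\; x_{(1)} + \tfrac12\bigr).
```

That the likelihood depends on the data only through (x(1), x(n)) is exactly the joint sufficiency claimed in part (a).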
Suppose X1, ..., Xn is a random sample from a uniform distribution on the interval [0, θ]. Let X(1) = min{X1, X2, ..., Xn} and let Yn = n·X(1). Show that Yn converges in distribution to an exponential random variable with mean θ.
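The claimed limit follows from P(Yn > t) = (1 − t/(nθ))^n → e^(−t/θ); a Monte Carlo sketch (theta, n, reps are arbitrary illustration values):

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 3.0, 200, 20_000   # arbitrary illustration values

x = rng.uniform(0.0, theta, size=(reps, n))
yn = n * x.min(axis=1)       # Y_n = n * X_(1)

# An Exponential(mean = theta) limit would give E[Y_n] ≈ theta and
# P(Y_n > theta) ≈ exp(-1) ≈ 0.368.
print(yn.mean())             # ≈ 3.0
print((yn > theta).mean())   # ≈ 0.368
```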
7.6.4. Let X1, X2, ..., Xn be a random sample from a uniform (0, θ) distribution. Continuing with Example 7.6.2, find the MVUEs for the following functions of θ: (a) g(θ) = θ²/12, i.e., the variance of the distribution; (b) g(θ) = 1/θ, i.e., the value of the pdf of the distribution; (c) for t real, g(θ) = (e^{tθ} − 1)/(tθ), i.e., the mgf of the distribution. Example 7.6.2. Suppose X1, X2, ..., Xn are iid random variables with the common uniform (0, θ) distribution. Let Yn = max{X1, X2, ...,
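The recipe behind Example 7.6.2 can be written compactly. With Yn = X(n) complete and sufficient, the unique function of Yn unbiased for g(θ) is the MVUE; a sketch:

```latex
% Density of Y_n = X_{(n)} under U(0,\theta):
f_{Y_n}(y) = \frac{n\,y^{\,n-1}}{\theta^{\,n}}, \qquad 0 < y < \theta .
% Unbiasedness condition, then differentiate both sides in \theta:
\int_0^\theta u(y)\, n y^{\,n-1}\, dy = \theta^{\,n} g(\theta)
\;\Longrightarrow\;
u(\theta)\, n\theta^{\,n-1} = n\theta^{\,n-1} g(\theta) + \theta^{\,n} g'(\theta)
\;\Longrightarrow\;
u(y) = g(y) + \frac{y\, g'(y)}{n}.
% Part (a), g(\theta) = \theta^2/12:
u(y) = \frac{y^2}{12} + \frac{y\cdot y/6}{n} = \frac{(n+2)\,y^2}{12\,n}.
```

Parts (b) and (c) follow from the same formula applied to g(θ) = 1/θ and to the mgf.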
4. Let X1, X2, ..., Xn be a random sample from a normal distribution with mean 0 and unknown variance σ². (a) Show that U = Σ_{i=1}^n Xi² is a sufficient statistic for σ². [4] (b) Show that the MLE of σ² is σ̂² = (1/n) Σ_{i=1}^n Xi². [4] (c) Calculate the mean and variance of σ̂² from (b). Explain why σ̂² is also the MVUE of σ². [6]
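Since Σ Xi²/σ² ~ χ²(n), part (c) should give E[σ̂²] = σ² and Var(σ̂²) = 2σ⁴/n; a Monte Carlo sketch (sigma, n, reps are arbitrary illustration values):

```python
import numpy as np

rng = np.random.default_rng(3)
sigma, n, reps = 1.2, 10, 100_000   # arbitrary illustration values

x = rng.normal(0.0, sigma, size=(reps, n))
sig2_hat = (x ** 2).mean(axis=1)    # MLE: (1/n) * sum X_i^2

# sum X_i^2 / sigma^2 ~ chi-square(n), so
# E[sig2_hat] = sigma^2 and Var(sig2_hat) = 2*sigma^4/n.
print(sig2_hat.mean())   # ≈ 1.44
print(sig2_hat.var())    # ≈ 0.415
```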
Let X1, ..., Xn be a random sample from a distribution. Suppose T1(X), T2(X), and U(X) respectively are sufficient, minimal sufficient, and unbiased estimators for the parameter θ of the distribution. Let U1(X) = E[U(X) | T1(X)] and U2(X) = E[U(X) | T2(X)]. a. Show that U1(X) and U2(X) are unbiased for θ. b. Show that U2(X) = E[U1(X) | T2(X)]. c. Show that U2 has a smaller variance than U1.
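The three parts can be sketched via iterated expectations (the standard Rao–Blackwell argument):

```latex
% (a) Unbiasedness, by the tower property:
E[U_i(X)] = E\bigl[E[U(X) \mid T_i(X)]\bigr] = E[U(X)] = \theta, \qquad i = 1, 2.
% (b) T_2 is minimal sufficient, so T_2 = g(T_1) for some function g;
% conditioning first on T_1 and then on T_2 collapses:
E[U_1 \mid T_2] = E\bigl[E[U \mid T_1] \mid T_2\bigr] = E[U \mid T_2] = U_2 .
% (c) Variance decomposition combined with (b):
\operatorname{Var}(U_1) = \operatorname{Var}\bigl(E[U_1 \mid T_2]\bigr)
  + E\bigl[\operatorname{Var}(U_1 \mid T_2)\bigr]
  = \operatorname{Var}(U_2) + E\bigl[\operatorname{Var}(U_1 \mid T_2)\bigr]
  \ge \operatorname{Var}(U_2).
```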
6.1.12 Suppose that (x1,..., xn) is a sample from a Geometric(θ) distribution, where θ ∈ [0, 1] is unknown. Determine the likelihood function and a minimal sufficient statistic for this model. (Hint: Use the factorization theorem and maximize the logarithm of the likelihood.)
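Under the convention that each xi counts the failures before the first success (support 0, 1, 2, ...; the book's convention may differ by a shift of one), the factorization goes as follows:

```latex
L(\theta \mid x_1, \dots, x_n) = \prod_{i=1}^n \theta (1-\theta)^{x_i}
  = \theta^{\,n} (1-\theta)^{\sum_i x_i}.
% The likelihood depends on the data only through \sum_i x_i, and samples with
% different values of \sum_i x_i have non-proportional likelihoods, so
% T = \sum_{i=1}^n X_i is minimal sufficient.  Maximizing
% \ell(\theta) = n\log\theta + (\sum_i x_i)\log(1-\theta)
% gives \hat\theta = n/(n + \sum_i x_i) = 1/(1 + \bar x).
```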
Let (X1, ..., Xn) be a sample from U[0, θ], where θ ∈ (0, 1) is unknown, and θ has a prior distribution U[0, 1].
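With this setup the posterior follows directly from Bayes' rule; a sketch:

```latex
% Likelihood: L(\theta \mid x) = \theta^{-n}\,\mathbf{1}\{x_{(n)} \le \theta\};
% prior: \pi(\theta) = \mathbf{1}\{0 \le \theta \le 1\}.  Hence
\pi(\theta \mid x) \propto \theta^{-n}, \qquad x_{(n)} \le \theta \le 1,
% and for n \ge 2 the normalized posterior density is
\pi(\theta \mid x) = \frac{(n-1)\,\theta^{-n}}{x_{(n)}^{-(n-1)} - 1},
\qquad x_{(n)} \le \theta \le 1 .
```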
Let X1, X2, ..., Xn be a random sample with probability density function f(x; θ) = ..., for 0 < x.
a) Find the expected value of X.
b) Find the method of moments estimator θ̃.
c) Is θ̃ unbiased for θ? Explain.
d) Is θ̃ consistent for θ? Explain.
e) Find the limiting distribution of √n(θ̃ − θ).
(Need only c, d, and e.)
Let X1, X2, ..., Xn be an i.i.d. sample from Bernoulli(p) and let Yn = .... Show that Yn converges to a degenerate distribution at 0 as n → ∞.