1. Let X1, . . . , Xn be independent random variables taking values 0 or 1 with
P(Xi = 1) = e^(θ − ai)/(1 + e^(θ − ai)), i = 1, . . . , n
for some given constants ai. Find a one-dimensional sufficient statistic for θ.
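A quick numerical sanity check (a hypothetical Python sketch; the constants ai and the two datasets are made up): the log-likelihood is l(θ) = Σi [xi(θ − ai) − log(1 + e^(θ − ai))], so for two 0/1 datasets with the same Σxi the log-likelihoods differ by a quantity free of θ — the factorization behind T = ΣXi being sufficient.

```python
import math

a = [0.5, -1.2, 0.3, 2.0]  # hypothetical given constants a_i

def loglik(theta, x):
    # l(theta) = sum_i [ x_i*(theta - a_i) - log(1 + exp(theta - a_i)) ]
    return sum(xi * (theta - ai) - math.log(1 + math.exp(theta - ai))
               for xi, ai in zip(x, a))

x = [1, 0, 1, 0]   # two datasets with the same value of T = sum(x_i) = 2
y = [0, 1, 0, 1]

# The difference l(theta; x) - l(theta; y) should not depend on theta,
# i.e. the likelihood depends on the data only through T = sum(X_i).
d1 = loglik(0.7, x) - loglik(0.7, y)
d2 = loglik(-1.3, x) - loglik(-1.3, y)
print(abs(d1 - d2) < 1e-12)
```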
Suppose that x1, . . . , xn are a random sample having probability density function f(x; θ) = (θ + 1)x^θ, 0 < x < 1. (1) Here the parameter θ > 0. (a) Determine the log-likelihood, l(θ), and a 1-dimensional sufficient statistic. (b) Show that P(Xi ≤ b; θ) = b^(θ+1) for f(x; θ) given in (1). (c) Suppose now that because of a recurring computer glitch in storing the observations, only a random subset of...
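The claim P(Xi ≤ b; θ) = b^(θ+1) can be sanity-checked numerically (a hypothetical Python sketch with made-up values of θ and b) by integrating the density (θ + 1)x^θ over (0, b]:

```python
theta, b = 2.5, 0.6   # hypothetical parameter value and threshold

# Trapezoid integral of f(x) = (theta + 1) * x^theta over (0, b]
m = 100000
h = b / m
f = lambda x: (theta + 1) * x ** theta
integral = h * (f(0.0) / 2 + sum(f(i * h) for i in range(1, m)) + f(b) / 2)

# Should agree with the closed form b^(theta + 1)
print(round(integral, 6), round(b ** (theta + 1), 6))
```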
Explain the answer: Suppose that X1, X2, . . . , Xn are independent random variables. Assume that E[Xi] = μ and Var(Xi) = σ², where i = 1, 2, . . . , n, and that a1, a2, . . . , an are constants. (i) Write down expressions for E(Σi ai Xi) and (ii) Var(Σi ai Xi). (iii) Rewrite these expressions if the Xi are not independent.
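A hedged Monte Carlo check of the independent case (hypothetical Python sketch; the constants ai, μ, and σ are made up): with independent Xi, E(Σ ai Xi) = μ Σ ai and Var(Σ ai Xi) = σ² Σ ai².

```python
import random
random.seed(1)

a = [1.0, -2.0, 0.5]   # hypothetical constants a_i
mu, sigma = 1.5, 2.0   # common mean and standard deviation

reps = 200000
acc, acc2 = 0.0, 0.0
for _ in range(reps):
    s = sum(ai * random.gauss(mu, sigma) for ai in a)  # independent X_i
    acc += s
    acc2 += s * s
mean_mc = acc / reps
var_mc = acc2 / reps - mean_mc ** 2

# (i) E[sum a_i X_i] = mu * sum(a_i) = -0.75
# (ii) Var[sum a_i X_i] = sigma^2 * sum(a_i^2) = 21 under independence;
# without independence it becomes sum_i sum_j a_i a_j Cov(X_i, X_j).
print(round(mean_mc, 2), round(var_mc, 2))
```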
2. Let X1, X2, . . . , Xn denote independent and identically distributed random variables with variance σ². Which of the following is sufficient to conclude that the estimator T = f(X1, . . . , Xn) of a parameter θ is consistent (fully justify your answer): (a) Var(T) → 0 as n → ∞; (b) E(T) → θ and Var(T) → 0 as n → ∞; (c) E(T) = θ; (d) E(T) = θ and Var(T) = σ².
8. Let X1, . . . , Xn be a random sample from a population with density f(x; θ) = 2x/θ² if 0 < x < θ, and 0 otherwise. (a) Find the maximum likelihood estimator (MLE) of θ. (b) Find a sufficient statistic for θ. (c) Is the above MLE a minimal sufficient statistic? Explain fully.
Let X1, . . . , Xn be independent Beta(θ, 1) random variables with parameter θ > 0. (1) Find the Bayes estimator of θ for a Gamma(α, β) prior. (2) Find the MSE of the Bayes estimator.
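For part (1), a worked sketch of the conjugate update (assuming the rate parameterization of the Gamma(α, β) prior, π(θ) ∝ θ^(α−1)e^(−βθ), and squared-error loss, so the Bayes estimator is the posterior mean); recall the Beta(θ, 1) density is θx^(θ−1) on (0, 1):

```latex
\pi(\theta \mid x) \;\propto\; \theta^{\alpha-1}e^{-\beta\theta}\prod_{i=1}^{n}\theta x_i^{\theta-1}
\;\propto\; \theta^{\alpha+n-1}\exp\!\Big(-\theta\Big(\beta-\sum_{i=1}^{n}\log x_i\Big)\Big),
```

so the posterior is Gamma(α + n, β − Σ log xi) (note Σ log xi < 0 since 0 < xi < 1), giving the posterior mean (α + n)/(β − Σ log xi) as the Bayes estimator under these assumptions.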
Let X1, . . . , Xn be independent Poisson(θ) random variables with parameter θ > 0. (1) Find the Bayes estimator of θ for a Gamma(α, β) prior. (2) Find the MSE of the Bayes estimator.
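Part (1) can be sanity-checked numerically (a hypothetical Python sketch with made-up data; assumes the rate parameterization of the Gamma(α, β) prior and squared-error loss, so the Bayes estimator is the posterior mean): conjugacy gives a Gamma(α + Σxi, β + n) posterior with mean (α + Σxi)/(β + n).

```python
import math

alpha, beta = 2.0, 3.0   # hypothetical Gamma(alpha, beta) prior, beta = rate
x = [3, 1, 4, 1, 5]      # hypothetical Poisson sample
n, s = len(x), sum(x)

# Unnormalized posterior: prior * likelihood
# ∝ theta^(alpha + s - 1) * exp(-(beta + n) * theta)
def post_unnorm(t):
    return t ** (alpha + s - 1) * math.exp(-(beta + n) * t)

# Posterior mean by numerical integration on a fine grid
grid = [i * 0.001 for i in range(1, 20001)]
w = [post_unnorm(t) for t in grid]
numeric_mean = sum(t * wi for t, wi in zip(grid, w)) / sum(w)

closed_form = (alpha + s) / (beta + n)   # Gamma(alpha + s, beta + n) mean
print(round(numeric_mean, 4), round(closed_form, 4))
```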
Let N, X1, X2, . . . be random variables over a probability space. It is assumed that N takes nonnegative integer values. Let Z = max{X1, . . . , XN} and W = min{X1, . . . , XN}. Find the distribution functions of Z and W, supposing that N, X1, X2, . . . are independent random variables and the Xi have the same distribution function F, and a) N is a geometric random variable with parameter p (P(N = k) = p(1 − p)^(k−1), k = 1, 2, . . .); b) N − 1 is a Poisson random variable with...
Let X1, X2, · · · be independent random variables, Xn ∼ U(−1/n, 1/n). Let X be a random variable with P(X = 0) = 1. (a) What is the CDF of Xn? (b) Does Xn converge to X in distribution? In probability?
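A hedged numerical illustration for parts (a)–(b) (Python sketch; the evaluation points are made up): the CDF of U(−1/n, 1/n) is 0 below −1/n, 1 above 1/n, and a linear ramp (nx + 1)/2 in between, and at any fixed x ≠ 0 it tends to the CDF of the point mass at 0.

```python
def F_n(x, n):
    # CDF of Uniform(-1/n, 1/n): linear ramp on [-1/n, 1/n]
    if x < -1.0 / n:
        return 0.0
    if x > 1.0 / n:
        return 1.0
    return (n * x + 1.0) / 2.0

# At fixed x = -0.05 and x = 0.05, F_n(x) approaches 0 and 1 respectively,
# matching F(x) = 0 for x < 0 and F(x) = 1 for x >= 0 (point mass at 0).
for n in (1, 10, 1000):
    print(n, F_n(-0.05, n), F_n(0.05, n))
```

(At x = 0 itself, F_n(0) = 1/2 for every n, but 0 is not a continuity point of the limiting CDF, so this does not obstruct convergence in distribution.)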
Exercise 8.41. The random variables X1, . . . , Xn are i.i.d. We also know that E[X1] = 0, E[X1²] = a, and E[X1³] = b. Let X̄n = (X1 + · · · + Xn)/n. Find the third moment of X̄n.
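A hedged Monte Carlo sketch (Python, with a made-up zero-mean distribution): expanding E[(X1 + · · · + Xn)³], every cross term contains a factor E[X1] = 0, leaving n·E[X1³] = nb, so E[X̄n³] = b/n².

```python
import random
random.seed(0)

# Hypothetical zero-mean distribution: P(X = -1) = 2/3, P(X = 2) = 1/3,
# so E[X] = 0, a = E[X^2] = 2, b = E[X^3] = 2.
def draw():
    return -1.0 if random.random() < 2.0 / 3.0 else 2.0

n, reps = 3, 200000
acc = 0.0
for _ in range(reps):
    xbar = sum(draw() for _ in range(n)) / n
    acc += xbar ** 3
mc = acc / reps

# Compare Monte Carlo estimate with the closed form b / n^2 = 2/9
print(round(mc, 3), round(2.0 / n ** 2, 3))
```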