Suppose the random variables X1, X2, ..., Xn are independent, each with the distribution ...
2. Suppose that {X1, ..., Xn} are independent and identically distributed random variables from a distribution with p.d.f. f(x) = θe^(−θx) if x > 0, and f(x) = 0 if x ≤ 0. Let Y = min_{1 ≤ i ≤ n} X_i. Find the p.d.f. of Y.
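The known answer (the minimum of n i.i.d. exponentials is again exponential, with rate nθ) can be checked by simulation. A minimal sketch, assuming the p.d.f. f(x) = θe^(−θx) above; the parameter values are arbitrary choices for illustration:

```python
import random

# Monte Carlo check: Y = min(X1, ..., Xn) with Xi ~ Exponential(theta)
# should be Exponential(n*theta), so its mean should be near 1/(n*theta).
random.seed(0)
theta, n, reps = 2.0, 5, 200_000

mins = [min(random.expovariate(theta) for _ in range(n)) for _ in range(reps)]
empirical_mean = sum(mins) / reps
print(round(empirical_mean, 3))  # should be near 1/(n*theta) = 0.1
```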
Suppose that X1, X2, ..., Xn are independent random variables (not i.i.d.) with densities f_{X_i}(x | θ_i) = θ_i x^(−2) e^(−θ_i/x) 1(x > 0), where θ_i > 0, for i = 1, 2, ..., n. (a) Derive the form of the likelihood ratio test (LRT) statistic for testing H0: ... versus H1: not H0. You do not have to find the distribution of the LRT statistic under H0; just find the form of the statistic. (b) From your result in part (a), ...
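A hedged sketch of part (a). The statement of H0 is garbled in the source; the computation below *assumes* the usual homogeneity hypothesis H0: θ1 = θ2 = ... = θn, so treat it as an illustration rather than the exercise's intended null:

```latex
% Unrestricted MLEs: \ell_i(\theta_i) = \log\theta_i - 2\log x_i - \theta_i/x_i
% gives \hat\theta_i = x_i. Under H_0 (common \theta), \ell(\theta) = n\log\theta
% - 2\sum_i \log x_i - \theta\sum_i x_i^{-1}, so \hat\theta_0 = n / \sum_i x_i^{-1}.
\lambda(\mathbf{x})
  = \frac{\sup_{H_0} L}{\sup L}
  = \frac{\hat\theta_0^{\,n}\, e^{-n} \prod_i x_i^{-2}}
         {e^{-n} \prod_i x_i^{-1}}
  = \frac{\bigl(n / \sum_i x_i^{-1}\bigr)^{n}}{\prod_i x_i}
```

That is, under this assumed H0 the LRT statistic compares the n-th power of the harmonic mean of the data to their product.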
L.11) Sums of independent random variables. a) If X1, X2, ..., Xn are independent random variables, all with Exponential[μ] distribution, then what is the distribution of X1 + X2 + X3 + ... + Xn? b) If X is a random variable with Exponential[μ] distribution, then what is the distribution of X + X1? c) If X1, X2, ..., Xn are independent random variables, all with Normal[0, 1] distribution, then what is the distribution of ...
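The answer to part (a) can be checked numerically. A minimal sketch, assuming Exponential[μ] denotes rate μ (p.d.f. μe^(−μx)), under which the sum of n i.i.d. terms is Gamma with shape n and scale 1/μ, hence mean n/μ and variance n/μ²; the parameter values below are arbitrary:

```python
import random

# Monte Carlo check: S = X1 + ... + Xn with Xi ~ Exponential(mu) should have
# mean n/mu and variance n/mu^2 (Gamma(shape=n, scale=1/mu)).
random.seed(1)
mu, n, reps = 1.5, 4, 200_000

sums = [sum(random.expovariate(mu) for _ in range(n)) for _ in range(reps)]
mean = sum(sums) / reps
var = sum((s - mean) ** 2 for s in sums) / reps
print(round(mean, 2), round(var, 2))  # near n/mu = 2.67 and n/mu^2 = 1.78
```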
Let N, X1, X2, ... be random variables over a probability space. It is assumed that N takes nonnegative integer values. Let Z = max{X1, ..., X_N} and W = min{X1, ..., X_N}. Find the distribution functions of Z and W, supposing that N, X1, X2, ... are independent random variables, the X_i have the same distribution function F, and a) N − 1 is a geometric random variable with parameter p (P(N = k) = ..., k = 1, 2, ...); b) N − 1 is a Poisson random variable with ...
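The common first step in both parts can be sketched by conditioning on N; this is a general outline (the case N = 0, where the max/min is over an empty set, needs whatever convention the problem adopts):

```latex
F_Z(z) = P(Z \le z) = \sum_{k} P(N = k)\, F(z)^k = g_N\!\bigl(F(z)\bigr),
\qquad
F_W(w) = 1 - \sum_{k} P(N = k)\,\bigl(1 - F(w)\bigr)^k
       = 1 - g_N\!\bigl(1 - F(w)\bigr),
```

where g_N(s) = E[s^N] is the probability generating function of N. Parts (a) and (b) then reduce to substituting the pgf of the shifted geometric and shifted Poisson distributions, respectively.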
Suppose that X1, X2, ..., Xn are i.i.d. random variables. Find the maximum likelihood estimator of θ for the following distributions: 1) Poi(θ); 2) N(μ, θ); 3) Exp(θ).
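For case (1), the MLE works out to the sample mean. A numerical sanity check (not a proof), comparing the Poisson log-likelihood at the sample mean against a grid of candidates; the data are an arbitrary illustrative sample:

```python
import math

data = [3, 1, 4, 1, 5, 2, 2, 6]

def poisson_loglik(theta, xs):
    # log L(theta) = sum_i [ x_i*log(theta) - theta - log(x_i!) ]
    return sum(x * math.log(theta) - theta - math.lgamma(x + 1) for x in xs)

xbar = sum(data) / len(data)  # candidate MLE: the sample mean
best = max((t / 100 for t in range(1, 1001)),  # grid 0.01, 0.02, ..., 10.00
           key=lambda t: poisson_loglik(t, data))
print(xbar, best)  # the grid maximizer should land on xbar
```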
Let X1, X2, ..., Xn be a random sample of size n from the distribution with probability density function f(x; λ) = 2λx e^(−λx²), x > 0, λ > 0. a. Obtain the maximum likelihood estimator of λ, λ̂. Enter a formula below. Use * for multiplication, / for division, ^ for power. Use m1 for the sample mean X̄, m2 for the second moment, and pi for the constant π. That is, m1 = X̄ = (1/n) Σ x_i, m2 = ...
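A hedged derivation sketch for part (a), assuming the density reconstructed above, f(x; λ) = 2λx e^(−λx²):

```latex
\ell(\lambda) = n\log 2 + n\log\lambda + \sum_i \log x_i - \lambda \sum_i x_i^2,
\qquad
\frac{d\ell}{d\lambda} = \frac{n}{\lambda} - \sum_i x_i^2 = 0
\;\Longrightarrow\;
\hat\lambda = \frac{n}{\sum_i x_i^2} = \frac{1}{m_2}.
```

In the requested answer syntax this would be entered as 1/m2.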
7. Suppose X1, X2, ..., Xn is a random sample from an exponential distribution with parameter λ. (Remember f(x; λ) = λe^(−λx) is the pdf for the exponential distribution.) a) Find the likelihood function, L(λ; x1, x2, ..., xn). b) Find the log-likelihood function, ℓ = log L. c) Find dℓ/dλ, set the result equal to 0, and solve for λ.
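Step (c) can be checked numerically. A minimal sketch: with ℓ(λ) = n log λ − λ Σ x_i, the score dℓ/dλ = n/λ − Σ x_i vanishes at λ = n/Σ x_i = 1/x̄ and changes sign there; the data are an arbitrary illustrative sample:

```python
# Sample chosen so that xbar = 1.0, making the candidate MLE easy to eyeball.
data = [0.5, 1.2, 0.3, 2.0, 1.0]
n, total = len(data), sum(data)

def score(lam):
    # dl/dlambda for the exponential log-likelihood
    return n / lam - total

lam_hat = n / total  # candidate MLE: 1/xbar
print(lam_hat, score(lam_hat))  # score should vanish at lam_hat
```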
Let X1, X2, ... be independent random variables, Xn ∼ U(−1/n, 1/n). Let X be a random variable with P(X = 0) = 1. (a) What is the CDF of Xn? (b) Does Xn converge to X in distribution? In probability?
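For part (a) the uniform CDF is piecewise linear, and the limit computation is then immediate; a sketch:

```latex
F_{X_n}(x) =
\begin{cases}
0, & x \le -1/n,\\[2pt]
\dfrac{nx + 1}{2}, & -1/n < x < 1/n,\\[4pt]
1, & x \ge 1/n.
\end{cases}
```

For x < 0, F_{X_n}(x) → 0; for x > 0, F_{X_n}(x) → 1. So F_{X_n} converges to F_X at every continuity point of F_X (x = 0 is the only discontinuity), giving convergence in distribution. Moreover P(|X_n| ≥ ε) = 0 as soon as n > 1/ε, so Xn → X in probability as well.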
2. If X1, X2, ..., Xn are independent random variables with common mean μ and variances σ1², σ2², ..., σn², prove that Σ_i (X_i − X̄)² / [n(n−1)] is an unbiased estimate of var[X̄]. 3. Suppose that in Exercise 2 the variances are known. Let W = Σ_i u_i X_i ...
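A sketch of the unbiasedness computation asked for in Exercise 2, using the identity Σ(X_i − X̄)² = Σ X_i² − n X̄², together with E[X_i²] = σ_i² + μ² and Var(X̄) = Σ σ_i²/n²:

```latex
E\Bigl[\sum_i (X_i - \bar X)^2\Bigr]
 = \sum_i E[X_i^2] - n\,E[\bar X^2]
 = \sum_i (\sigma_i^2 + \mu^2) - n\Bigl(\frac{\sum_i \sigma_i^2}{n^2} + \mu^2\Bigr)
 = \frac{n-1}{n}\sum_i \sigma_i^2,
```

so dividing by n(n−1) yields Σ_i σ_i²/n² = Var(X̄), as required.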
Let X1, X2, ..., Xn be a random sample from a Gamma(α, θ) distribution. That is, f(x; α, θ) = (1 / (Γ(α) θ^α)) x^(α−1) e^(−x/θ), 0 < x < ∞, α > 0, θ > 0. Suppose α is known. a. Obtain a method of moments estimator of θ, θ̃. b. Obtain the maximum likelihood estimator of θ, θ̂. c. Is θ̂ an unbiased estimator for θ? Justify your answer. "Hint": E(X̄) = μ. d. Find Var(θ̂). "Hint": Var(X̄) = σ²/n. e. Find MSE(θ̂).
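With α known and E(X) = αθ, both the method-of-moments and maximum likelihood estimators reduce to X̄/α, which part (c)'s hint suggests is unbiased. A Monte Carlo sketch of that unbiasedness (an empirical check, not a proof; parameter values are arbitrary):

```python
import random

# Simulate many samples from Gamma(shape=alpha, scale=theta) and average the
# estimator xbar/alpha across replications; it should be near theta.
random.seed(3)
alpha, theta, n, reps = 2.0, 1.5, 10, 20_000

estimates = []
for _ in range(reps):
    sample = [random.gammavariate(alpha, theta) for _ in range(n)]
    estimates.append(sum(sample) / n / alpha)

mean_est = sum(estimates) / reps
print(round(mean_est, 2))  # should be near theta = 1.5
```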