7. Suppose that X1, ..., Xk are independent random variables, and Xi ~ Exp(βi) for i = 1, ..., k. Let Y = min(X1, ..., Xk). Show that Y ~ Exp(Σ_{i=1}^k βi).
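A quick simulation can sanity-check this claim (a sketch using NumPy; the rates 0.5, 1.0, 2.0 and the sample size are arbitrary choices, with Exp(β) read in the rate parameterization):

```python
import numpy as np

rng = np.random.default_rng(0)
rates = np.array([0.5, 1.0, 2.0])  # hypothetical rates beta_1, ..., beta_k
n = 200_000

# Draw X_i ~ Exp(beta_i) (NumPy's scale is 1/rate) and take the row-wise minimum.
samples = rng.exponential(scale=1.0 / rates, size=(n, len(rates)))
y = samples.min(axis=1)

# If Y ~ Exp(sum(beta_i)), its mean should be close to 1 / sum(beta_i).
print(y.mean(), 1.0 / rates.sum())
```

The empirical mean of the minimum should agree with 1/Σβi, consistent with the minimum of independent exponentials being exponential with the summed rate.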
6. (10 points) Suppose X ~ Exp(1) and Y = -ln(X). (a) Find the cumulative distribution function of Y. (b) Find the probability density function of Y. (c) Let X1, X2, ..., Xk be i.i.d. Exp(1), and let Mk = max(X1, ..., Xk) (the maximum of X1, ..., Xk). Find the probability density function of Mk. (Hint: P(min(X1, X2, X3) > t) = P(X1 > t, X2 > t, X3 > t); how about the max?) (d) Show that as k → ∞, the CDF of...
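Both parts can be checked numerically. The sketch below (NumPy; sample sizes and the choice k = 5 are arbitrary) compares the empirical CDF of Y = -ln X against exp(-e^{-t}), which follows from P(Y ≤ t) = P(X ≥ e^{-t}) for X ~ Exp(1), and checks the max analogue of the hint, P(Mk ≤ t) = (1 - e^{-t})^k:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(scale=1.0, size=200_000)  # X ~ Exp(1)
y = -np.log(x)                                # Y = -ln X

# P(Y <= t) = P(X >= e^{-t}) = exp(-e^{-t})  (the Gumbel CDF)
t = 0.0
empirical = (y <= t).mean()
theoretical = np.exp(-np.exp(-t))

# Max of k i.i.d. Exp(1): independence gives P(M_k <= t) = (1 - e^{-t})^k.
k = 5
m_k = rng.exponential(scale=1.0, size=(200_000, k)).max(axis=1)
print(empirical, theoretical, (m_k <= 1.0).mean(), (1 - np.exp(-1.0)) ** k)
```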
Please explain very carefully! 4. Suppose that x = (x1, ..., xn) is a sample from a N(μ, σ²) distribution where μ ∈ R, σ² > 0 are unknown. (a) (5 marks) Let μ + σz_p denote the p-th quantile of the N(μ, σ²) distribution. What does this mean? (b) (10 marks) Determine a UMVU estimate of μ + σz_p and justify your answer.
The random vector X = (X1, X2, ..., Xk)' is said to have a symmetric multivariate normal distribution if X ~ Nk(μ, Σ), where μ = μ1k, i.e., the mean of each Xi is equal to the same constant μ, and Σ is the equicorrelation dispersion matrix, i.e., all variances equal σ² and all correlations equal ρ. When k = 3, μ = 0, σ² = 2 and ρ = 1/2, find the probability that ... (Hint: recall that if X = (X1, ..., Xk)' has a continuous symmetric distribution, then all possible permutations of X1, ..., Xk ...)
In 10.11, let X1, X2, ..., Xm and Y1, Y2, ..., Yn be independent samples from N(μ, σ²) and N(μ, τ²), respectively, where μ, σ², τ² are unknown. Let ρ = τ²/σ² and g = m/n, and consider the problem of unbiased estimation of μ...
Please help. a) How many parameters does a mixture of m Gaussians have? b) Let x1, ..., xn be n observations drawn from a mixture of m Gaussians. Write down the log-likelihood function. (Hint: it should involve two summations.) c) Let 1 ≤ k ≤ m. Show that the maximum likelihood estimator for μk is given by ... d) A mixture of m univariate Gaussians has the PDF p(x) = Σ_{i=1}^m p_i N(x; μ_i, σ_i²), where each p_i > 0 and Σ_{i=1}^m p_i = 1 ...
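For part (b), the two summations can be written directly in code; a minimal sketch (NumPy; the example weights and component parameters are made up for illustration):

```python
import numpy as np

def mixture_loglik(x, p, mu, sigma):
    """Log-likelihood under a univariate Gaussian mixture:
    sum over observations j of log( sum over components i of
    p_i * N(x_j; mu_i, sigma_i^2) ) -- the two summations from the hint."""
    x = np.asarray(x)[:, None]  # shape (n, 1), broadcasts against (m,) parameters
    dens = np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
    return np.sum(np.log(dens @ p))  # dens @ p is the inner sum over components

rng = np.random.default_rng(2)
data = rng.normal(size=100)
ll = mixture_loglik(data, p=np.array([0.4, 0.6]),
                    mu=np.array([-1.0, 1.0]), sigma=np.array([1.0, 1.0]))
print(ll)
```

With a single component (p = [1]), this reduces to the ordinary normal log-likelihood, which is a convenient correctness check.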
Let X1, ..., Xn be a random sample from a distribution with pdf f(x) = (1/(√(2π) σx)) exp{-(ln x - μ)²/(2σ²)}, x > 0. (a) If σ and μ are both unknown, find a minimal sufficient statistic T. (b) If σ is known and μ is unknown, is T from the last part a sufficient statistic? Is it a minimal sufficient statistic? Prove your answer. (c) Let V = (∏_{i=1}^n Xi)^{1/n}; what is the distribution of V? Are V and ... independently distributed?
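Assuming the pdf is the lognormal density (consistent with part (c) involving ln and a geometric mean), the distribution of V can be checked by simulation: ln V = (1/n) Σ ln Xi ~ N(μ, σ²/n), so V is again lognormal. A sketch with arbitrary parameters:

```python
import numpy as np

rng = np.random.default_rng(3)
mu, sigma, n = 0.5, 0.8, 8  # hypothetical parameters and sample size
samples = rng.lognormal(mean=mu, sigma=sigma, size=(100_000, n))

# Geometric mean V = (prod X_i)^{1/n}; then ln V is the average of the
# i.i.d. N(mu, sigma^2) logs, so ln V ~ N(mu, sigma^2 / n).
v = samples.prod(axis=1) ** (1.0 / n)
log_v = np.log(v)
print(log_v.mean(), log_v.var())  # should be near mu and sigma^2 / n
```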
2. Suppose that ξ1, ξ2, ... are i.i.d. RVs with Eξ1 = μ and Var(ξ1) = σ² ∈ (0, ∞). Set X_k = 3ξ_k + 2, k = 1, 2, ..., and let S_n = X_1 + ··· + X_n, n ≥ 1. (a) Compute EX_k, Var(X_k) and Cov(X_j, X_k) for j ≠ k. (b) Find the limit lim_{n→∞} P((S_n - ES_n)/√(n Var(X_1)) ≤ r), r ∈ R, by writing S_n as a sum of independent RVs. From the form of the expression in (1), one could expect that the answer will be in terms of the standard normal DF ...
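Under one reading of the garbled statement, X_k = 3ξ_k + 2; the limit in (b) can then be illustrated by simulation. The choice of Exp(1) for ξ below is an arbitrary distribution with finite variance (so μ = σ² = 1, hence EX_k = 5 and Var(X_k) = 9):

```python
import numpy as np

rng = np.random.default_rng(4)
n, reps = 500, 20_000
xi = rng.exponential(scale=1.0, size=(reps, n))  # i.i.d. xi with mean 1, var 1
x = 3 * xi + 2                                   # X_k = 3*xi_k + 2
s = x.sum(axis=1)

# Standardize: (S_n - E S_n) / sqrt(n Var(X_1)) should be roughly N(0, 1) by the CLT.
z = (s - n * 5.0) / np.sqrt(n * 9.0)
print(z.mean(), z.std())  # should be near 0 and 1
```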
Exercise 2 (Monte Carlo integration). Let (X_k)_{k≥1} be i.i.d. Uniform([0, 1]) RVs and let f: [0, 1] → R be a continuous function. For each n ≥ 1, let I_n = (f(X_1) + f(X_2) + ··· + f(X_n))/n. (3) (i) Suppose ∫_0^1 |f(x)| dx < ∞. Show that I_n → ∫_0^1 f(x) dx in probability. (ii) Further assume that ∫_0^1 f(x)² dx < ∞. Use Chebyshev's inequality to show that P(|I_n - ∫_0^1 f(x) dx| ≥ a/√n) ≤ Var(f(X_1))/a² ≤ (1/a²) ∫_0^1 f(x)² dx. (4)
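A minimal sketch of the estimator I_n in (3), using f(x) = x² (an arbitrary test function whose integral over [0, 1] is 1/3):

```python
import numpy as np

rng = np.random.default_rng(5)

def mc_integrate(f, n):
    """I_n = (f(X_1) + ... + f(X_n)) / n for X_k ~ Uniform([0, 1])."""
    return f(rng.uniform(size=n)).mean()

# Estimate the integral of x^2 over [0, 1]; should approach 1/3 as n grows.
i_n = mc_integrate(lambda x: x ** 2, 500_000)
print(i_n)
```

The fluctuation of I_n around 1/3 is on the order of √(Var(f(X_1))/n), which is exactly the a/√n scale appearing in the Chebyshev bound (4).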