3. (20 marks) Suppose Y1, ..., Yn is a random sample of independent and identically distributed Gamma(...
10. (8 marks) Suppose Y1, ..., Yn is a random sample of independent and identically distributed random variables with density function given by ... (and zero elsewhere).
a) (5 marks) By conditioning (Definition 9.3), show that U is sufficient for θ.
b) (3 marks) By factorization (Theorem 9.4), show that U = ... is sufficient for θ.
Definition 9.3 (Sufficiency): The statistic U = g(Y1, ..., Yn) is said to be sufficient for θ if the conditional distribution of Y1, ..., Yn, given U, does not depend on θ.
Theorem...
(1 point) In Unit 3, I claimed that the sum of independent, identically distributed exponential random variables is a gamma random variable. Now that we know about moment generating functions, we can prove it. Let X be exponential with mean λ. The density is f(x) = (1/λ)e^(−x/λ) for x > 0.
a) Find the moment generating function of X, and evaluate it at t = 3.9.
The mgf of a gamma is more tedious to find, so I'll give it to you here. Let W ~ Gamma(n, λ)...
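As a check on part (a), the mgf calculation can be reproduced symbolically. A minimal sketch, assuming the mean-λ parametrisation f(x) = (1/λ)e^(−x/λ), and substituting t = 1/λ − u (with u > 0) so the integral visibly converges:

```python
import sympy as sp

x = sp.symbols('x', positive=True)
lam = sp.symbols('lam', positive=True)  # mean of the exponential (assumed parametrisation)
u = sp.symbols('u', positive=True)      # write t = 1/lam - u, so u > 0 <=> t < 1/lam

# E[e^{tX}] with density f(x) = exp(-x/lam)/lam: after substituting
# t = 1/lam - u, the integrand e^{tx} f(x) collapses to exp(-u*x)/lam.
mgf_in_u = sp.integrate(sp.exp(-u * x) / lam, (x, 0, sp.oo))  # = 1/(lam*u)

# Substitute u = 1/lam - t back to express the mgf in terms of t
t = sp.symbols('t')
M_X = sp.simplify(mgf_in_u.subs(u, 1 / lam - t))
print(M_X)  # equivalent to 1/(1 - lam*t)
```

Raising this to the nth power gives (1 − λt)^(−n), which is exactly the Gamma(n, λ) mgf quoted in the problem, so the sum of n iid exponentials is Gamma(n, λ).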
Suppose we have 5 independent and identically distributed random variables X1, X2, X3, X4, X5, each with the moment generating function M(t) = ... . Let the random variable Y be defined as Y = Σ Xi. The distribution of Y is
(a) Poisson with λ = 40
(b) Gamma with α = 10 and λ = 8
(c) Normal with μ = 40 and σ = 3.162
(d) Exponential with λ = 50
(e) Normal with μ = 50 and σ² = 15
Suppose we have 5 independent and identically distributed random variables X1, X2, X3, X4, X5, each with the moment generating function M(t) = ... . Let the random variable Y be defined as Y = Σ Xi. Find the probability that Y is larger than 9. Prove that the distribution you use is the exact distribution, not a Central Limit Theorem approximation.
Suppose that X and Y are independent, identically distributed, geometric random variables with parameter p. Show that P(X = i | X + Y = n) = 1/(n − 1), for i = 1, 2, ..., n − 1.
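The identity can also be checked numerically from the geometric pmf: the joint probability P(X = i, Y = n − i) = p²(1 − p)^(n−2) does not depend on i, so the conditional distribution given X + Y = n is uniform over {1, ..., n − 1}. A small sketch (the values of p and n are arbitrary illustrations):

```python
from scipy.stats import geom

# Numerical check of P(X = i | X + Y = n) = 1/(n - 1) for X, Y iid Geometric(p)
p, n = 0.3, 8  # arbitrary illustrative choices

# joint probability P(X = i, Y = n - i) for each admissible i
joint = [geom.pmf(i, p) * geom.pmf(n - i, p) for i in range(1, n)]
total = sum(joint)                  # = P(X + Y = n)
cond = [j / total for j in joint]   # conditional pmf given X + Y = n

print(cond)  # every entry equals 1/(n - 1) = 1/7 ≈ 0.142857
```

Each joint term is the same constant p²(1 − p)^(n−2), which cancels in the ratio, so the print shows seven identical values.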
Question 3 [15 marks] Let X1, ..., Xn be independent identically distributed random variables with common pdf
f_X(x; θ) = ... for x > 0, and 0 otherwise,
where θ > 0 is an unknown parameter.
(a) Let Y = ... . Show that Y ~ Γ(·, ·);
(b) Show that T = (1/n) Σ ... is an unbiased estimator of θ⁻¹, where ℓ(θ; X) is the log-likelihood function;
(c) Compute U = ...;
(d) What functions τ(θ) have unbiased estimators that attain the relevant...
(b) For n = 100, give an approximation for P(Y > 100).
(c) Let X̄ be the sample mean; approximate P(1.1 < X̄ < 1.2) for n = 100.
2. Consider a random sample X1, ..., Xn from the CDF F(x) = 1 − 1/x for x ∈ [1, ∞), and zero otherwise.
(a) Find the limiting distribution of X_{1:n}, the smallest order statistic.
(b) Find the limiting distribution of X̄.
(c) Find the limiting distribution of n ln X_{1:n}.
3. Suppose that X1, ..., Xn are iid N(0, σ²). Find a function of T(X) = Σ x...
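For part 2(c) above, the limiting distribution is in fact exact for every n: since P(X_{1:n} > x) = (1 − F(x))^n = x^(−n), substituting x = e^(y/n) gives P(n ln X_{1:n} > y) = e^(−y), the Exp(1) survival function. A quick numeric confirmation of that algebra:

```python
import math

# For F(x) = 1 - 1/x on [1, oo), the smallest order statistic satisfies
# P(X_{1:n} > x) = (1 - F(x))^n = x^(-n).
def surv_min(x, n):
    """P(X_{1:n} > x) for a sample of size n from F(x) = 1 - 1/x."""
    return x ** (-n)

def surv_n_log_min(y, n):
    """P(n * ln X_{1:n} > y), via the substitution x = e^(y/n)."""
    return surv_min(math.exp(y / n), n)

for n in (1, 5, 1000):
    for y in (0.1, 1.0, 3.0):
        # matches the Exp(1) survival function e^(-y) for every n
        print(n, y, surv_n_log_min(y, n), math.exp(-y))
```

Because (e^(y/n))^(−n) = e^(−y) identically, the two printed columns agree for every n, not just in the limit.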
[50] 1. Suppose that X1, X2, ... are independent and identically distributed Bernoulli random variables with success probability equal to an unknown parameter p ∈ (0, 1). Let p̂_n = (1/n) Σ Xi denote the sample proportion.
[10] a. To what does √n(p̂_n − p) converge in law?
[10] b. Use your answer to part a to propose an approximate 95% confidence interval for p.
[10] c. Find a real-valued function g such that √n(g(p̂_n) − g(p)) converges...
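For part b: the limit in part a is N(0, p(1 − p)), which leads to the usual Wald interval p̂ ± 1.96·sqrt(p̂(1 − p̂)/n) after plugging in p̂ for p. A minimal sketch (the counts below are hypothetical):

```python
import math

def wald_ci(successes, n, z=1.96):
    """Approximate 95% CI for p, from sqrt(n)(p_hat - p) -> N(0, p(1-p))."""
    p_hat = successes / n
    half = z * math.sqrt(p_hat * (1 - p_hat) / n)  # estimated SE times z
    return p_hat - half, p_hat + half

# hypothetical data: 40 successes in 100 trials
lo, hi = wald_ci(40, 100)
print(lo, hi)  # approximately (0.304, 0.496)
```

The interval is symmetric about p̂ = 0.4 with half-width 1.96·sqrt(0.4·0.6/100) ≈ 0.096.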
Question 4 [15 marks] The random variables X1, ..., Xn are independent and identically distributed with probability function
f_X(x) = p_X^x (1 − p_X)^(1−x) for x = 0, 1; 0 otherwise,
while the random variables Y1, ..., Yn are independent and identically distributed with probability function
f_Y(y) = C(2, y) p_Y^y (1 − p_Y)^(2−y) for y = 0, 1, 2; 0 otherwise,
where p_X and p_Y are between 0 and 1.
(a) Show that the MLEs of p_X and p_Y are p̂_X = (1/n) Σ Xi and p̂_Y = (1/(2n)) Σ Yi.
(b)...
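The closed-form MLEs in part (a) can be cross-checked by maximizing the log-likelihoods numerically. A sketch assuming a Bernoulli(p_X) model for the Xi and a Binomial(2, p_Y) model for the Yi (the latter inferred from the support {0, 1, 2} and the 2n denominator); the data below are hypothetical:

```python
import math
from scipy.optimize import minimize_scalar

# hypothetical samples
xs = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]  # Bernoulli(p_X)
ys = [0, 2, 1, 1, 2, 0, 1, 2, 2, 1]  # Binomial(2, p_Y) (assumed model)

def negll_bern(p):
    """Negative Bernoulli log-likelihood."""
    return -sum(x * math.log(p) + (1 - x) * math.log(1 - p) for x in xs)

def negll_binom2(p):
    """Negative Binomial(2, p) log-likelihood; C(2, y) is constant in p."""
    return -sum(y * math.log(p) + (2 - y) * math.log(1 - p) for y in ys)

px_hat = minimize_scalar(negll_bern, bounds=(1e-6, 1 - 1e-6), method='bounded').x
py_hat = minimize_scalar(negll_binom2, bounds=(1e-6, 1 - 1e-6), method='bounded').x

n = len(xs)
print(px_hat, sum(xs) / n)        # numeric maximizer vs (1/n) Σ Xi
print(py_hat, sum(ys) / (2 * n))  # numeric maximizer vs (1/(2n)) Σ Yi
```

Both numeric maximizers agree with the closed forms (here 6/10 = 0.6 and 12/20 = 0.6), which is what part (a) asks you to derive analytically.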
Let X and Y be two independent and identically distributed random variables with expected value 1 and variance 2.56. First, find a non-trivial upper bound for P(|X + Y − 2| ≥ 1). Now suppose that X and Y are independent and identically distributed N(1,2.56) random variables. What is P(|X + Y − 2| ≥ 1) exactly? Why is the upper bound first obtained so different from the exact probability obtained?
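Assuming Chebyshev's inequality is the intended tool for the bound: Var(X + Y) = 2(2.56) = 5.12 by independence, so Chebyshev gives P(|X + Y − 2| ≥ 1) ≤ 5.12/1² = 5.12, which here exceeds 1 and is therefore extremely loose, while the exact normal probability is only about 0.66. A sketch of both computations:

```python
from math import sqrt
from scipy.stats import norm

var_each = 2.56
var_sum = 2 * var_each  # Var(X + Y) = 5.12 by independence
k = 1.0                 # deviation size in P(|X + Y - 2| >= 1)

# Chebyshev: P(|S - E[S]| >= k) <= Var(S) / k^2
cheb_bound = var_sum / k**2

# Exact probability when X + Y ~ N(2, 5.12)
exact = 2 * norm.sf(k / sqrt(var_sum))

print(cheb_bound)  # 5.12 -- larger than 1, so uninformative at this deviation
print(exact)       # ≈ 0.659
```

The gap illustrates the last part of the question: Chebyshev holds for every distribution with this mean and variance, so it cannot exploit the normal shape and is far weaker than the exact calculation.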