Exercise 1 (Sum of i.i.d. Exp is Erlang). Let λ > 0 and let X1, X2, ..., Xn be i.i.d. random variables with Xi ~ Exp(λ). Find the PDF of X1 + ··· + Xn. Use the convolution formula and prove the result by induction.
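If the induction is carried out, the sum should be Erlang, i.e. Gamma(n, λ), with mean n/λ and variance n/λ². A quick Monte Carlo sanity check in Python (a sketch only; λ = 2, n = 5, and the sample size are arbitrary choices, and no substitute for the requested proof):

```python
import random

random.seed(0)
lam, n, trials = 2.0, 5, 200_000

# Sum of n i.i.d. Exp(lam) draws, repeated many times.
sums = [sum(random.expovariate(lam) for _ in range(n)) for _ in range(trials)]

# Erlang(n, lam) has mean n/lam and variance n/lam**2.
mean = sum(sums) / trials
var = sum((s - mean) ** 2 for s in sums) / trials
print(mean, n / lam)        # both near 2.5
print(var, n / lam ** 2)    # both near 1.25
```

Matching the first two moments is weaker than matching the PDF, but it catches most bookkeeping mistakes in the convolution.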
Exercise 5.23. Let (Xn)n≥1 be a sequence of i.i.d. Bernoulli(p) RVs and let Sn = X1 + ··· + Xn. (i) Let Zn = (Sn − np)/√(np(1 − p)). Show that as n → ∞, Zn converges in distribution to the standard normal RV Z ~ N(0, 1). (ii) Conclude that if Yn ~ Binomial(n, p), then (Yn − np)/√(np(1 − p)) also converges to Z in distribution. (iii) From (ii), deduce the approximation P(Yn ≤ x) ≈ Φ((x − np)/√(np(1 − p))), which becomes more accurate as n → ∞.
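For part (iii), the quality of the approximation can be checked numerically. The sketch below compares the exact Binomial CDF against the normal approximation at one illustrative point (n = 100, p = 0.3, x = 35 are arbitrary choices):

```python
import math

def phi(z):  # standard normal CDF
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def binom_cdf(x, n, p):  # exact P(Yn <= x)
    return sum(math.comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(x + 1))

n, p, x = 100, 0.3, 35
exact = binom_cdf(x, n, p)
approx = phi((x - n * p) / math.sqrt(n * p * (1 - p)))
print(exact, approx)  # the two values agree to within a few hundredths
```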
Let X1, X2, ..., Xn be a sequence of independent random variables, all having a common density function fX. Let A = Sn/n be their average. Find fA if (a) fX(x) = (1/√(2π)) e^{−x²/2} (normal density); (b) fX(x) = e^{−x}, x ≥ 0 (exponential density). Hint: write fA(x) in terms of fSn(x).
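The hint can be made concrete as follows (a sketch of where the change of variables leads; the exercise still asks for the full derivation):

```latex
f_A(x) = n\, f_{S_n}(nx).
\text{(a) } S_n \sim N(0, n) \;\Rightarrow\; f_A(x) = \sqrt{\tfrac{n}{2\pi}}\, e^{-n x^2/2}, \text{ i.e. } A \sim N(0, 1/n).
\text{(b) } S_n \sim \mathrm{Gamma}(n, 1) \;\Rightarrow\; f_A(x) = \frac{n\,(nx)^{n-1} e^{-nx}}{(n-1)!}, \quad x > 0.
```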
4. Let X1, X2, ..., Xn be n i.i.d. exponential random variables with parameter λ > 0. Let X(1) < X(2) < ··· < X(n) be their order statistics. Define Y1 = nX(1) and Yk = (n + 1 − k)(X(k) − X(k−1)) for 1 < k ≤ n. Find the joint probability density function of Y1, ..., Yn. Are they independent?
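The standard result behind this exercise (Rényi's representation) says the Yk are i.i.d. Exp(λ); a simulation can at least check that each Yk has the right mean (a sketch; λ = 1, n = 4, and the sample size are arbitrary choices):

```python
import random

random.seed(1)
lam, n, trials = 1.0, 4, 100_000

ys = [[] for _ in range(n)]
for _ in range(trials):
    xs = sorted(random.expovariate(lam) for _ in range(n))
    prev = 0.0
    for k in range(1, n + 1):
        # Y_1 = n * X_(1); Y_k = (n + 1 - k)(X_(k) - X_(k-1)) for k > 1
        ys[k - 1].append((n + 1 - k) * (xs[k - 1] - prev))
        prev = xs[k - 1]

# If the Y_k are i.i.d. Exp(lam), each sample mean should be near 1/lam.
means = [sum(y) / trials for y in ys]
print(means)  # each entry near 1.0
```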
9. Let X1, X2, ..., Xn be an independent trials process with normal density of mean 1 and variance 2. Find the moment generating function for (a) X1, (b) S2 = X1 + X2, (c) Sn = X1 + X2 + ··· + Xn, (d) An = Sn/n.
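Using the normal MGF M(t) = e^{μt + σ²t²/2} with μ = 1 and σ² = 2, the answers work out as follows (a sketch of the computation, worth re-deriving):

```latex
\text{(a) } M_{X_1}(t) = e^{t + t^2},
\qquad \text{(b) } M_{S_2}(t) = M_{X_1}(t)^2 = e^{2t + 2t^2},
\qquad \text{(c) } M_{S_n}(t) = e^{nt + nt^2},
\qquad \text{(d) } M_{A_n}(t) = M_{S_n}(t/n) = e^{t + t^2/n}.
```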
2. Let X1 ~ Exp(1) and X2 ~ Exp(1) be independent and identically distributed exponential random variables with rate 1. (a) What is the CDF of X1? (b) What is the joint pdf of (X1, X2)? (c) What is the joint pdf of (Y, Z)? (d) What is the marginal pdf of Z?
3. (8 pts.) Suppose that X1 ~ Exp(λ1) and X2 ~ Exp(λ2), where λ1 and λ2 are positive constants. Do not assume λ1 = λ2, but do assume that X1 and X2 are independent. Compute P(X1 < X2). Now note that the probability you just computed is in fact P(X1 = min(X1, X2)). This suggests the following generalization. Suppose we have a collection of N independent exponential random variables, X1, X2, ...
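The first computation should come out to λ1/(λ1 + λ2); a Monte Carlo check (a sketch; the rates and sample size are arbitrary choices):

```python
import random

random.seed(2)
lam1, lam2, trials = 1.5, 0.5, 200_000

# Estimate P(X1 < X2) for independent X1 ~ Exp(lam1), X2 ~ Exp(lam2).
hits = sum(random.expovariate(lam1) < random.expovariate(lam2) for _ in range(trials))
estimate = hits / trials
theory = lam1 / (lam1 + lam2)   # = 0.75 for these rates
print(estimate, theory)
```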
(5) Let X1, X2, ..., Xn be independent identically distributed (i.i.d.) random variables from U(0, 1). Denote V = max{X1, ..., Xn} and W = min{X1, ..., Xn}. (a) Find the distributions and densities of each of V and W. (b) Find E(V) and E(W).
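For part (b), F_V(x) = xⁿ and F_W(x) = 1 − (1 − x)ⁿ lead to E(V) = n/(n + 1) and E(W) = 1/(n + 1); a quick simulation check (n = 5 and the sample size are arbitrary choices):

```python
import random

random.seed(3)
n, trials = 5, 200_000

v_sum = w_sum = 0.0
for _ in range(trials):
    xs = [random.random() for _ in range(n)]
    v_sum += max(xs)   # V = max of n i.i.d. U(0, 1) draws
    w_sum += min(xs)   # W = min of the same draws

v_mean, w_mean = v_sum / trials, w_sum / trials
print(v_mean, n / (n + 1))   # both near 5/6
print(w_mean, 1 / (n + 1))   # both near 1/6
```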
1. Let X1, X2, ..., Xn be an i.i.d. sample from Exp(1), and Y = Σⁿᵢ₌₁ Xi. (a) Use the CLT to get a large-sample distribution of Y. (b) For n = 100, give an approximation for P(Y > 100). (c) Let X̄ be the sample mean; approximate P(1.1 < X̄ < 1.2) for n = 100.
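Since Exp(1) has mean 1 and variance 1, the CLT gives Y ≈ N(n, n) and X̄ ≈ N(1, 1/n). For n = 100 the approximations in (b) and (c) can be evaluated directly (a sketch using the standard normal CDF via math.erf):

```python
import math

def phi(z):                       # standard normal CDF
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

n = 100                           # Exp(1): mean 1, variance 1

# (b) Y = X1 + ... + Xn is approximately N(n, n) by the CLT.
p_b = 1.0 - phi((100 - n) / math.sqrt(n))
print(p_b)                        # exactly 0.5 under the normal approximation

# (c) Xbar is approximately N(1, 1/n), i.e. standard deviation 0.1.
p_c = phi((1.2 - 1.0) / 0.1) - phi((1.1 - 1.0) / 0.1)
print(p_c)                        # about 0.1359
```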
Central Limit Theorem: let X1, X2, ..., Xn be i.i.d. random variables with E(Xi) = μ and Var(Xi) = σ². Define Z = X1 + X2 + ··· + Xn. The distribution of Z converges to a Gaussian: P(Z ≤ z) ≈ 1 − Q((z − μZ)/σZ), where μZ = nμ and σZ = σ√n. Use MATLAB to demonstrate the central limit theorem. To achieve this, generate N random variables (i.i.d., with a distribution of your choice) and show that the distribution of the sum approaches a Gaussian distribution. Plot the distribution and include the MATLAB code. Hint: you may find the hist() function helpful.
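The exercise asks for MATLAB; as an illustration of the same experiment, here is a Python sketch (the U(0, 1) summands, n = 30 terms per sum, the number of trials, and the checkpoints are all arbitrary choices). Instead of a hist()-style plot, it compares the empirical CDF of the standardized sums against Φ:

```python
import bisect
import math
import random

random.seed(4)
n, trials = 30, 50_000            # terms per sum / number of sums

mu, sigma2 = 0.5, 1.0 / 12.0      # mean and variance of U(0, 1)
mu_z, sd_z = n * mu, math.sqrt(n * sigma2)

# Standardized sums; by the CLT their distribution approaches N(0, 1).
zs = sorted((sum(random.random() for _ in range(n)) - mu_z) / sd_z
            for _ in range(trials))

def phi(z):                       # standard normal CDF, equals 1 - Q(z)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def ecdf(z):                      # empirical CDF of the standardized sums
    return bisect.bisect_right(zs, z) / len(zs)

for z in (-1.0, 0.0, 1.0):
    print(z, ecdf(z), phi(z))     # empirical and Gaussian CDF should be close
```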