Let X1 and X2 be two independent continuous random variables. Define S = I(X1 > 0) + 2 I(X2 > 0), where I denotes the indicator function, and …
Let (X1, Y1) and (X2, Y2) be independent and identically distributed continuous bivariate random variables with joint probability density function f_{X,Y}(x, y) = e^{−y} for 0 < x < y < ∞, and 0 elsewhere. Evaluate P(X2 > X1, Y2 > Y1) + P(X2 < X1, Y2 < Y1).
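One way to sanity-check an answer here is Monte Carlo sampling: under this density the marginal of Y is Gamma(2, 1) and X | Y = y is Uniform(0, y), so sampling is straightforward. A minimal sketch (names are illustrative, not from the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

def sample_pairs(n):
    # Marginal of Y: f_Y(y) = y e^{-y}, i.e. Gamma(2, 1);
    # conditionally, X | Y = y is Uniform(0, y).
    y = rng.gamma(shape=2.0, scale=1.0, size=n)
    x = rng.uniform(0.0, y)
    return x, y

x1, y1 = sample_pairs(n)
x2, y2 = sample_pairs(n)

# The two events have equal probability because the pairs are i.i.d.
p = np.mean((x2 > x1) & (y2 > y1)) + np.mean((x2 < x1) & (y2 < y1))
print(f"estimate: {p:.3f}")
```

The estimate can then be compared against whatever closed form the integration yields.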
Let X1, X2 be two independent exponential random variables with λ = 1. Compute P(X1 + X2 < t) using the joint density function. Now let Z be a gamma random variable with parameters (2, 1), and compute P(Z < t). What can you conclude by comparing P(X1 + X2 < t) and P(Z < t)? Then compare P(X1 + X2 + X3 < t), where the Xi are iid (independent and identically distributed) ~ Exp(1), with P(Z < t) for Z ~ Gamma(3, 1). (You do not have to compute this.) (Hint: you can use the facts that Γ(2) = 1 and Γ(3) = 2.) Problem 2 [10 points] Let …
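For the first comparison, the Gamma(2, 1) CDF has the closed form P(Z < t) = 1 − e^{−t}(1 + t) (using Γ(2) = 1), so a simulation of X1 + X2 can be checked against it directly. A minimal sketch (the choice t = 3 is illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
t = 3.0
n = 200_000

s = rng.exponential(1.0, size=(n, 2)).sum(axis=1)   # X1 + X2 with Xi ~ Exp(1)
empirical = np.mean(s < t)                          # P(X1 + X2 < t) by simulation
gamma_cdf = 1.0 - np.exp(-t) * (1.0 + t)            # P(Z < t) for Z ~ Gamma(2, 1)
print(empirical, gamma_cdf)                         # the two should agree
```

The agreement is the point of the exercise: a sum of two independent Exp(1) variables is Gamma(2, 1), and the same pattern extends to three summands and Gamma(3, 1).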
15. Let X1, X2, … be independent, identically distributed random variables with E|X1| < ∞, and denote Sn = X1 + … + Xn. Prove that E[X1 | Sn] = Sn/n. [Use symmetry in the final step.]
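Reading the garbled claim as the standard identity E[X1 | Sn] = Sn/n (an assumption on my part), it can be checked numerically by conditioning approximately, i.e. restricting to sample paths whose Sn lands in a narrow bin. A hedged sketch with Exp(1) variables (my choice; any integrable distribution works):

```python
import numpy as np

rng = np.random.default_rng(2)
n, trials = 5, 400_000

x = rng.exponential(1.0, size=(trials, n))
s = x.sum(axis=1)

mask = (s > 4.9) & (s < 5.1)   # condition approximately on S_n ≈ 5
lhs = x[mask, 0].mean()        # ≈ E[X1 | S_n ≈ 5]
rhs = s[mask].mean() / n       # ≈ S_n / n
print(lhs, rhs)                # the two should be close
```

The binning only approximates true conditioning, but the agreement is already visible at this sample size.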
Let X1 and X2 be two random variables, and let Y = (X1 + X2)². Suppose that E[Y] = 25 and that the variances of X1 and X2 are 9 and 16, respectively. Suppose that both X1 and X2 have mean zero. Then the …
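The standard expansion behind this setup (a sketch; it uses the zero-mean assumption, under which E[Xi²] = Var(Xi) and E[X1 X2] = Cov(X1, X2)):

```latex
\mathbb{E}[Y] = \mathbb{E}\!\left[(X_1+X_2)^2\right]
             = \operatorname{Var}(X_1) + \operatorname{Var}(X_2) + 2\operatorname{Cov}(X_1,X_2),
\qquad
25 = 9 + 16 + 2\operatorname{Cov}(X_1,X_2)
\;\Longrightarrow\;
\operatorname{Cov}(X_1,X_2) = 0 .
```

Whatever the truncated question asks for, it will follow from this identity.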
3. Suppose X1, X2, … are independent identically distributed random variables with mean 0 and variance 1. Let Sn denote the partial sum Sn = X1 + … + Xn, and let F_n denote the information contained in X1, …, Xn. Suppose m < n. (1) Compute E[(Sn − Sm) | F_m]. (2) Compute E[Sm(Sn − Sm) | F_m]. (3) Compute E[Sn² | F_m]. (Hint: write Sn² = (Sm + (Sn − Sm))² = Sm² + 2Sm(Sn − Sm) + (Sn − Sm)².) (4) Verify that Sn² − n is a martingale.
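The unconditional consequences of parts (1), (2), and (4) are easy to spot-check by simulation: E[Sn − Sm] = 0, E[Sm(Sn − Sm)] = 0, and E[Sn² − n] = 0. A minimal sketch using ±1 coin flips, which have mean 0 and variance 1 (the choice of distribution and of m, n is illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
m, n, trials = 10, 25, 400_000

x = rng.choice([-1.0, 1.0], size=(trials, n))  # iid, mean 0, variance 1
s = np.cumsum(x, axis=1)
sm, sn = s[:, m - 1], s[:, n - 1]

print(np.mean(sn - sm))          # ≈ 0: the increment has mean zero
print(np.mean(sm * (sn - sm)))   # ≈ 0: the increment is independent of F_m
print(np.mean(sn**2) - n)        # ≈ 0: E[S_n^2] = n, so S_n^2 - n is centered
```

These checks do not prove the conditional statements, but any bug in the algebra would typically show up here first.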
13. Let X1, X2, …, XN be a sequence of independent and identically distributed discrete random variables, each with probability mass function P(X = k) = e^{−λ} λ^k / k!, for k = 0, 1, 2, 3, …. (a) Find the expected value and the variance of the sample mean X̄ = (1/N) Σ_{i=1}^{N} Xi. (b) Find the probability mass function of X̄. (c) Find an approximate pdf of X̄ when N is very large (N → ∞).
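Assuming the pmf is Poisson, as the garbled formula fragment suggests, part (a) gives E[X̄] = λ and Var(X̄) = λ/N, which a quick simulation confirms. A sketch with illustrative values of λ and N:

```python
import numpy as np

rng = np.random.default_rng(4)
lam, N, trials = 2.0, 50, 200_000

# Each row is one sample of size N; its mean is one draw of the sample mean.
xbar = rng.poisson(lam, size=(trials, N)).mean(axis=1)
print(xbar.mean())   # ≈ lam      (E[X̄] = λ)
print(xbar.var())    # ≈ lam / N  (Var[X̄] = λ/N)
```

A histogram of `xbar` would also show the approximate normal shape asked about in part (c), by the central limit theorem.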
(5) Let X1, X2, …, Xn be independent identically distributed (i.i.d.) random variables from U(0, 1). Denote V = max{X1, …, Xn} and W = min{X1, …, Xn}. (a) Find the distributions and the densities of each of V and W. (b) Find E(V) and E(W).
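The standard answers here are F_V(v) = v^n and F_W(w) = 1 − (1 − w)^n on (0, 1), giving E(V) = n/(n + 1) and E(W) = 1/(n + 1); the expectations are easy to confirm by simulation. A sketch with an illustrative n:

```python
import numpy as np

rng = np.random.default_rng(5)
n, trials = 4, 300_000

u = rng.uniform(0.0, 1.0, size=(trials, n))
v, w = u.max(axis=1), u.min(axis=1)
print(v.mean())   # ≈ n / (n + 1) = 0.8 for n = 4
print(w.mean())   # ≈ 1 / (n + 1) = 0.2 for n = 4
```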
Let X1 and X2 be two independent standard normal random variables. Define two new random variables as follows: Y1 = X1 + X2 and Y2 = X1 + B·X2. You are not given the constant B, but it is known that Cov(Y1, Y2) = 0. Find (a) the density of Y1, (b) Cov(X2, Y2).
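Reading the garbled definitions as Y1 = X1 + X2 and Y2 = X1 + B·X2 (an assumption on my part), Cov(Y1, Y2) = 1 + B, so B = −1, and then Cov(X2, Y2) = B. A simulation sketch under that reading:

```python
import numpy as np

rng = np.random.default_rng(6)
n = 400_000
x1, x2 = rng.standard_normal(n), rng.standard_normal(n)

B = -1.0                      # forces Cov(Y1, Y2) = 1 + B = 0 under this reading
y1, y2 = x1 + x2, x1 + B * x2

print(np.cov(y1, y2)[0, 1])   # ≈ 0: Y1 and Y2 are uncorrelated
print(np.cov(x2, y2)[0, 1])   # ≈ B: Cov(X2, Y2) = Cov(X2, X1) + B Var(X2)
```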
4. Let X1, …, Xn be independent, identically distributed random variables with common density f(x; θ) = (1/(θx√(2π))) exp(−(log x)²/(2θ²)), x > 0 (θ > 0). (a) Find the form of the critical region C* for the most powerful test of H0: θ = 1 vs. HA: θ > 1. (b) Suppose n = 20 and α = 0.10. Find the specific value for the cutoff value K from the critical region C* in part (a). (Hint: show that Y = ((log X)/θ)² is χ²(1).)
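If f is the lognormal density with scale parameter θ, as the garbled formula suggests, then under H0 (θ = 1) each log Xi ~ N(0, 1), so the natural test statistic Σ(log Xi)² is χ² with n = 20 degrees of freedom and K is its upper-α quantile. Rather than assume a χ² table or SciPy, the sketch below estimates K by simulating that null distribution:

```python
import numpy as np

rng = np.random.default_rng(7)
n, trials, alpha = 20, 400_000, 0.10

# Under H0, log X ~ N(0, 1), so sum((log Xi)^2) ~ chi-square(n).
stat = (rng.standard_normal((trials, n)) ** 2).sum(axis=1)
K = np.quantile(stat, 1.0 - alpha)
print(K)   # the chi-square(20) upper 10% point, roughly 28.4
```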
Let X1, X2, … be independent continuous random variables with a common distribution function F and density f. For k > 1, let N_k = min{n ≥ k : Xn = kth largest of X1, …, Xn}. (a) Show that P(N_k = n) = (k − 1)/(n(n − 1)), n ≥ k. (b) Argue that f_{X_{N_k}}(x) = f(x) (F̄(x))^{k−1} Σ_{i=k}^{∞} (i−2 choose k−2) (F(x))^{i−k}, where F̄ = 1 − F. (c) Prove the following identity: α^{1−k} = Σ_{i=k}^{∞} (i−2 choose k−2) (1 − α)^{i−k}, α ∈ (0, 1), k ≥ 2.
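If the reconstruction above is right, the parts combine into a neat punchline: applying the identity in (c) with α = F̄(x) collapses the sum in (b) to f_{X_{N_k}}(x) = f(x), i.e. X_{N_k} is distributed like a single fresh observation. A simulation sketch for k = 2 with Uniform(0, 1) variables (my choice of F); trials are capped in length because N_k has a heavy 1/n tail:

```python
import bisect

import numpy as np

rng = np.random.default_rng(8)
k, trials, cap = 2, 20_000, 10_000

def sample_XNk():
    # N_k = first n >= k at which X_n is the kth largest of X_1, ..., X_n.
    top = []                        # the k largest values so far, ascending
    for n in range(1, cap + 1):
        x = rng.uniform()
        bisect.insort(top, x)
        if len(top) > k:
            top.pop(0)              # keep only the k largest
        if n >= k and top[0] == x:  # x is currently the kth largest
            return n, x
    return None                     # very rare under the cap; discarded

draws = [d for d in (sample_XNk() for _ in range(trials)) if d]
ns = np.array([d[0] for d in draws])
vals = np.array([d[1] for d in draws])

print(np.mean(ns == k))   # ≈ (k-1)/(k(k-1)) = 1/2 for k = 2, matching (a)
print(vals.mean())        # ≈ 0.5, consistent with X_{N_k} ~ Uniform(0, 1)
```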