Let X1 and X2 be independent random variables with distribution functions F1 and F2, respectively. Let Y be a Bernoulli random variable with parameter p. Suppose that Y, X1, and X2 are independent. Prove u...
Q2 Suppose X1, X2, X3 are independent Bernoulli random variables with p = 0.5. Let Yi be the partial sums, i.e., Y1 = X1, Y2 = X1 + X2, Y3 = X1 + X2 + X3. 1. What is the distribution of each Yi, i = 1, 2, 3? 2. What is the expected value of Y1 + Y2 + Y3? 3. Are Y1 and Y2 independent? Explain by computing their joint p.m.f. 4. What is the variance of Y1...
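Parts 1 and 2 of Q2 can be sanity-checked by simulation: each Yi is Binomial(i, 0.5), so E[Y1 + Y2 + Y3] = 0.5 + 1.0 + 1.5 = 3.0. A minimal Monte Carlo sketch (the sample size is an arbitrary choice):

```python
import random

random.seed(0)
N = 200_000  # Monte Carlo sample size (arbitrary)

# Y1 = X1, Y2 = X1 + X2, Y3 = X1 + X2 + X3 with Xj ~ Bernoulli(0.5),
# so Yi ~ Binomial(i, 0.5) and E[Y1 + Y2 + Y3] = 0.5 + 1.0 + 1.5 = 3.0.
total = 0.0
for _ in range(N):
    x1, x2, x3 = (random.random() < 0.5 for _ in range(3))
    y1, y2, y3 = x1, x1 + x2, x1 + x2 + x3
    total += y1 + y2 + y3

print(total / N)  # should be close to 3.0
```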
Let X1, X2, ..., Xr be independent exponential random variables with parameter λ. a. Find the moment-generating function of Y = X1 + X2 + ... + Xr. b. What is the distribution of the random variable Y?
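For part (b), the MGF of the sum is (λ/(λ − t))^r, which identifies Y as Gamma(r, λ), so E[Y] = r/λ and Var(Y) = r/λ². A quick numerical check of those moments (the values λ = 2, r = 5 are assumptions for the demo):

```python
import random

random.seed(1)
lam, r, N = 2.0, 5, 200_000  # example parameters (assumed for the demo)

# Y = X1 + ... + Xr with Xi ~ Exp(lam) has MGF (lam / (lam - t))^r,
# i.e., Y ~ Gamma(r, lam): mean r/lam = 2.5, variance r/lam^2 = 1.25.
samples = [sum(random.expovariate(lam) for _ in range(r)) for _ in range(N)]
mean = sum(samples) / N
var = sum((s - mean) ** 2 for s in samples) / N
print(mean, var)  # close to 2.5 and 1.25
```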
(2) Given two independent random variables X1 and X2 having a Bernoulli distribution with parameter p = 1/3, let Y1 = 2X1 and Y2 = 2X2. Then: A. E[Y1 · Y2] = 2/9  B. E[Y1 · Y2] = 4/9  C. P[Y1 · Y2 = 0] = 1/9  D. P[Y1 · Y2 = 0] = 2/9  (3) Let X and Y be two independent random variables having a Gaussian (normal) distribution with mean 0 and variance 2. Then: A. P[X + Y > 2] > 0.5  B...
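The two quantities in part (2) can be computed exactly by enumerating the four outcomes of (X1, X2); using exact rationals avoids float rounding:

```python
from itertools import product
from fractions import Fraction

p = Fraction(1, 3)  # P(Xi = 1)

# Exhaustive check for part (2): Y1 = 2*X1, Y2 = 2*X2, Xi ~ Bernoulli(1/3).
e_y1y2 = Fraction(0)
p_zero = Fraction(0)
for x1, x2 in product((0, 1), repeat=2):
    prob = (p if x1 else 1 - p) * (p if x2 else 1 - p)
    y1, y2 = 2 * x1, 2 * x2
    e_y1y2 += prob * y1 * y2
    if y1 * y2 == 0:
        p_zero += prob

print(e_y1y2, p_zero)  # 4/9 8/9
```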
Exercise 6.48. Let X1, X2, ..., Xn be independent exponential random variables, with parameter λi for Xi. Let Y be the minimum of these random variables. Show that Y ~ Exp(λ1 + ... + λn).
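The claim of Exercise 6.48 implies E[Y] = 1/(λ1 + ... + λn), which is easy to verify by simulation (the three rates below are assumed example values):

```python
import random

random.seed(2)
rates = [0.5, 1.0, 2.5]   # example rates λ1, λ2, λ3 (assumed for the demo)
total_rate = sum(rates)   # the claim: min(X1, ..., Xn) ~ Exp(λ1 + ... + λn)
N = 200_000

# Empirical mean of the minimum should match 1 / total_rate = 0.25.
mins = [min(random.expovariate(lam) for lam in rates) for _ in range(N)]
mean = sum(mins) / N
print(mean, 1 / total_rate)  # both close to 0.25
```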
Xi, i = 1, 2, 3, 4, are independent and identically distributed Bernoulli variables with parameter p = 0.6. Find P(X1 = X2), P(X1 = X2 ≠ X3), E[2X1 + 3X2 − 5], and E[(X1 + X4)^3].
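With only four Bernoulli variables, all four requested quantities can be computed exactly by summing over the 2^4 outcomes:

```python
from itertools import product

p = 0.6  # P(Xi = 1)

# Exhaustive enumeration over the 16 outcomes of (X1, X2, X3, X4).
p_eq = p_eq_ne = e_lin = e_cube = 0.0
for xs in product((0, 1), repeat=4):
    prob = 1.0
    for x in xs:
        prob *= p if x else 1 - p
    x1, x2, x3, x4 = xs
    p_eq += prob * (x1 == x2)                    # P(X1 = X2)
    p_eq_ne += prob * ((x1 == x2) and (x2 != x3))  # P(X1 = X2 != X3)
    e_lin += prob * (2 * x1 + 3 * x2 - 5)        # E[2X1 + 3X2 - 5]
    e_cube += prob * (x1 + x4) ** 3              # E[(X1 + X4)^3]

print(p_eq, p_eq_ne, e_lin, e_cube)  # 0.52, 0.24, -2.0, 3.36 (up to float rounding)
```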
Let Xi, i = 1, 2, ..., n, be independent Bernoulli(p) random variables and let Ȳ = (1/n)(X1 + ... + Xn). Use the delta method to find the limiting distribution of g(Ȳ) = Ȳ(1 − Ȳ), for p ≠ 1/2.
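The delta method gives √n(g(Ȳ) − g(p)) → N(0, p(1 − p)(1 − 2p)²) since g'(p) = 1 − 2p. This can be sanity-checked numerically; the values p = 0.3, n = 400 below are assumptions for the demo:

```python
import math
import random

random.seed(3)
p, n, reps = 0.3, 400, 20_000  # example parameters (assumed for the demo)

# Delta method: sqrt(n) * (g(Ybar) - g(p)) -> N(0, p(1-p) * g'(p)^2)
# with g(y) = y(1 - y) and g'(p) = 1 - 2p, valid only for p != 1/2.
g = lambda y: y * (1 - y)
vals = []
for _ in range(reps):
    ybar = sum(random.random() < p for _ in range(n)) / n
    vals.append(math.sqrt(n) * (g(ybar) - g(p)))

mean = sum(vals) / reps
var = sum((v - mean) ** 2 for v in vals) / reps
print(var, p * (1 - p) * (1 - 2 * p) ** 2)  # both near 0.0336
```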
7. Let X1, ..., Xn be a random sample from a Bernoulli distribution with parameter p. A random variable X with a Bernoulli distribution has probability mass function (pmf) f(x; p) = p^x (1 − p)^(1−x), x ∈ {0, 1}, with E(X) = p and Var(X) = p(1 − p). (a) Find the method of moments (MOM) estimator of p. (b) Find a sufficient statistic for p. (Hint: Be careful when you write the joint pmf. Don't forget to sum the whole power of each term; that is, for the second term you will have (1...
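For part (a), equating the first sample moment to E(X) = p gives the MOM estimator p̂ = X̄, and the sum ΣXi it depends on is the sufficient statistic asked for in (b). A minimal illustration (true_p = 0.35 is an assumed value):

```python
import random

random.seed(4)
true_p, n = 0.35, 50_000  # example values (assumed for the demo)

# MOM for Bernoulli(p): E(X) = p, so p_hat = sample mean; the estimator
# is a function of T = sum(Xi), the sufficient statistic from factorization.
sample = [1 if random.random() < true_p else 0 for _ in range(n)]
p_hat = sum(sample) / n
print(p_hat)  # close to 0.35
```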
Let X1, X2 be two independent exponential random variables with λ = 1. Compute P(X1 + X2 < t) using the joint density function. Then let Z be a gamma random variable with parameters (2, 1) and compute P(Z < t). What do you find by comparing P(X1 + X2 < t) and P(Z < t)? Also compare P(X1 + X2 + X3 < t), with Xi iid (independent and identically distributed) ~ Exp(1), and P(Z < t) with Z ~ Gamma(3, 1). (You don't have to compute.) (Hint: You can use the fact that Γ(2) = 1, Γ(3) = 2.)
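The comparison in the exercise above can be previewed numerically: a Monte Carlo estimate of P(X1 + X2 < t) against the closed-form Gamma(2, 1) CDF, 1 − e^(−t)(1 + t), which uses Γ(2) = 1 (the value t = 1.5 is an arbitrary choice):

```python
import math
import random

random.seed(5)
t, N = 1.5, 200_000  # evaluation point and sample size (arbitrary choices)

# Monte Carlo estimate of P(X1 + X2 < t), X1, X2 iid Exp(1), versus the
# Gamma(2, 1) CDF: P(Z < t) = 1 - e^{-t} (1 + t), using Gamma(2) = 1.
mc = sum(random.expovariate(1.0) + random.expovariate(1.0) < t
         for _ in range(N)) / N
exact = 1 - math.exp(-t) * (1 + t)
print(mc, exact)  # both close to 0.442
```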
Problem 2 [10 points] Let...
(4 points) Let X1, X2 be independent random variables, with X1 uniform on (3, 9) and X2 uniform on (3, 12). Find the joint density of Y = X1/X2 and Z = X1 X2 on the support of (Y, Z). f(y, z) =
Let N, X1, X2, ... be random variables over a probability space. It is assumed that N takes nonnegative integer values. Let Z = max{X1, ..., XN} and W = min{X1, ..., XN}. Find the distribution functions of Z and W, supposing that N, X1, X2, ... are independent random variables and the Xi have the same distribution function F, and a) N − 1 is a geometric random variable with parameter p (P(N = k), k = 1, 2, ...); b) N − 1 is a Poisson random variable with...