Question 7
Two discrete random variables Z1 and Z2 are independent if and only if P(Z1 = z1, Z2 = z2) = P(Z1 = z1) * P(Z2 = z2), for all possible pairs (z1, z2).
Here, the marginal mass functions of X and Y are obtained by summing the joint mass function over the other variable.
Now, for every pair (x, y), the joint probability P(X = x, Y = y) equals the product P(X = x) * P(Y = y) of the marginals.
Hence, X and Y are mutually independent.
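The factorization check above can be sketched in code. Since the joint table from the problem statement is not reproduced here, the pmf below is a hypothetical stand-in (each of the four value pairs equally likely); the same loop works for any finite joint table.

```python
from itertools import product

# Hypothetical joint pmf (the actual table from Question 7 is not shown here):
# assume X and Y each take the values -1 and 1, with all four pairs equally likely.
joint = {(x, y): 0.25 for x, y in product([-1, 1], [-1, 1])}

# Marginal pmfs, obtained by summing the joint table over the other variable.
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

# X and Y are independent iff the joint pmf factorizes for every pair.
independent = all(abs(joint[(x, y)] - px[x] * py[y]) < 1e-12 for (x, y) in joint)
print(independent)  # True for this table
```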
Question 8
Let b be the least squares estimator of the slope β, so b = Σ xi yi / Σ xi².
Then b is (approximately) normally distributed with mean β and variance σ² / Σ xi².
Hence, a 68% confidence interval for β will be given by (b - d, b + d), where d is the 68% cutoff times the estimated standard error of b.
Here, b = 100.23 / 220.11 = 0.455 (approximately)
Since we do not know the value of σ², we have to use its estimate to get the value of d. Strictly, we should then use the cutoff based on the t23 distribution, since σ² is replaced by its estimate; but as an approximation, one can use the normal cutoff instead.
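As a check, the normal 68% cutoff can be obtained from the standard library; this is a sketch of where 0.994 comes from (the exact t-cutoff with 23 degrees of freedom would be slightly larger).

```python
from statistics import NormalDist

# Two-sided 68% cutoff = 84th percentile of the standard normal,
# since (1 + 0.68) / 2 = 0.84. This reproduces the value 0.994.
z68 = NormalDist().inv_cdf((1 + 0.68) / 2)
print(round(z68, 2))  # 0.99
```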
Hence, d = 0.994 * sqrt(4.19 / 220.11) = 0.137 (approximately)
Thus, the 68% confidence interval is given by (0.455 - 0.137, 0.455 + 0.137) = (0.318, 0.592).
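With the numbers quoted above (Σ xy = 100.23, Σ x² = 220.11, estimated σ² = 4.19, cutoff 0.994), the interval can be reproduced directly:

```python
import math

# Quantities quoted in the solution: sum of x*y, sum of x^2,
# estimated sigma^2, and the approximate 68% normal cutoff.
sxy, sxx, s2, z68 = 100.23, 220.11, 4.19, 0.994

b = sxy / sxx                  # slope estimate, approx 0.455
d = z68 * math.sqrt(s2 / sxx)  # half-width of the interval, approx 0.137
print(round(b, 3), round(d, 3))          # 0.455 0.137
print(round(b - d, 3), round(b + d, 3))  # interval endpoints
```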
Please explain both questions. Show work. 7. Suppose X and Y are random variables with...