Prove that E{E(Y | X)} = E(Y) when X and Y are continuous, and likewise for E{E(g(Y) | X)}...
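A sketch of the continuous-case argument for the special case g(y) = y, assuming a joint density f(x, y) with marginals f_X, f_Y and conditional density f_{Y|X}(y | x) = f(x, y)/f_X(x):

\[
E\{E(Y \mid X)\} = \int_{-\infty}^{\infty} \left( \int_{-\infty}^{\infty} y\, f_{Y\mid X}(y \mid x)\, dy \right) f_X(x)\, dx
= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} y\, f(x,y)\, dy\, dx
= \int_{-\infty}^{\infty} y\, f_Y(y)\, dy = E(Y).
\]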
X and Y are random variables. (a) Show that E(X) = E(E(X | Y)). (b) If P({X ≤ x, Y ≤ y}) = P({X ≤ x}) P({Y ≤ y}), then show that E(XY) = E(X)E(Y), i.e. if two random variables are independent, then show that they are uncorrelated. Is the reverse true? Prove or disprove. (c) The moment generating function of a random variable Z is defined as Ψ_Z(t) := E(e^(tZ)). Now if X and Y are independent random variables, then show that Ψ_(X+Y)(t) = Ψ_X(t) Ψ_Y(t). Also, if Ψ_X(t) = (λ − ... (d) Show the conditional...
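For the converse in (b), a standard counterexample takes X symmetric about 0 and Y = X², so Cov(X, Y) = E(X³) − E(X)E(X²) = 0 even though Y is a function of X. A minimal numerical sketch of that counterexample (the Uniform(−1, 1) choice for X is just one convenient assumption):

import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=1_000_000)  # X symmetric about 0
y = x ** 2                                   # Y is completely determined by X

# Sample covariance is near 0, so X and Y are (essentially) uncorrelated ...
print("Cov(X, Y) ~", np.cov(x, y)[0, 1])
# ... yet they are clearly dependent: knowing X fixes Y exactly.
print("P(Y > 0.25 | X > 0.5) =", np.mean(y[x > 0.5] > 0.25))  # equals 1
print("P(Y > 0.25)           =", np.mean(y > 0.25))            # about 0.5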
Problem 2. Suppose two continuous random variables (X, Y) ~ f(x, y). (1) Prove E(X + Y) = E(X) + E(Y). (2) Prove Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y). (3) Prove Cov(X, Y) = E(XY) − E(X)E(Y). (4) Prove that if X and Y are independent, i.e., f(x, y) = f_X(x) f_Y(y) for any (x, y), then Cov(X, Y) = 0. Is the reverse true? (5) Prove Cov(aX + b, cY + d) = ac Cov(X, Y). (6) Prove Cov(X, X) = Var(X).
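A sketch of how (2) and (3) fit together, writing μ_X = E(X) and μ_Y = E(Y):

\[
\operatorname{Var}(X+Y) = E\big[\big((X-\mu_X)+(Y-\mu_Y)\big)^2\big]
= \operatorname{Var}(X) + \operatorname{Var}(Y) + 2\,E\big[(X-\mu_X)(Y-\mu_Y)\big],
\]

and expanding the cross term gives E[(X − μ_X)(Y − μ_Y)] = E(XY) − E(X)E(Y) = Cov(X, Y), which is (3).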
Let X and Y be random losses with joint density function f(x, y) = e^(−9) · 2^x · 7^y / (x! y!) for x and y non-negative integers. What is the probability that X + Y ≤ 3? (A) 0.016 (B) 0.021 (C) 0.055 (D) 0.082 (E) 0.86
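Under that reading, the density factors into independent Poisson pmfs with means 2 and 7, so X + Y is Poisson with mean 9; a quick numerical check (the Poisson-with-means-2-and-7 reading is an assumption):

from math import exp, factorial

# Assumed reading: f(x, y) = e**-9 * 2**x * 7**y / (x! * y!), i.e. X ~ Poisson(2)
# and Y ~ Poisson(7) independent, so X + Y ~ Poisson(9).
p = sum(exp(-9) * 9**k / factorial(k) for k in range(4))  # P(X + Y <= 3)
print(round(p, 3))  # approximately 0.021, matching choice (B)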
We have two random variables X and Y. P(X = .25) = .25, P(X = .5) = .5, and P(X = .75) = .25. Suppose Y is a Bernoulli random variable and the joint distribution of X and Y satisfies the condition that E[Y|X] = X^2. Help me calculate E[XY], E[Y/X], and E[X|Y]. I imagine we start by calculating E[X], which I got as .5, then calculate E[X^2] as 9/32 since we...
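One way to continue from there is to read E[Y|X] = X^2 as P(Y = 1 | X = x) = x^2, which together with the marginal of X pins down the joint pmf; the sketch below just evaluates the resulting sums under that assumption, not a verified answer key:

# Marginal pmf of X and the assumed conditional P(Y = 1 | X = x) = x**2.
px = {0.25: 0.25, 0.5: 0.5, 0.75: 0.25}

e_xy = sum(x * x**2 * p for x, p in px.items())            # E[XY] = E[X * E[Y|X]] = E[X^3]
e_y_over_x = sum((x**2 / x) * p for x, p in px.items())    # E[Y/X] = E[E[Y|X]/X] = E[X]
p_y1 = sum(x**2 * p for x, p in px.items())                # P(Y = 1) = E[X^2] = 9/32
e_x_given_y1 = sum(x * x**2 * p for x, p in px.items()) / p_y1              # E[X | Y = 1]
e_x_given_y0 = sum(x * (1 - x**2) * p for x, p in px.items()) / (1 - p_y1)  # E[X | Y = 0]

print(e_xy, e_y_over_x, e_x_given_y1, e_x_given_y0)
# 0.171875 (= 11/64), 0.5, ~0.611 (= 11/18), ~0.457 (= 21/46)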
Random variables X and Y are independent. How do I prove that E(XY) = E(X)E(Y) using integration? There is some kind of double integration and I don't understand it.
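A sketch of the double-integral argument in the continuous case, assuming the joint density factors as f(x, y) = f_X(x) f_Y(y):

\[
E(XY) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} xy\, f(x,y)\, dx\, dy
= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} xy\, f_X(x) f_Y(y)\, dx\, dy
= \left(\int_{-\infty}^{\infty} x f_X(x)\, dx\right)\left(\int_{-\infty}^{\infty} y f_Y(y)\, dy\right)
= E(X)E(Y).
\]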
Suppose the joint probability distribution of two binary random variables X and Y is given as follows (rows indexed by X, columns by Y):

X \ Y    1      0
  1     1/2    1/4
  0      0     1/4

(a) Show the marginal distribution of X. [2pts] (b) Find the entropy H(Y). [2pts] (c) Find the conditional entropy H(X|Y). [3pts] (d) Find the mutual information I(X;Y). [3pts] (e) Find the joint entropy H(X,Y). [3pts] Note: The following three proofs are not related to the example in parts (a)-(e). You need to prove each...
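A small numerical sketch for parts (a)-(e); the table values p(1,1) = 1/2, p(1,0) = 1/4, p(0,1) = 0, p(0,0) = 1/4 are the assumed reading above, and the helper itself works for any 2x2 joint pmf:

import numpy as np

def entropies(p):
    """p[i, j] = P(X = i, Y = j); returns H(X), H(Y), H(X,Y), H(X|Y), I(X;Y) in bits."""
    plogp = lambda q: -np.sum(q[q > 0] * np.log2(q[q > 0]))
    hx, hy, hxy = plogp(p.sum(axis=1)), plogp(p.sum(axis=0)), plogp(p)
    return hx, hy, hxy, hxy - hy, hx + hy - hxy

# Assumed joint table: rows are X = 1, 0 and columns are Y = 1, 0.
p = np.array([[1/2, 1/4],
              [0.0, 1/4]])
print(entropies(p))  # H(X)~0.811, H(Y)=1, H(X,Y)=1.5, H(X|Y)=0.5, I(X;Y)~0.311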
2. (a) Prove by structural induction that for all x ∈ {0,1}*, λx = x. (b) Consider the function reverse : {0,1}* → {0,1}* which reverses a binary string, e.g., reverse(01001) = 10010. Give an inductive definition for reverse. (Assume that we defined {0,1}* and concatenation of binary strings as we did in lecture.) (c) Using your inductive definition, prove that for all x, y ∈ {0,1}*, reverse(xy) = reverse(y)reverse(x). (You may assume that concatenation is associative, i.e., for all...
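For (b), the usual inductive definition is reverse(λ) = λ and reverse(wa) = a·reverse(w) for w ∈ {0,1}* and a ∈ {0,1}; a direct transcription into code, purely as an illustration of that definition:

def reverse(w: str) -> str:
    # Base case: the empty string reverses to itself.
    if w == "":
        return w
    # Inductive case: w = w'a with a single final symbol a; reverse(w'a) = a + reverse(w').
    return w[-1] + reverse(w[:-1])

print(reverse("01001"))  # 10010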
2. Let f : R² → R be defined by f(x, y) := x³y/(x² + y²) if (x, y) ≠ (0, 0) and f(x, y) := 0 if (x, y) = (0, 0). Show that ∂f/∂x(0, y) = 0 for all y ∈ R and ∂f/∂y(x, 0) = x for all x ∈ R. Prove that ∂²f/∂y∂x(0, 0) ≠ ∂²f/∂x∂y(0, 0).
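A sketch of the first-partial computations under that reading of the definition, using the limit definition of the partial derivative:

\[
\frac{\partial f}{\partial x}(0, y) = \lim_{h \to 0} \frac{f(h, y) - f(0, y)}{h}
= \lim_{h \to 0} \frac{h^2 y}{h^2 + y^2} = 0,
\qquad
\frac{\partial f}{\partial y}(x, 0) = \lim_{h \to 0} \frac{f(x, h) - f(x, 0)}{h}
= \lim_{h \to 0} \frac{x^3}{x^2 + h^2} = x,
\]

so the two iterated second partials at the origin come out as 0 and 1 respectively, hence they differ.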
Let X and Y be two independent random variables. Show that Cov (X, XY) = E(Y) Var(X).
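A sketch using Cov(U, V) = E(UV) − E(U)E(V) and the independence of X and Y (so E(XY) = E(X)E(Y) and E(X²Y) = E(X²)E(Y)):

\[
\operatorname{Cov}(X, XY) = E(X \cdot XY) - E(X)\,E(XY)
= E(X^2)E(Y) - E(X)^2 E(Y)
= E(Y)\big[E(X^2) - E(X)^2\big] = E(Y)\operatorname{Var}(X).
\]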
(a) For the random variable X, show that E[(X − a)²] is minimized when a = E(X). (b) For random variables X and Y, show that SD(X + Y) ≤ SD(X) + SD(Y), that is, the standard deviation of the sum is less than or equal to the sum of the standard deviations. (c) For random variables X and Y, prove the Cauchy-Schwarz inequality: [E(XY)]² ≤ E(X²) E(Y²).
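A sketch of (a) and of how (c) gives (b), with μ = E(X) and σ_X, σ_Y the standard deviations:

\[
E[(X-a)^2] = E[(X-\mu)^2] + (\mu - a)^2,
\]

which is smallest at a = μ; and applying (c) to the centered variables gives Cov(X, Y) ≤ σ_X σ_Y, hence

\[
\operatorname{Var}(X+Y) = \sigma_X^2 + \sigma_Y^2 + 2\operatorname{Cov}(X, Y) \le (\sigma_X + \sigma_Y)^2.
\]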