# R program
rm(list=ls(all=TRUE));
joint <- function(i, j){
  p <- 0; # replace 0 with the expression for the probability P(X = i, Y = j)
  return(p);
}
#### Calculating the expectations of XY, X, X^2, Y and Y^2,
#### i.e. E(XY), E(X), E(X^2), E(Y) and E(Y^2)
Exy <- 0;
Ex <- 0;
Ey <- 0;
Ex_2 <- 0;
Ey_2 <- 0;
for(i in 0:1){
  sum1 <- sum2 <- sum3 <- sum4 <- sum5 <- 0;
  for(j in 1:3){
    sum1 <- sum1 + i*j*joint(i, j);   # for E(XY)
    sum2 <- sum2 + i*joint(i, j);     # for E(X)
    sum3 <- sum3 + (i^2)*joint(i, j); # for E(X^2)
    sum4 <- sum4 + j*joint(i, j);     # for E(Y)
    sum5 <- sum5 + (j^2)*joint(i, j); # for E(Y^2)
  }
  Exy <- Exy + sum1;
  Ex <- Ex + sum2;
  Ex_2 <- Ex_2 + sum3;
  Ey <- Ey + sum4;
  Ey_2 <- Ey_2 + sum5;
}
variance_x <- Ex_2 - (Ex^2);
variance_y <- Ey_2 - (Ey^2);
covariance <- Exy-Ex*Ey;
correlation <- covariance/sqrt(variance_x*variance_y);
print(covariance);
print(correlation);
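As a sanity check, the skeleton above can be run end to end with a concrete pmf. The pmf used here is hypothetical, not taken from the original problem: it assumes P(X = i, Y = j) = (i + j)/15 for i = 0, 1 and j = 1, 2, 3, which sums to 1 over that grid.

```r
# Hypothetical pmf for illustration only: P(X = i, Y = j) = (i + j)/15,
# for i = 0,1 and j = 1,2,3; the six probabilities sum to 15/15 = 1.
joint <- function(i, j){
  p <- (i + j)/15;
  return(p);
}

Exy <- Ex <- Ey <- Ex_2 <- Ey_2 <- 0;
for(i in 0:1){
  for(j in 1:3){
    p <- joint(i, j);
    Exy  <- Exy  + i*j*p;   # accumulates E(XY)
    Ex   <- Ex   + i*p;     # accumulates E(X)
    Ex_2 <- Ex_2 + (i^2)*p; # accumulates E(X^2)
    Ey   <- Ey   + j*p;     # accumulates E(Y)
    Ey_2 <- Ey_2 + (j^2)*p; # accumulates E(Y^2)
  }
}
variance_x <- Ex_2 - (Ex^2);
variance_y <- Ey_2 - (Ey^2);
covariance <- Exy - Ex*Ey;
correlation <- covariance/sqrt(variance_x*variance_y);
print(covariance); # -0.02666667 (exactly -2/75 for this pmf)
print(correlation);
```

With this pmf, E(X) = 9/15, E(Y) = 34/15 and E(XY) = 20/15, so the covariance is 20/15 - (9/15)(34/15) = -2/75; plugging in the actual `joint` from the problem replaces these values.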
Write an R function joint(i, j) for computing P(X = i, Y = j), for i = 0, 1; j = ...