9. Suppose X and Y are the sum and difference of two coin tosses where =...
A5 Consider an experiment where you toss a coin as often as necessary until one head turns up. Suppose the probability of a tail is p (so the probability of a head is 1 - p). Assume the tosses are independent. a) State the sample space. b) Let X be the number of tosses needed to get one head. What is the support (the possible values) of X? c) Find P(X = 1), P(X = 2) and P(X = 3). d)...
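Under this setup (tail probability p, independent tosses), the first head on toss k requires k - 1 tails followed by one head, so P(X = k) = p^(k-1)(1 - p). A quick numeric check of part (c), using an assumed example value p = 0.4:

```python
# With P(tail) = p, the first head on toss k needs k - 1 tails then a head:
#   P(X = k) = p**(k - 1) * (1 - p)
p = 0.4  # assumed example value for the tail probability

def pmf(k, p):
    """P(X = k) for the number of tosses until the first head."""
    return p ** (k - 1) * (1 - p)

print(pmf(1, p))  # P(X = 1) = 1 - p
print(pmf(2, p))  # P(X = 2) = p(1 - p)
print(pmf(3, p))  # P(X = 3) = p^2(1 - p)

# sanity check: the pmf sums to 1 over the support
total = sum(pmf(k, p) for k in range(1, 200))
print(round(total, 9))  # → 1.0
```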
Suppose we toss a coin (with P(H) = p and P(T) = 1 - p = q) infinitely many times. Let Y be the waiting time for the first head, so {Y = n} = {the first head occurs on the n-th toss}, and let Xn be the number of heads after n tosses, so {Xn = k} = {there are k heads after n tosses of the coin}. (a) Compute P(Y > n). (b) Prove it using the formula P(A ∩ B) = P(A)P(B). (c) What is the physical meaning of the formula you just proved? Suppose...
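For part (a), {Y > n} is the event that the first n tosses all land tails, so P(Y > n) = (1 - p)^n; this is the same event as {Xn = 0}, and the binomial pmf at 0 gives the same number. A small numeric sanity check (p and n are assumed example values):

```python
# {Y > n} = {first n tosses are all tails} = {X_n = 0}; both give (1 - p)**n.
from math import comb

p, n = 0.3, 7  # assumed example values for P(H) and the number of tosses
binom_zero = comb(n, 0) * p**0 * (1 - p) ** n  # P(X_n = 0), binomial pmf at 0
all_tails = (1 - p) ** n                       # P(Y > n), directly
print(abs(binom_zero - all_tails) < 1e-15)  # → True
```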
A fair coin is tossed twice. Let X and Y be random variables such that: X = 1 if the first toss is heads, and X = 0 otherwise; Y = 1 if both tosses are heads, and Y = 0 otherwise. Determine whether or not X and Y are independent. So far, I have determined the joint probability distribution as follows:

        x = 0   x = 1
y = 0   2/4     1/4
y = 1   0       1/4
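Given that joint table, independence can be checked cell by cell against the product of the marginals; a minimal sketch in exact arithmetic:

```python
# Check independence: P(X=x, Y=y) vs P(X=x) * P(Y=y) for every cell.
from fractions import Fraction as F

joint = {(0, 0): F(2, 4), (1, 0): F(1, 4),
         (0, 1): F(0),    (1, 1): F(1, 4)}

# marginals of X and Y from the joint table
px = {x: sum(p for (xx, y), p in joint.items() if xx == x) for x in (0, 1)}
py = {y: sum(p for (x, yy), p in joint.items() if yy == y) for y in (0, 1)}

independent = all(joint[(x, y)] == px[x] * py[y]
                  for x in (0, 1) for y in (0, 1))
print(independent)  # → False, e.g. P(X=0, Y=1) = 0 but P(X=0)P(Y=1) = 1/8
```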
2) Consider the sample space of three coin tosses: Ω = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}. Assuming all elements to be equally likely, we assign P({ωi}) = 1/8, i = 1, 2, 3, 4, 5, 6, 7, 8. Define random variables to capture the second and third outcomes of the toss: X2 = { 0, if second outcome is T; 1, if second outcome is H and X3 = { 0, if third outcome is T;...
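The indicators X2 and X3 can be tabulated directly over Ω; a minimal sketch:

```python
# Tabulate X2 (second toss) and X3 (third toss) over the sample space.
omega = ["HHH", "HHT", "HTH", "HTT", "THH", "THT", "TTH", "TTT"]
X2 = {w: int(w[1] == "H") for w in omega}  # 1 if second outcome is H
X3 = {w: int(w[2] == "H") for w in omega}  # 1 if third outcome is H

# each outcome has probability 1/8
p_x2_is_1 = sum(1 for w in omega if X2[w] == 1) / 8
p_x3_is_1 = sum(1 for w in omega if X3[w] == 1) / 8
p_both = sum(1 for w in omega if X2[w] == 1 and X3[w] == 1) / 8
print(p_x2_is_1, p_x3_is_1, p_both)  # → 0.5 0.5 0.25
```

Note that p_both equals the product of the two marginal probabilities, which is what drives the usual independence question for these indicators.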
(1) Suppose the following is the joint PMF of random variables X and Y: P(X = x, Y = y) = c(3x + y), x = 1, 2, y = 1, 2, where c is an unknown constant. a. What is the value of c that makes this a valid joint PMF? b. Find Cov(X, Y).
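Here c comes from forcing the four probabilities to sum to 1, and Cov(X, Y) = E[XY] - E[X]E[Y] then follows from the resulting table; a sketch in exact arithmetic:

```python
from fractions import Fraction as F

support = [(x, y) for x in (1, 2) for y in (1, 2)]
total = sum(3 * x + y for x, y in support)  # 4 + 5 + 7 + 8 = 24
c = F(1, total)
print(c)  # → 1/24

pmf = {(x, y): c * (3 * x + y) for x, y in support}
ex = sum(x * p for (x, y), p in pmf.items())      # E[X]
ey = sum(y * p for (x, y), p in pmf.items())      # E[Y]
exy = sum(x * y * p for (x, y), p in pmf.items()) # E[XY]
print(exy - ex * ey)  # Cov(X, Y) → -1/192
```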
2. The joint density of X and Y is given by f(x, y) = 8xy for 0 ≤ y ≤ x ≤ 1, and f(x, y) = 0 otherwise. (a) Find f(x|y), the conditional density of X given Y = y. (b) Set up the integrals (do not evaluate) for evaluating Cov(X, Y). 3. In this question, you will identify the distribution of the sum of independent random variables. I expect you will find that the mgf approach is your friend. (a) Let X and Y be independent Poisson random variables with means λ1 and λ2, respectively, and...
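For question 3(a), the mgf route multiplies exp(λ1(e^t - 1)) by exp(λ2(e^t - 1)) to get the Poisson(λ1 + λ2) mgf, so X + Y ~ Poisson(λ1 + λ2). A numeric cross-check by direct convolution of the two pmfs (the rates are assumed example values):

```python
# Convolving two Poisson pmfs should reproduce the Poisson(lam1 + lam2) pmf,
# matching what the mgf argument predicts.
from math import exp, factorial

def pois(k, lam):
    """Poisson pmf: P(N = k) for N ~ Poisson(lam)."""
    return exp(-lam) * lam ** k / factorial(k)

lam1, lam2 = 1.5, 2.0  # assumed example rates

for k in range(10):
    conv = sum(pois(j, lam1) * pois(k - j, lam2) for j in range(k + 1))
    assert abs(conv - pois(k, lam1 + lam2)) < 1e-12
print("convolution matches Poisson(lam1 + lam2)")
```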
Please help with this question. Question 4. Coin tossing, again. In class on Monday, January 29th, we discussed an example showing that the conditional independence of events does not imply their unconditional independence. As a reminder, the setup of the example was as follows. We had two coins, coin A and coin B. We chose a coin at random (i.e., with probability 0.5) and tossed the chosen coin repeatedly. Given the choice of a coin, the coin tosses were...
(5 points) Suppose the joint probability mass function (pmf) of integer-valued random variables X and Y is P(X = i, Y = j) = (i + 2j)c, for 0 ≤ i ≤ 2, 0 ≤ j ≤ 2, and i + j ≤ 3, where c is a constant. In other words, the joint pmf of X and Y can be represented by the table:

        Y=0   Y=1   Y=2
X=0     0     2c    4c
X=1     c     3c    5c
X=2     2c    4c    0

(a) Find the constant c. (b) Compute...
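Assuming the support as stated (0 ≤ i, j ≤ 2 with i + j ≤ 3, which excludes only the cell (2, 2)), the constant follows from summing i + 2j over the support; a quick sketch:

```python
from fractions import Fraction as F

# all (i, j) pairs in the support of the joint pmf
support = [(i, j) for i in range(3) for j in range(3) if i + j <= 3]
total = sum(i + 2 * j for i, j in support)  # weights must sum to 1/c
c = F(1, total)
print(c)  # → 1/21
```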
4. Suppose that X is a random variable such that P(X < 0) = 0. You toss a fair coin: if it comes up heads, you define Y to be √X; if it comes up tails, you define Y to be -√X. a. Find the cumulative distribution function of Y in terms of the cumulative distribution function of X. (You will probably want to consider two cases, one for y < 0 and the other for y > 0.) b. Now...
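A hedged sketch of how part (a) starts: condition on the coin. Since X ≥ 0, for y ≥ 0 the event {Y ≤ y} contains all of the tails branch (where -√X ≤ 0 ≤ y automatically) plus the heads outcomes with √X ≤ y, while for y < 0 only the tails branch can contribute:

```latex
F_Y(y) = \tfrac12\,P(-\sqrt{X} \le y) + \tfrac12\,P(\sqrt{X} \le y)
       = \begin{cases}
           \tfrac12\,P(X \ge y^2), & y < 0,\\[4pt]
           \tfrac12 + \tfrac12\,F_X(y^2), & y \ge 0.
         \end{cases}
```

In the y < 0 case, P(X ≥ y²) = 1 - P(X < y²), which simplifies to 1 - F_X(y²) when X is continuous at y².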
Please solve parts b) and d). 1. Suppose X, Y are two discrete RVs with joint p.m.f. according to the table below. (a) Calculate the marginal p.m.f. of X and of Y. (b) Calculate P(0 < sin(X) e^Y < 4). (c) Are X and Y independent? (d) Compute Cov(X, Y). Table 1: The joint p.m.f. of X, Y: 1/2 1/12 1/8 1/8 1/12 0 1/12 1/9 1/9 6 1/12 1/12 0 1/9