3. Let X and θ be two random variables. Let X given θ have a Bernoulli...
5. Let X_1, ..., X_n (n ≥ 3) be i.i.d. Bernoulli random variables with parameter θ, where 0 < θ < 1. Let T = Σ_i X_i, and let δ(X_1, ..., X_n) = ... and 0 otherwise. (a) Derive E_θ[δ(X_1, ..., X_n)]. (b) Derive E_θ[δ(X_1, ..., X_n) | T = t] for t = 0, 1, ..., n.
Theorem 1: Suppose that X_1, ..., X_n form a random sample from a Bernoulli distribution for which the parameter θ is unknown (0 < θ < 1). Suppose also that the prior distribution of θ is the beta distribution with parameters α > 0 and β > 0. Then the posterior distribution of θ given that X_i = x_i (i = 1, ..., n) is the beta distribution with parameters α + Σ_i x_i and β + n - Σ_i x_i. Proof:
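The proof is the standard conjugacy computation; writing y = x_1 + ... + x_n and ξ for the prior density (a notational choice here, not fixed by the source), the posterior is proportional to prior times likelihood:

```latex
\xi(\theta \mid x_1,\dots,x_n)
  \propto \xi(\theta)\, f(x_1,\dots,x_n \mid \theta)
  \propto \theta^{\alpha-1}(1-\theta)^{\beta-1}\,\theta^{y}(1-\theta)^{n-y}
  = \theta^{\alpha+y-1}(1-\theta)^{\beta+n-y-1},
```

which is the kernel of the beta density with parameters α + y and β + n - y; normalizing gives the stated posterior.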
4. (3 points) Let X_1, ..., X_n be i.i.d. Bernoulli random variables with parameter p. Is it reasonable to use the exponential distribution to describe the prior distribution of p? Answer 'yes' or 'no' and explain.
Question 3: A random variable X has a Bernoulli distribution with parameter θ ∈ (0, 1) if X ∈ {0, 1} and P(X = 1) = θ. Suppose that we have n i.i.d. random variables Y_1, ..., Y_n following a Bernoulli(θ) distribution, with observed values y_1, ..., y_n. a) Show that E[X] = θ and Var[X] = θ(1 - θ). b) Let θ̂ = ȳ = (y_1 + ... + y_n)/n. Show that θ̂ is unbiased for θ and compute its variance. c) Let θ̃ = (y_1 + ... + y_n + 1)/(n + 2) (this...
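Part b) can be sanity-checked by simulation; a minimal sketch (the function name and parameter values are illustrative, not from the source) verifying that the sample mean of Bernoulli(θ) draws has mean θ and variance θ(1 - θ)/n:

```python
import random

# Monte Carlo check: theta_hat = sample mean of n Bernoulli(theta) draws
# should have E[theta_hat] ~ theta and Var[theta_hat] ~ theta*(1-theta)/n.
def simulate_theta_hat(theta, n, reps, seed=0):
    rng = random.Random(seed)
    estimates = [sum(rng.random() < theta for _ in range(n)) / n
                 for _ in range(reps)]
    mean = sum(estimates) / reps
    var = sum((e - mean) ** 2 for e in estimates) / reps
    return mean, var

mean, var = simulate_theta_hat(theta=0.3, n=50, reps=20000)
print(mean, var)  # mean close to 0.3, var close to 0.3*0.7/50 = 0.0042
```

With 20,000 replications the estimates land well within sampling error of the theoretical values.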
(2) Given two independent random variables X1 and X2 having a Bernoulli distribution with parameter p = 1/3, let Y1 = 2X1 and Y2 = 2X2. Then:
A. E[Y1 · Y2] = 2/9
B. E[Y1 · Y2] = 4/9
C. P[Y1 · Y2 = 0] = 1/9
D. P[Y1 · Y2 = 0] = 2/9
(3) Let X and Y be two independent random variables having a Gaussian (normal) distribution with mean 0 and variance equal to 2. Then:
A. P[X + Y > 2] > 0.5
B...
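Problem (2) reduces to exact arithmetic via independence, which can be checked with Python's `fractions` module:

```python
from fractions import Fraction

# Problem (2): X1, X2 ~ Bernoulli(1/3) independent, Y1 = 2*X1, Y2 = 2*X2.
p = Fraction(1, 3)

# E[Y1*Y2] = 4*E[X1]*E[X2] by independence.
e_y1y2 = 4 * p * p

# Y1*Y2 is nonzero only when X1 = X2 = 1, so P[Y1*Y2 = 0] = 1 - p**2.
p_zero = 1 - p * p

print(e_y1y2, p_zero)  # 4/9 and 8/9
```

Under this reading option B matches; note that P[Y1 · Y2 = 0] works out to 8/9, matching neither C nor D, which may reflect extraction damage in the listed options.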
Let the continuous random variables X and Y have a joint PDF which is uniform over the triangle with vertices (0, 0), (0, 2), and (3, 0).
a. Find the joint PDF of X and Y.
b. Find the marginal PDF of Y.
c. Find the conditional PDF of X given Y.
d. Find E[Y | X = x].
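A sympy sketch of parts b and d, assuming the triangle has vertices (0, 0), (0, 2), and (3, 0), which is a reconstruction of the garbled statement:

```python
from sympy import symbols, integrate, simplify, Rational

# The triangle has area 3, so the uniform joint PDF is f(x, y) = 1/3
# on the region x/3 + y/2 <= 1, x >= 0, y >= 0.
x, y = symbols("x y", nonnegative=True)
f = Rational(1, 3)

# (b) marginal of Y: integrate x out from 0 to 3*(1 - y/2), for 0 <= y <= 2.
f_Y = integrate(f, (x, 0, 3 * (1 - y / 2)))   # = 1 - y/2

# (d) given X = x, Y is uniform on (0, 2*(1 - x/3)), so E[Y | X = x]
# is the midpoint of that interval.
e_Y_given_x = (2 * (1 - x / 3)) / 2           # = 1 - x/3

print(f_Y, e_Y_given_x)
```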
1. Suppose X, Y are random variables whose joint pdf is given by f(x, y) = 1/x if 0 < x < 1, 0 < y < x, and f(x, y) = 0 otherwise. Find the covariance of the random variables X and Y. 2. Let X1 be a Bernoulli random variable with parameter p1 and X2 a Bernoulli random variable with parameter p2. Assume X1 and X2 are independent. What is the variance of the random variable Y...
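Part 1 above can be worked symbolically; a sympy sketch of the covariance computation:

```python
from sympy import symbols, integrate, Rational

# f(x, y) = 1/x on 0 < y < x < 1; integrate y over (0, x), then x over (0, 1).
x, y = symbols("x y", positive=True)
f = 1 / x

E_X  = integrate(integrate(x * f, (y, 0, x)), (x, 0, 1))      # 1/2
E_Y  = integrate(integrate(y * f, (y, 0, x)), (x, 0, 1))      # 1/4
E_XY = integrate(integrate(x * y * f, (y, 0, x)), (x, 0, 1))  # 1/6

cov = E_XY - E_X * E_Y
print(cov)  # 1/24
```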
3. Let the random variables X and Y have the joint probability density function f_XY(x, y) = ... for 0 ≤ x < y ≤ 1, and 0 otherwise. (a) Compute the joint expectation E(XY). (b) Compute the marginal expectations E(X) and E(Y). (c) Compute the covariance Cov(X, Y).
2. Let the random variables X and Y have the joint PDF given below:
f_XY(x, y) = 2e^{...} for 0 < y < ∞, and 0 otherwise
(a) Find P(X + Y ≤ 2).
(b) Find the marginal PDFs of X and Y.
(c) Find the conditional PDF of Y |X = x.
(d) Find P(Y < 3|X = 1).
If two random variables have the joint density f(x, y) = (6/5)(x + y^2) for 0 < x < 1, 0 < y < 1, and 0 elsewhere, find the probability that 0.2 < X < 0.5 and 0.4 < Y < 1.6. With reference to the previous Problem 6, find both marginal densities and use them to find the probabilities that a. X > 0.8; b. Y < 1.5.
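A sympy sketch of the computation, assuming the density is f(x, y) = (6/5)(x + y^2) on the unit square; the leading constant is garbled in the source, and 6/5 is the unique constant that makes x + y^2 integrate to 1:

```python
from sympy import symbols, integrate, Rational

x, y = symbols("x y", positive=True)
f = Rational(6, 5) * (x + y**2)

# Sanity check: total probability over the unit square is 1.
total = integrate(integrate(f, (y, 0, 1)), (x, 0, 1))

# P(0.2 < X < 0.5, 0.4 < Y < 1.6); the density is 0 above y = 1, so clip at 1.
p = integrate(integrate(f, (y, Rational(2, 5), 1)),
              (x, Rational(1, 5), Rational(1, 2)))

# Marginal of X: g(x) = (6x + 2)/5 on (0, 1); then a. P(X > 0.8).
g = integrate(f, (y, 0, 1))
p_x_gt_08 = integrate(g, (x, Rational(4, 5), 1))

print(total, p, p_x_gt_08)  # 1, 2349/12500, 37/125
```

For part b, P(Y < 1.5) = 1 under this density, since Y never exceeds 1.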