Short answer
S = X1 + X2 + … + Xn, where the Xi are i.i.d. U(-1, 1).
By linearity and independence, E(S) = n·E(X1) = 0 and Var(S) = n·Var(X1) = n·(1 - (-1))^2/12 = n/3.
Put n = 1, 2, 3 for parts (a), (b), and (c).
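More generally, for Xi ~ Unif(a, b) the same argument gives E(S) = n(a+b)/2 and Var(S) = n(b-a)^2/12. A quick Monte Carlo sanity check of these formulas (the seed, interval choices, and trial count here are my own, not part of the question):

```python
import numpy as np

rng = np.random.default_rng(0)

def sum_moments(n, a, b, trials=200_000):
    """Sample mean and variance of S = X1 + ... + Xn with Xi i.i.d. Uniform(a, b)."""
    s = rng.uniform(a, b, size=(trials, n)).sum(axis=1)
    return s.mean(), s.var()

for a, b in ((0.0, 1.0), (-1.0, 1.0)):
    for n in (1, 2, 3):
        m, v = sum_moments(n, a, b)
        # Exact values: E(S) = n(a+b)/2, Var(S) = n(b-a)^2 / 12.
        print(f"Unif({a:+.0f},{b:+.0f}), n={n}: "
              f"mean {m:+.3f} (exact {n*(a+b)/2:+.3f}), "
              f"var {v:.3f} (exact {n*(b-a)**2/12:.3f})")
```

The sample moments should agree with the exact ones to two or three decimal places at this trial count.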
Long answer
3. Let Xi ~ U(-1, 1) i.i.d. for i ∈ {1, 2, 3}.
(a) Write down the expected value and variance of X1. Sketch the pdf f_X1. [3]
(b) Let Y = X1 + X2. Compute the pdf f_Y of Y and sketch it. Using f_Y, or otherwise, compute the expected value and variance of Y. [7]
(c) Let Z = X1 + X2 + X3 = Y + X3. Using exactly the same technique that you used in part (b), …
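For parts (b) and (c), the pdfs come from convolution: f_Y(y) = (2 - |y|)/4 on (-2, 2) (triangular), and convolving once more with f_X gives f_Z(z) = (3 - z^2)/8 for |z| ≤ 1 and f_Z(z) = (3 - |z|)^2/16 for 1 ≤ |z| ≤ 3. A minimal numeric cross-check of those convolutions, assuming the U(-1, 1) setup above (grid sizes are my own choice):

```python
import numpy as np

# Numerical convolution check for Y = X1 + X2 and Z = X1 + X2 + X3,
# with Xi i.i.d. Uniform(-1, 1).
GRID = np.linspace(-1.0, 1.0, 2001)   # integration grid over the support of f_X
DX = GRID[1] - GRID[0]

def f_X(x):
    """Density of Uniform(-1, 1)."""
    return np.where(np.abs(x) < 1, 0.5, 0.0)

def f_Y(y):
    """f_Y(y) = integral of f_X(t) * f_X(y - t) dt, as a Riemann sum."""
    return float(np.sum(f_X(GRID) * f_X(y - GRID)) * DX)

def f_Z(z):
    """f_Z(z) = integral of f_Y(t) * f_X(z - t) dt over t in (-2, 2)."""
    t = np.linspace(-2.0, 2.0, 2001)
    fy = np.array([f_Y(v) for v in t])
    return float(np.sum(fy * f_X(z - t)) * (t[1] - t[0]))

def f_Y_exact(y):
    return max(2 - abs(y), 0.0) / 4

def f_Z_exact(z):
    z = abs(z)
    if z <= 1:
        return (3 - z**2) / 8
    if z <= 3:
        return (3 - z)**2 / 16
    return 0.0

for y in (0.0, 1.0, 1.9):
    print(f"f_Y({y}) ≈ {f_Y(y):.4f}   exact {f_Y_exact(y):.4f}")
for z in (0.0, 1.5, 2.5):
    print(f"f_Z({z}) ≈ {f_Z(z):.4f}   exact {f_Z_exact(z):.4f}")
```

Integrating y^2 f_Y(y) (or using Var(Y) = 2·Var(X1)) then recovers Var(Y) = 2/3, matching the short answer with n = 2.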