Problem 5. Prove the following result for any number a and discrete random variable X: E(X − a…
Problem 3. Let X be a discrete random variable, let g(X) = a + bX + cX², and let a, b, c be constants. Prove, using the definition of expectation of a function of a random variable, namely E[g(X)] = Σₓ g(x) f_X(x), that E(a + bX + cX²) = a + bE(X) + cE(X²).
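For reference, a proof sketch along the lines the problem asks for, starting from the stated definition of E[g(X)] for a discrete X with PMF f_X:

```latex
\begin{align*}
E\left[a + bX + cX^2\right]
  &= \sum_{x} \left(a + bx + cx^2\right) f_X(x) \\
  &= a \sum_{x} f_X(x) + b \sum_{x} x\, f_X(x) + c \sum_{x} x^2 f_X(x) \\
  &= a \cdot 1 + b\,E(X) + c\,E(X^2),
\end{align*}
```

where the first sum equals 1 because f_X is a probability mass function.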
Need help with this Problem 4. A discrete random variable X follows the geometric distribution with parameter p, written X ~ Geom(p), if its probability mass function is f_X(x) = p(1 − p)^(x−1), x ∈ {1, 2, 3, …}. The geometric distribution is used to model the number of flips needed before a coin with probability p of showing Heads actually shows Heads. a) Show that f_X(x) is indeed a probability mass function, i.e., that the sum over all possible values of x is one…
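A numeric sanity check for part (a) — not a proof (the proof is the geometric series Σ p(1 − p)^(x−1) = p / (1 − (1 − p)) = 1), but it shows what the identity claims. The choice p = 0.3 and the truncation point are arbitrary:

```python
# Geometric PMF: f_X(x) = p * (1 - p)^(x - 1) for x = 1, 2, 3, ...
def geom_pmf(x, p):
    return p * (1 - p) ** (x - 1)

p = 0.3
# Truncate the infinite sum at x = 999; the tail (1 - p)^999 is negligible.
total = sum(geom_pmf(x, p) for x in range(1, 1000))
print(total)  # very close to 1
```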
Problem 2. Prove the following bound, known as the Chernoff bound: let X be a random variable with moment generating function M_X(s) defined for s > 0. Then for any a and any s > 0, P(X ≥ a) ≤ e^(−sa) M_X(s). Hint: to prove the bound, apply Markov's inequality with X replaced by e^(sX). Apply the Chernoff bound in the case where X is a standard normal random variable and a > 0. Find the value of s > 0 that gives the sharpest bound, i.e.,…
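A sketch of the standard-normal part, assuming M_X(s) = e^(s²/2) for X ~ N(0, 1): the bound becomes P(X ≥ a) ≤ e^(−sa + s²/2), and minimizing the exponent over s gives s = a and the sharpest bound e^(−a²/2). A grid search confirms this for a = 2 (an arbitrary choice):

```python
import math

a = 2.0

def bound(s):
    # Chernoff bound for standard normal: exp(-s*a) * M_X(s) with M_X(s) = exp(s^2/2)
    return math.exp(-s * a + s * s / 2)

# Minimize over a grid of s values in (0, 10]; calculus gives s = a exactly.
s_grid = [0.01 * k for k in range(1, 1001)]
best_s = min(s_grid, key=bound)
print(best_s)            # ~ a = 2.0
print(bound(best_s))     # ~ exp(-a^2 / 2) = exp(-2)
```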
Using Excel, answer the problem below. Let X be a discrete random variable with the following probability distribution:

x:    2     4     6     8
P(x): 0.2   0.35  0.3   0.15

Complete the following table and compute the mean and variance of X:

x     | P(x)  | x·P(x) | x²·P(x)
2     | 0.2   |        |
4     | 0.35  |        |
6     | 0.3   |        |
8     | 0.15  |        |
Total | 1     |        |

Expected value E(X) = μ = ___
Variance Var(X) = σ² = ___
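To check the Excel answers, the same table can be filled in a few lines of Python (a sketch, using the PMF exactly as given in the post):

```python
xs = [2, 4, 6, 8]
ps = [0.2, 0.35, 0.3, 0.15]

mean = sum(x * p for x, p in zip(xs, ps))       # E(X)   = sum of x * P(x)
ex2  = sum(x * x * p for x, p in zip(xs, ps))   # E(X^2) = sum of x^2 * P(x)
var  = ex2 - mean ** 2                          # Var(X) = E(X^2) - [E(X)]^2

print(mean)  # ~ 4.8
print(var)   # ~ 26.8 - 4.8^2 = 3.76
```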
Let X be a discrete random variable with the following PMF: P_X(k) = 1/21 for k ∈ {−10, −9, …, −1, 0, 1, …, 9, 10}, and P_X(k) = 0 otherwise. The random variable Y = g(X) is defined as Y = g(X) = x if X < 0; … if 0 ≤ X < 5; … otherwise. Calculate E[X], E[Y], Var(X), and Var(Y).
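The definition of g did not survive in the post, but E[X] and Var(X) can be computed directly. This sketch assumes the PMF constant is 1/21, the only value that makes the 21 equally weighted integers a valid PMF:

```python
from fractions import Fraction

ks = range(-10, 11)          # support {-10, ..., 10}, 21 values
p = Fraction(1, 21)          # assumed uniform mass on each value

ex  = sum(k * p for k in ks)       # 0 by symmetry of the support
ex2 = sum(k * k * p for k in ks)   # 2 * (1^2 + ... + 10^2) / 21 = 770/21
var = ex2 - ex ** 2

print(ex)    # 0
print(var)   # 770/21 = 110/3, about 36.67
```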
Problem 4. Let X be a discrete random variable with probability mass function f_X(x), and let t be a function. Define Y = t(X); that is, Y is the random variable obtained by applying the function t to the value of X. Transforming a random variable in this way is frequently done in statistics. In what follows, let R(X) denote the possible values of X and let R(Y) denote the possible values of Y. To compute E[Y], we could first find…
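The two routes the problem is setting up can be shown side by side on a toy example (the PMF and the function t below are hypothetical, chosen only to illustrate that both computations agree):

```python
fX = {-1: 0.25, 0: 0.5, 2: 0.25}   # hypothetical f_X(x) on R(X)

def t(x):
    return x * x                    # hypothetical transform, Y = t(X)

# Route 1 (law of the unconscious statistician):
# sum t(x) * f_X(x) over R(X), never forming the PMF of Y.
ey_lotus = sum(t(x) * p for x, p in fX.items())

# Route 2: first find f_Y on R(Y) by collecting mass, then sum y * f_Y(y).
fY = {}
for x, p in fX.items():
    fY[t(x)] = fY.get(t(x), 0) + p
ey_direct = sum(y * p for y, p in fY.items())

print(ey_lotus, ey_direct)  # equal: 0.25*1 + 0.5*0 + 0.25*4 = 1.25
```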
Verify the linearity of expectation: if X is a discrete random variable (with a finite range), and its expectation is defined as E[X] = Σₓ x·f(x), where f is the probability mass function of X, prove that E[X + Y] = E[X] + E[Y] and E[cX] = cE[X] for any real number c.
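A sketch of the E[X + Y] part, assuming X and Y have a joint PMF f_{X,Y} (which the statement implicitly needs, since E[X + Y] is an expectation over the pair):

```latex
\begin{align*}
E[X+Y] &= \sum_{x}\sum_{y} (x + y)\, f_{X,Y}(x,y) \\
       &= \sum_{x} x \sum_{y} f_{X,Y}(x,y) + \sum_{y} y \sum_{x} f_{X,Y}(x,y) \\
       &= \sum_{x} x\, f_X(x) + \sum_{y} y\, f_Y(y) \;=\; E[X] + E[Y],
\end{align*}
```

using that summing the joint PMF over one variable gives the marginal of the other. The scaling part is one line: E[cX] = Σₓ (cx) f(x) = c Σₓ x f(x) = cE[X].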
5. Consider a discrete random variable X with probability mass function p(x) taking the values 0.2, 0.4, 0.3, 0.1, and consider Y = g(X) = … a) Find the probability distribution of Y. b) Find the expected value of Y, E(Y). Does μ_Y equal g(μ_X)?
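The x-values and the definition of g did not survive in the post, so this sketch uses a hypothetical support {−1, 0, 1, 2} with the probabilities given, and g(x) = x², to show both steps and why μ_Y need not equal g(μ_X) for a nonlinear g:

```python
# Hypothetical support and transform; only the probabilities come from the post.
pX = {-1: 0.2, 0: 0.4, 1: 0.3, 2: 0.1}

def g(x):
    return x * x

# a) distribution of Y = g(X): collect the probability mass by y-value
pY = {}
for x, p in pX.items():
    pY[g(x)] = pY.get(g(x), 0) + p
print(pY)                 # {1: 0.5, 0: 0.4, 4: 0.1}

# b) E(Y) versus g(E(X)) -- they differ for this nonlinear g
ey = sum(y * p for y, p in pY.items())   # ~ 0.5*1 + 0.4*0 + 0.1*4 = 0.9
ex = sum(x * p for x, p in pX.items())   # ~ 0.3
print(ey, g(ex))                         # ~ 0.9 versus ~ 0.09
```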
2. Let X and Y be two random variables with a joint distribution (discrete or continuous). Prove that Cov(X, Y) = E(XY) − E(X)E(Y). (15 points) 3. Explain in detail how we can derive the formula Var(X) = E(X²) − μ² from the formula in Problem 2 above. (Please do not use any other method of proof.) (10 points)
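A sketch of how the two parts fit together, writing μ_X = E(X) and μ_Y = E(Y) and expanding the defining expectation:

```latex
\begin{align*}
\mathrm{Cov}(X,Y)
  &= E\big[(X - \mu_X)(Y - \mu_Y)\big]
   = E\big[XY - \mu_X Y - \mu_Y X + \mu_X \mu_Y\big] \\
  &= E(XY) - \mu_X E(Y) - \mu_Y E(X) + \mu_X \mu_Y
   = E(XY) - E(X)E(Y).
\end{align*}
```

For Problem 3, take Y = X in this formula: Var(X) = Cov(X, X) = E(X·X) − E(X)E(X) = E(X²) − μ², where μ = E(X).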