Recall that the variance of a random variable is defined as Var[X] = E[(X − μ)^2], where μ = E[X]. Use the properties of expectation to show that the variance of a random variable X can be rewritten as Var[X] = E[X^2] − (E[X])^2.
Var(X) = E[(X − E[X])^2]
       = E[X^2 + (E[X])^2 − 2X·E[X]]
       = E[X^2] + E[(E[X])^2] − 2·E[X·E[X]]
       = E[X^2] + (E[X])^2 − 2·E[X]·E[X]
       = E[X^2] + (E[X])^2 − 2·(E[X])^2
Var(X) = E[X^2] − (E[X])^2
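The two sides of this identity can be checked numerically. The sketch below evaluates both the defining formula E[(X − μ)^2] and the shortcut E[X^2] − (E[X])^2 on a small discrete distribution; the values and probabilities are made up purely for illustration.

```python
# Numeric sanity check of Var(X) = E[X^2] - (E[X])^2 on a small
# hypothetical discrete distribution (values/probabilities are made up).
values = [1.0, 2.0, 5.0]
probs = [0.2, 0.5, 0.3]

ex = sum(p * x for x, p in zip(values, probs))                   # E[X]
ex2 = sum(p * x * x for x, p in zip(values, probs))              # E[X^2]
var_def = sum(p * (x - ex) ** 2 for x, p in zip(values, probs))  # E[(X - mu)^2]
var_short = ex2 - ex ** 2                                        # E[X^2] - (E[X])^2

print(var_def, var_short)  # the two values agree
```

Both computations return the same number, as the derivation above guarantees.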
2. (7 pt) Recall that the variance of a random variable X is defined by Var(X) = E[(X − EX)^2]. Select all statements that are correct for general random variables X, Y. Throughout, a, b are constants.
( ) Var(X) = E(X^2) − (EX)^2
( ) Var(aX + b) = a^2 Var(X) + b^2
( ) Var(aX + b) = a Var(X) + b
( ) Var(X + Y) = Var(X) + Var(Y)
( ) Var(X) ≥ 0
( ) Var(a) = 0
( ) Var(X^n) = (Var(X))^n
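Two of the rules involved here, Var(aX + b) = a^2 Var(X) (the shift b drops out entirely, with no b^2 term) and Var(a) = 0 for a constant, can be spot-checked with the standard library; the population values below are made up for the demonstration.

```python
# Spot-check of Var(aX + b) = a^2 * Var(X) and Var(a) = 0 using the
# statistics module on a small made-up population.
from statistics import pvariance

pop = [1.0, 3.0, 4.0, 8.0]
a, b = 2.0, 3.0

var_x = pvariance(pop)                          # population variance of X
var_ax_b = pvariance([a * x + b for x in pop])  # variance of aX + b
var_const = pvariance([b, b, b, b])             # variance of a constant

print(var_x, var_ax_b, var_const)
```

The check confirms that shifting by b leaves the spread unchanged while scaling by a multiplies the variance by a^2.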
Q2. More about operations with expectations and covariances. Recall that the variance of a random variable X is defined as Var(X) ≡ E[(X − E(X))^2], the covariance is Cov(X, Y) ≡ E[(X − E(X))(Y − E(Y))], and the correlation is Corr(X, Y) ≡ Cov(X, Y)/√(Var(X)·Var(Y)).
(a) What is the value of E[X − E(X)]? (Hint: Let μ denote E(X). Then the parameter μ is an unknown but fixed value, like a constant.) (0.5 pt)
(b) The following is the proof that Var(X) = E(X^2) − (E(X))^2:
Var(X) = E[(X − E(X))^2] = E(X^2) − 2·E(X)·E(X) + (E(X))^2 = E(X^2) − (E(X))^2.
In a similar way, prove that Cov(X, ...
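The truncated part (b) presumably asks for the analogous shortcut Cov(X, Y) = E[XY] − E[X]E[Y]. That identity can be checked numerically on a small joint distribution; the (x, y, p) triples below are arbitrary illustration values.

```python
# Numeric check of the covariance shortcut Cov(X, Y) = E[XY] - E[X]E[Y]
# on a small hypothetical joint distribution (triples are made up).
joint = [(0, 0, 0.2), (0, 1, 0.3), (1, 0, 0.1), (1, 1, 0.4)]

ex = sum(p * x for x, y, p in joint)                              # E[X]
ey = sum(p * y for x, y, p in joint)                              # E[Y]
exy = sum(p * x * y for x, y, p in joint)                         # E[XY]
cov_def = sum(p * (x - ex) * (y - ey) for x, y, p in joint)       # definition
cov_short = exy - ex * ey                                         # shortcut

print(cov_def, cov_short)  # the two values agree
```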
3. Let X be a continuous random variable with E(X) = μ and Var(X) = σ^2 < ∞. Suppose we try to estimate μ using these two estimators built from a random sample X_1, ..., X_n: For what a and b are both estimators unbiased and the relative efficiency of μ̂_1 to μ̂_2 equal to 45n?
5. Suppose X is a normally distributed random variable with mean μ and variance σ^2. Consider a new random variable, W = 2X + 3.
   i. What is E(W)?
   ii. What is Var(W)?
6. Suppose the random variables X and Y are jointly distributed. Define a new random variable, W = 2X + 3Y.
   i. What is Var(W)?
   ii. What is Var(W) if X and Y are independent?
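A minimal numeric sketch of both problems, using assumed values (μ = 1, σ^2 = 2 for X, and, for problem 6, Var(Y) = 5 and Cov(X, Y) = 0.5 are made-up numbers):

```python
# Problems 5-6 worked with assumed numbers: mu = 1, Var(X) = 2, and for
# problem 6 the made-up values Var(Y) = 5, Cov(X, Y) = 0.5.
mu_x, var_x = 1.0, 2.0
var_y, cov_xy = 5.0, 0.5

# Problem 5: W = 2X + 3
e_w = 2 * mu_x + 3        # E(W) = 2*mu + 3
var_w = 2**2 * var_x      # Var(W) = 4*sigma^2; the +3 shift drops out

# Problem 6: W = 2X + 3Y
var_w2 = 4 * var_x + 9 * var_y + 2 * 2 * 3 * cov_xy  # general case
var_w2_indep = 4 * var_x + 9 * var_y                 # Cov term vanishes

print(e_w, var_w, var_w2, var_w2_indep)
```

The general rule being applied is Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y), with the covariance term dropping when X and Y are independent.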
Let σ^2 be the variance of a random variable X. Show that σ^2 = μ'_2 − μ^2, where μ'_2 is the second moment about the origin and μ is the mean of X.
5) Let X be a random variable with mean E(X) = μ < ∞ and variance Var(X) = σ^2 ≠ 0. Show that for any c > 0,
   P(|X − μ| ≥ c) ≤ σ^2 / c^2.
This is a famous result known as Chebyshev's inequality. Suppose that Y_1, ..., Y_n are i.i.d. random variables with expectation E(Y_i) = μ and finite variance Var(Y_i) = σ^2 ≠ 0. With Ȳ = (1/n) Σ_{i=1}^n Y_i, show that for any c > 0,
   P(|Ȳ − μ| ≥ c) → 0 as n → ∞.
This is the celebrated Weak Law of Large Numbers.
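The Weak Law of Large Numbers can be illustrated by simulation: the fraction of sample means that land farther than c from μ shrinks as n grows. The distribution (Uniform(0, 1), so μ = 0.5) and the constants below are arbitrary choices for the demo.

```python
# Monte Carlo illustration of the Weak Law of Large Numbers: the
# fraction of sample means with |Ybar - mu| >= c shrinks as n grows.
# Distribution and constants are arbitrary demo choices.
import random

random.seed(0)
mu, c, trials = 0.5, 0.1, 2000  # Uniform(0, 1) has mean 0.5

def miss_rate(n):
    """Fraction of trials where the sample mean misses mu by >= c."""
    misses = 0
    for _ in range(trials):
        ybar = sum(random.random() for _ in range(n)) / n
        misses += abs(ybar - mu) >= c
    return misses / trials

small, large = miss_rate(5), miss_rate(200)
print(small, large)  # the rate for n = 200 is far below that for n = 5
```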
I. Suppose we have a simple random sample X_1, ..., X_n with mean μ = E(X_i) and variance σ^2 = Var(X_i). Find
(a) E[(X_i − μ)^2]
(d) E[(X̄ − μ)^2], where X̄ = (1/n) Σ_i X_i.
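Part (d) has the well-known answer σ^2/n, which a simulation can corroborate; the Gaussian parameters and sample size below are arbitrary demo choices.

```python
# Monte Carlo check that E[(Xbar - mu)^2] = sigma^2 / n for an i.i.d.
# sample; the Gaussian parameters and n are arbitrary demo choices.
import random

random.seed(1)
mu, sigma, n, trials = 3.0, 2.0, 10, 20000

sq_err = 0.0
for _ in range(trials):
    xbar = sum(random.gauss(mu, sigma) for _ in range(n)) / n
    sq_err += (xbar - mu) ** 2

mc_estimate = sq_err / trials  # Monte Carlo estimate of E[(Xbar - mu)^2]
theory = sigma**2 / n          # = 0.4 with these numbers
print(mc_estimate, theory)
```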
RANDOM VARIABLES AND DISTRIBUTIONS
Expectation and variance of a random variable
Let X be a random variable with the following probability distribution:

Value x of X:  −10    0     10    20
P(X = x):      0.35  0.40  0.10  0.15

Find the expectation E(X) and variance Var(X) of X. (If necessary, consult a list of formulas.)
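The table can be worked through directly with the shortcut formula. Note that the first value, −10, is a reconstruction of a partly garbled entry in the source, so the numbers below rest on that assumption.

```python
# E(X) and Var(X) for the distribution table above. The value -10 is an
# assumption: that entry is partly garbled in the source.
dist = {-10: 0.35, 0: 0.40, 10: 0.10, 20: 0.15}

e_x = sum(x * p for x, p in dist.items())        # E(X)
e_x2 = sum(x * x * p for x, p in dist.items())   # E(X^2)
var_x = e_x2 - e_x**2                            # E(X^2) - (E(X))^2

print(e_x, var_x)
```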
Let X be a random variable with E[X] = 2 and Var(X) = 4. Compute the expectation and variance of 3 − 2X.
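This is a direct application of the linearity rules E(a + bX) = a + b·E(X) and Var(a + bX) = b^2·Var(X), sketched below with the given numbers:

```python
# Applying E(a + bX) = a + b*E(X) and Var(a + bX) = b^2 * Var(X)
# to Y = 3 - 2X with E(X) = 2 and Var(X) = 4.
e_x, var_x = 2.0, 4.0
a, b = 3.0, -2.0

e_y = a + b * e_x     # 3 - 2*2 = -1
var_y = b**2 * var_x  # (-2)^2 * 4 = 16; the sign of b does not matter

print(e_y, var_y)
```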
3. (8 pt, 2 each) (Ross) Let X be a random variable taking values in the finite interval [0, c]. You may assume that X is discrete, though this is not necessary for this problem.
(a) Show that EX ≤ c and EX^2 ≤ c·EX.
(b) Use the inequalities above to show that Var(X) ≤ c^2[u(1 − u)], where u = EX/c ∈ [0, 1].
(c) Use the result of part (b) to show that Var(X) ≤ c^2/4.
(d) Use the result in (c) to bound the variance of a...
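The bound in part (c) can be probed empirically by generating many random discrete distributions supported on [0, c] and recording the largest variance seen; none should exceed c^2/4. The constants below are arbitrary demo choices.

```python
# Empirical probe of the bound Var(X) <= c^2 / 4 for X supported on
# [0, c]: random discrete distributions never exceed it. Constants
# are arbitrary demo choices.
import random

random.seed(2)
c = 3.0
worst = 0.0
for _ in range(200):
    # Random 5-point discrete distribution on [0, c].
    pts = [random.uniform(0.0, c) for _ in range(5)]
    w = [random.random() for _ in range(5)]
    total = sum(w)
    probs = [wi / total for wi in w]
    ex = sum(p * x for x, p in zip(pts, probs))
    var = sum(p * (x - ex) ** 2 for x, p in zip(pts, probs))
    worst = max(worst, var)

print(worst, c**2 / 4)  # worst observed variance stays below the bound
```

The extreme case, which the search approaches but never beats, is the distribution putting probability 1/2 at each endpoint 0 and c.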