5) Let Y ~ Beta(n, m). (b) Show that −log Y has the same distribution as Σ Xk, where...
7. Suppose that X1, ..., Xk are independent random variables, with Xi ~ Exp(βi) for i = 1, ..., k. Let Y = min(X1, ..., Xk). Show that Y ~ Exp(Σ_{i=1}^k βi).
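A quick simulation sketch of this fact, assuming the rate parameterization (Exp(βi) has mean 1/βi; the rates chosen below are arbitrary illustrations, not from the problem): the minimum should behave like an Exp variable with rate β1 + ... + βk.

```python
import numpy as np

rng = np.random.default_rng(0)
betas = np.array([0.5, 1.0, 2.0])  # hypothetical rates beta_i
n = 200_000
# column i holds Exp(beta_i) draws; NumPy's exponential takes scale = 1/rate
draws = rng.exponential(scale=1.0 / betas, size=(n, len(betas)))
y = draws.min(axis=1)
# sample mean of the minimum vs. the conjectured mean 1 / (beta_1 + ... + beta_k)
print(y.mean(), 1.0 / betas.sum())
```

Under the scale (mean) parameterization the same argument gives Y ~ Exp with mean (Σ βi⁻¹)⁻¹, which is why the convention matters.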
Let βˆ = (X′X)−1X′y where y ∼
N(Xβ,σ2I), X is an n×(k+1) matrix, and β is a (k+1)×1 vector. Are
βˆ′A′[A(X′X)−1A′]−1Aβˆ and y′[I − X(X′X)−1X′]y independent?
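The claimed independence follows because β̂ and the residual vector are independent under normality; a simulation can at least check the necessary condition that the two quadratic forms are uncorrelated. The design matrix, true β, and the 2×(k+1) matrix A below are arbitrary illustrative choices, not given in the problem.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, reps = 20, 2, 50_000
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])  # n x (k+1) design
beta = np.array([1.0, -2.0, 0.5])                           # hypothetical true beta
A = np.array([[0.0, 1.0, 0.0],                              # hypothetical full-rank
              [0.0, 0.0, 1.0]])                             # 2 x (k+1) matrix A
XtX_inv = np.linalg.inv(X.T @ X)
M = np.eye(n) - X @ XtX_inv @ X.T                           # I - X(X'X)^{-1}X'
Vinv = np.linalg.inv(A @ XtX_inv @ A.T)

Y = X @ beta + rng.normal(size=(reps, n))                   # reps draws of y
Bhat = Y @ X @ XtX_inv                                      # each row is betahat'
Ab = Bhat @ A.T
q1 = np.einsum('ij,jk,ik->i', Ab, Vinv, Ab)                 # betahat'A'[A(X'X)^{-1}A']^{-1}A betahat
q2 = np.einsum('ij,ij->i', Y @ M, Y)                        # y'[I - X(X'X)^{-1}X']y
print(np.corrcoef(q1, q2)[0, 1])                            # should be near 0
```

Zero correlation is only evidence, not proof; the actual argument uses the independence of Aβ̂ (a function of P_X y) and the residual sum of squares (a function of (I − P_X)y).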
QUESTION 2. Let X1, ..., Xn be a random sample from a N(μ, σ²) distribution, and let S² = Σ(Xi − X̄)²/(n − 1) and S̃² = ((n − 1)/n) S² be two estimators of σ². Given: E(S²) = σ² and V(S²) = 2σ⁴/(n − 1). (a) Determine: (i) E(S̃²); (ii) V(S̃²); and (iii) MSE(S̃²). (b) Which of S² and S̃² has a larger mean square error? (c) Suppose that θ̂n is an estimator of θ based on a random sample of size n. Another equivalent definition of...
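A simulation sketch of part (b), assuming S̃² = ((n − 1)/n) S² as reconstructed above (n = 10 and σ² = 1 are arbitrary choices): the scaled-down estimator trades a little bias for lower variance, and for normal data its MSE comes out smaller.

```python
import numpy as np

rng = np.random.default_rng(2)
n, sigma2, reps = 10, 1.0, 200_000
x = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
s2 = x.var(axis=1, ddof=1)            # unbiased S^2
s2_tilde = (n - 1) / n * s2           # biased, scaled estimator
mse_s2 = np.mean((s2 - sigma2) ** 2)
mse_tilde = np.mean((s2_tilde - sigma2) ** 2)
print(mse_s2, mse_tilde)              # mse_tilde should be the smaller one
```

The analytic comparison uses MSE = variance + bias², with MSE(S²) = 2σ⁴/(n − 1) from the given variance.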
3. X is a continuous RV with pdf f(x) and CDF F(x). a) Derive the distribution of Y = F(X). b) Show that Z = −2 ln(Y) has a Gamma distribution, and derive it. 4. Xi ~ continuous with pdf fi(x) and CDF Fi(x), i = 1, 2, ..., k, all independent. Define Yi = Fi(Xi), i = 1, ..., k. Derive the distribution of
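A simulation sketch of problem 3, using X ~ Exp(1) as a convenient concrete choice (any continuous distribution would do): Y = F(X) should be Uniform(0, 1), and Z = −2 ln(Y) should be Gamma with shape 1 and scale 2, i.e. χ² with 2 degrees of freedom (mean 2, variance 4).

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500_000
x = rng.exponential(scale=1.0, size=n)   # X ~ Exp(1), an illustrative choice
y = 1.0 - np.exp(-x)                     # Y = F(X) with F(x) = 1 - e^{-x}
z = -2.0 * np.log(y)                     # probability integral transform, then -2 ln
print(y.mean())                          # should be close to 0.5 (uniform)
print(z.mean(), z.var())                 # should be close to 2 and 4 (chi-square_2)
```

Note that −2 ln(1 − Y) works equally well, since Y and 1 − Y have the same uniform distribution.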
Let Y_1~Gamma(α=3,β=3), Y_2~Gamma(α=5,β=1), and W=2Y_1+6Y_2.
a) (9 pts) Find the moment generating function of W. Justify all steps. b) (3 pts) Based on your result in part (a), what is the distribution of W (name and parameters)?

Useful facts:
2. If Z ~ N(0, 1), then Z² ~ χ²(1).
3. If Y ~ Gamma(α, β) and W = 2Y/β, then W ~ χ²(2α).
7. If Yi ~ Poisson(λi) (i = 1, ..., k, independent) and U = Σ Yi, then U ~ Poisson(Σ λi).
8. If Yi ~ Gamma(αi, β) (i = 1, ..., k, independent) and U = Σ Yi, then U ~ Gamma(Σ αi, β). (Note: all same β.)
9. ...
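A numerical check of the MGF argument, assuming β is the scale parameter: M_W(t) = M_{Y1}(2t) M_{Y2}(6t) = (1 − 6t)⁻³ (1 − 6t)⁻⁵ = (1 − 6t)⁻⁸, so W should match Gamma(α = 8, β = 6), whose mean is 48 and variance 288.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500_000
# beta interpreted as the scale parameter (assumption about the convention)
y1 = rng.gamma(shape=3, scale=3, size=n)   # Y1 ~ Gamma(3, 3)
y2 = rng.gamma(shape=5, scale=1, size=n)   # Y2 ~ Gamma(5, 1)
w = 2 * y1 + 6 * y2
print(w.mean(), w.var())                   # should be close to 48 and 288
```

The key step is that 2Y1 and 6Y2 both land on the common scale 6, which is exactly what lets fact 8 (same β required) apply.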
Let y = Xβ + ε where ε ~ N(0, σ²I). Let β̂ = (XᵀX)⁻¹Xᵀy and let ê = y − Xβ̂. (a) Show that ê = (I − P_X)ε, where P_X = X(XᵀX)⁻¹Xᵀ. (b) Compute E(ê − ε). (c) Compute Var(ê − ε).
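Part (a) is an algebraic identity, so it can be verified exactly on a random instance (the dimensions below are arbitrary): the residual vector equals (I − P_X) applied to the noise, because (I − P_X)X = 0.

```python
import numpy as np

rng = np.random.default_rng(5)
n, p = 15, 3
X = rng.normal(size=(n, p))
beta = rng.normal(size=p)
eps = rng.normal(size=n)
y = X @ beta + eps
P = X @ np.linalg.inv(X.T @ X) @ X.T              # P_X, the hat matrix
bhat = np.linalg.lstsq(X, y, rcond=None)[0]        # least squares betahat
ehat = y - X @ bhat                                # residuals
print(np.allclose(ehat, (np.eye(n) - P) @ eps))    # True
```

Since ê − ε = −P_X ε, parts (b) and (c) reduce to the mean and covariance of a linear function of ε.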
(5) Fix m ≥ 1, an integer, and suppose P ~ Uniform([0, 1]) and N | P ~ Binomial(m, P). (a) Determine E(χk(N) | P), where χk(n), k = 0, 1, 2, ..., are defined as follows: χk(n) = 1 if n = k, and 0 otherwise. (b) Determine E(χk(N)h(N)) for a general function h : R → R. (c) Determine E(P | N). Warning: E(P | N) is not N/m as you might be tempted to guess. Hint: Use the law of total probability together with the following result which you showed (in greater...
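The classical answer to (c) is the rule of succession, E(P | N) = (N + 1)/(m + 2) (the uniform prior updates to a Beta(N + 1, m − N + 1) posterior); a simulation sketch with an arbitrary m and one illustrative value of k can check it against the naive guess N/m.

```python
import numpy as np

rng = np.random.default_rng(6)
m, reps = 10, 400_000
p = rng.uniform(size=reps)       # P ~ Uniform([0, 1])
n = rng.binomial(m, p)           # N | P ~ Binomial(m, P)
k = 4                            # illustrative conditioning value
# empirical E(P | N = k) vs. (k + 1)/(m + 2) vs. the tempting k/m
print(p[n == k].mean(), (k + 1) / (m + 2), k / m)
```

The empirical conditional mean sits at (k + 1)/(m + 2), visibly shrunk toward 1/2 relative to k/m, which is the point of the Warning.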
Please show every step, thank you.
Let Xi ~ N(μ, σi²), where the σi² are known and positive for i = 1, ..., n, and X1, ..., Xn are independent. Let μ̂ = Σi (Xi/σi²) / Σi (1/σi²) be the MLE of μ. (a) Find the mean and variance of μ̂. (b) Compare μ̂ to X̄ = n⁻¹ Σi Xi as an estimator of μ.
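A simulation sketch of part (b), with arbitrary illustrative variances: the precision-weighted MLE has variance 1/Σ(1/σi²), which is never larger than Var(X̄) = Σσi²/n², and is strictly smaller when the σi² differ.

```python
import numpy as np

rng = np.random.default_rng(7)
sigma2 = np.array([0.5, 1.0, 4.0, 9.0])   # hypothetical known variances
mu, reps = 2.0, 300_000
x = rng.normal(mu, np.sqrt(sigma2), size=(reps, len(sigma2)))
w = (1 / sigma2) / (1 / sigma2).sum()     # precision weights
mu_hat = x @ w                            # weighted MLE of mu
x_bar = x.mean(axis=1)                    # plain sample mean
print(mu_hat.var(), 1 / (1 / sigma2).sum())   # should agree
print(x_bar.var(), sigma2.sum() / len(sigma2) ** 2)
```

Both estimators are unbiased, so the variance comparison is the whole story here.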
In 10.11, let X1, X2, ..., Xn and Y1, Y2, ..., Ym be independent samples from N(μ, σ1²) and N(μ, σ2²), respectively, where μ, σ1², σ2² are unknown. Let ρ = σ2²/σ1² and g = m/n, and consider the problem of unbiased estimation of μ.
Exercise 2.6: Consider the models y = Xβ + ε and y* = X*β + ε*, where E(ε) = 0, cov(ε) = σ²I, y* = Γy, X* = ΓX, ε* = Γε, and Γ is a known n × n orthogonal matrix. Show that: 1. E(ε*) = 0, cov(ε*) = σ²I. 2. b* = b and s*² = s², where b and b* are the least squares estimates of β, and s² and s*² are the estimates of σ² obtained from the two models.
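Claim 2 is exact linear algebra ((ΓX)ᵀ(ΓX) = XᵀX and ‖Γv‖ = ‖v‖ for orthogonal Γ), so it can be checked to machine precision on a random instance; the dimensions and the QR-generated orthogonal Γ below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(8)
n, p = 12, 3
X = rng.normal(size=(n, p))
y = X @ rng.normal(size=p) + rng.normal(size=n)
G, _ = np.linalg.qr(rng.normal(size=(n, n)))        # random orthogonal Gamma
b = np.linalg.lstsq(X, y, rcond=None)[0]            # LS estimate, original model
b_star = np.linalg.lstsq(G @ X, G @ y, rcond=None)[0]  # LS estimate, rotated model
s2 = np.sum((y - X @ b) ** 2) / (n - p)
s2_star = np.sum((G @ y - G @ X @ b_star) ** 2) / (n - p)
print(np.allclose(b, b_star), np.isclose(s2, s2_star))  # True True
```

The takeaway is that an orthogonal transformation of the data leaves every least squares quantity of interest unchanged.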