Let p0 = P(X = 1) and suppose that 0 < p0 < 1. Let μ = E(X) and σ² = var(X).
a.) Find E[X | X ≠ 1]
b.) Find var(X | X ≠ 1)
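Splitting E[X] over the events {X = 1} and {X ≠ 1} suggests the closed forms E[X | X ≠ 1] = (μ − p0)/(1 − p0) and var(X | X ≠ 1) = (σ² + μ² − p0)/(1 − p0) − (E[X | X ≠ 1])². A small Python sanity check on a hypothetical three-point distribution (the pmf values are invented purely for illustration):

```python
from fractions import Fraction as F

# Hypothetical discrete distribution for illustration: P(X=0)=1/4, P(X=1)=1/2, P(X=2)=1/4
pmf = {0: F(1, 4), 1: F(1, 2), 2: F(1, 4)}
p0 = pmf[1]
mu = sum(x * p for x, p in pmf.items())
m2 = sum(x * x * p for x, p in pmf.items())
sigma2 = m2 - mu**2

# Candidate closed forms, obtained by splitting E[X] and E[X^2] over {X=1} vs {X!=1}
cond_mean = (mu - p0) / (1 - p0)
cond_var = (sigma2 + mu**2 - p0) / (1 - p0) - cond_mean**2

# Direct computation from the conditional pmf P(X=x | X != 1) = P(X=x)/(1-p0)
cpmf = {x: p / (1 - p0) for x, p in pmf.items() if x != 1}
direct_mean = sum(x * p for x, p in cpmf.items())
direct_var = sum(x * x * p for x, p in cpmf.items()) - direct_mean**2

print(cond_mean == direct_mean, cond_var == direct_var)  # True True
```

Exact `Fraction` arithmetic is used so the comparison is equality, not a floating-point tolerance.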
3. Let X be a continuous random variable with E(X) = μ and Var(X) = σ² < ∞. Suppose we try to estimate μ using two estimators μ̂1 and μ̂2 (their definitions, involving constants a and b, were lost in extraction) computed from a random sample X1, …, Xn. For what a and b are both estimators unbiased and the relative efficiency of μ̂1 to μ̂2 equal to 45n?
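Since the two estimators themselves were lost in extraction, here is only a generic illustration of the relative-efficiency concept the question asks about: for two unbiased estimators, eff(μ̂1, μ̂2) = Var(μ̂2)/Var(μ̂1). The sample size, variance, and both estimator choices below are invented for the sketch:

```python
import random, statistics

random.seed(0)
n, sigma2, reps = 20, 4.0, 20000

# Illustrative estimators only (the problem's own estimators were lost in extraction):
# mu1_hat = sample mean, mu2_hat = average of the first and last observations.
est1, est2 = [], []
for _ in range(reps):
    x = [random.gauss(5.0, sigma2 ** 0.5) for _ in range(n)]
    est1.append(sum(x) / n)
    est2.append((x[0] + x[-1]) / 2)

# Both are unbiased; Var(mu1_hat) = sigma^2/n, Var(mu2_hat) = sigma^2/2,
# so the relative efficiency of mu1_hat to mu2_hat is n/2 here.
eff = statistics.variance(est2) / statistics.variance(est1)
print(round(eff, 1))  # should be near n/2 = 10
```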
2. Suppose that ξ1, ξ2, … are i.i.d. RVs with Eξ1 = μ and Var(ξ1) = σ² ∈ (0, ∞). Set Xk = 3ξk + 2, k = 1, 2, …, and let Sn = X1 + ⋯ + Xn, n ≥ 1.
(a) Compute EXk, Var(Xk) and Cov(Xj, Xk) for j ≠ k.
(b) Find the limit lim_{n→∞} P((Sn − ESn)/√(n Var(X1)) ≤ x), x ∈ R. Hint: write Sn as a sum of independent RVs. From the form of the expression, one could expect that the answer will be in terms of the standard normal DF Φ.
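Assuming the garbled definition reads Xk = 3ξk + 2 (the OCR is ambiguous on this point), one has EXk = 3μ + 2, Var(Xk) = 9σ², Cov(Xj, Xk) = 0 by independence, and the limit in (b) is Φ(x) by the CLT. A simulation sketch with assumed parameter values checks the x = 0 case:

```python
import random

random.seed(1)
mu, sigma = 1.0, 2.0   # illustrative values (assumed; the problem keeps them symbolic)
n, reps = 200, 5000

# Assuming X_k = 3*xi_k + 2 with xi_k i.i.d.(mu, sigma^2):
# E X_k = 3*mu + 2, Var X_k = 9*sigma^2, Cov(X_j, X_k) = 0 for j != k,
# and (S_n - E S_n)/sqrt(n Var(X_1)) -> N(0,1) by the CLT.
count = 0
for _ in range(reps):
    s = sum(3 * random.gauss(mu, sigma) + 2 for _ in range(n))
    z = (s - n * (3 * mu + 2)) / (n * 9 * sigma ** 2) ** 0.5
    if z <= 0:
        count += 1
phat = count / reps
print(round(phat, 2))  # near Phi(0) = 0.5
```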
4. Suppose Y1, …, Yn are iid random variables with E(Y) = μ, Var(Y) = σ² < ∞. For large n, find the approximate distribution of Ȳ = (1/n) Σ_{i=1}^n Yi. Be sure to name any theorems you used.
Please explain very carefully!
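The standard route for problem 4 is the Central Limit Theorem: Ȳ is approximately N(μ, σ²/n). A simulation sketch with an assumed Exp(1) sample (so μ = σ² = 1; the distribution choice is purely illustrative) shows the mean and the σ²/n variance scaling:

```python
import random, statistics

random.seed(2)
n, reps = 50, 10000
lam = 1.0  # illustrative Exp(1) draws: mu = 1, sigma^2 = 1 (assumed example distribution)

means = [sum(random.expovariate(lam) for _ in range(n)) / n for _ in range(reps)]

# By the CLT, Ybar is approximately N(mu, sigma^2/n) = N(1, 1/50):
# the sample mean of the means should be near 1, and n * Var(Ybar) near sigma^2 = 1.
print(round(statistics.mean(means), 2), round(statistics.variance(means) * n, 1))
```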
4. Suppose that x = (x1, …, xn) is a sample from a N(μ, σ²) distribution where μ ∈ R, σ² > 0 are unknown.
(a) (5 marks) Let μ + σz_p denote the p-th quantile of the N(μ, σ²) distribution. What does this mean?
(b) (10 marks) Determine a UMVU estimate of μ + σz_p and justify your answer.
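One standard ingredient for part (b): for normal samples E(S) = c4(n)·σ with c4(n) = √(2/(n−1))·Γ(n/2)/Γ((n−1)/2), so S/c4(n) is unbiased for σ and can be combined with X̄ to estimate μ + σz_p. A simulation check of that unbiasing constant (the parameter values below are assumed for illustration):

```python
import math, random, statistics

random.seed(3)
n, reps = 10, 20000
mu, sigma = 2.0, 1.5  # illustrative parameter values (assumed)

# For normal samples, E(S) = c4(n) * sigma, where
# c4(n) = sqrt(2/(n-1)) * Gamma(n/2) / Gamma((n-1)/2),
# so S/c4 is unbiased for sigma -- the usual route (via completeness of
# (Xbar, S)) to an unbiased estimate of mu + sigma*z_p.
c4 = math.sqrt(2 / (n - 1)) * math.gamma(n / 2) / math.gamma((n - 1) / 2)

s_vals = []
for _ in range(reps):
    x = [random.gauss(mu, sigma) for _ in range(n)]
    s_vals.append(statistics.stdev(x))

print(round(statistics.mean(s_vals) / c4, 2))  # near sigma = 1.5
```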
Problem 4. Suppose X ~ N(0, 1).
(1) Explain the density of X in terms of a diffusion process.
(2) Calculate E(X), E(X²), and Var(X).
(3) Let Y = μ + σX. Calculate E(Y) and Var(Y). Find the density of Y.
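For part (3), the change-of-variables formula gives f_Y(y) = φ((y − μ)/σ)/σ, which is exactly the N(μ, σ²) density. A deterministic check at one point (the μ, σ, y values are arbitrary illustrations):

```python
import math

def phi(z):  # standard normal density
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def f_Y(y, mu, sigma):
    # change of variables: if Y = mu + sigma*X then f_Y(y) = phi((y - mu)/sigma) / sigma
    return phi((y - mu) / sigma) / sigma

mu, sigma = 3.0, 2.0  # illustrative values (assumed)
y = 4.0
# compare with the N(mu, sigma^2) density written out directly
direct = math.exp(-((y - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))
print(abs(f_Y(y, mu, sigma) - direct) < 1e-12)  # True
```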
3. Let X1, …, Xn be i.i.d. Lognormal(μ, σ²).
(a) Suppose σ = 1; prove that S = X(n)/X(1) is an ancillary statistic.
(b) Suppose μ = 0; prove that T = X(n) is a sufficient and complete statistic.
(c) Find a minimal sufficient statistic.
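The intuition behind (a): with σ = 1, Xi = exp(μ + Zi) for Zi ~ N(0, 1), so X(n)/X(1) = exp(Z(n) − Z(1)), which does not involve μ at all. A sketch using common Z's to show that the ratio is literally the same function of (Z1, …, Zn) for any μ (the μ values are arbitrary):

```python
import random, math

random.seed(4)
n = 5

# With sigma = 1, X_i = exp(mu + Z_i), so the factor exp(mu) cancels in
# X_(n)/X_(1) = exp(Z_(n) - Z_(1)): the statistic S is ancillary for mu.
z = [random.gauss(0, 1) for _ in range(n)]

def S(mu):
    x = [math.exp(mu + zi) for zi in z]
    return max(x) / min(x)

print(abs(S(0.0) - S(7.3)) / S(0.0) < 1e-9)  # identical for any mu (common Z's)
```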
4. Suppose Yi Y, are id randonn variables with E(Y )-μ, Var(Y)= σ2 < o For large n, find the approximaate distribution of YBeure to name any theorems you used.
5.26 Suppose that y is Nn(μ, Σ), where μ = μj (j a vector of 1's) and σij = σ²ρ for all i ≠ j. Thus E(yi) = μ for all i, var(yi) = σ² for all i, and cov(yi, yj) = σ²ρ for i ≠ j; that is, the y's are equicorrelated.
(a) Show that Σ can be written in the form Σ = σ²[(1 − ρ)I + ρJ].
(b) Show that Σ_i (yi − ȳ)²/[σ²(1 − ρ)] is χ²(n − 1).
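A simulation sketch for (b): equicorrelated normals (for ρ ≥ 0) can be generated with a shared factor, yi = μ + σ(√ρ·W + √(1−ρ)·Zi), and the quadratic form should average its χ²(n − 1) mean of n − 1. All parameter values are assumed for illustration:

```python
import random

random.seed(5)
n, mu, sigma, rho, reps = 6, 1.0, 2.0, 0.4, 20000

# Shared-factor construction: var(y_i) = sigma^2*(rho + (1-rho)) = sigma^2 and
# cov(y_i, y_j) = sigma^2*rho, matching Sigma = sigma^2[(1-rho)I + rho*J].
# Centering by ybar removes the common factor W, so the quadratic form in (b)
# reduces to sum((Z_i - Zbar)^2), which is exactly chi^2(n-1).
total = 0.0
for _ in range(reps):
    w = random.gauss(0, 1)
    y = [mu + sigma * (rho ** 0.5 * w + (1 - rho) ** 0.5 * random.gauss(0, 1))
         for _ in range(n)]
    ybar = sum(y) / n
    total += sum((yi - ybar) ** 2 for yi in y) / (sigma ** 2 * (1 - rho))

print(round(total / reps, 1))  # chi^2(n-1) has mean n-1 = 5
```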
4.5-8. Let X and Y have a bivariate normal distribution with parameters μX = 10, σ²X = 9, μY = 15, σ²Y = 16, and ρ = 0. Find
(a) P(13.6 < Y < 17.2);
(b) E(Y | x);
(c) Var(Y | x);
(d) P(13.6 < Y < 17.2 | X = 9.1).
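Taking ρ = 0 as printed, X and Y are independent, so (d) equals (a), E(Y | x) = μY, and Var(Y | x) = σ²Y. Part (a) reduces to standardizing Y; a sketch using the erf-based normal CDF:

```python
from math import erf, sqrt

def Phi(z):  # standard normal CDF via the error function
    return 0.5 * (1 + erf(z / sqrt(2)))

muY, sigY = 15.0, 4.0  # sigma_Y^2 = 16
# (a) P(13.6 < Y < 17.2) = Phi((17.2-15)/4) - Phi((13.6-15)/4) = Phi(0.55) - Phi(-0.35)
p = Phi((17.2 - muY) / sigY) - Phi((13.6 - muY) / sigY)
print(round(p, 4))

# With rho = 0 the joint density factors, so X and Y are independent:
# (b) E(Y|x) = muY + rho*(sigY/sigX)*(x - muX) = 15, (c) Var(Y|x) = sigY^2*(1 - rho^2) = 16,
# (d) P(13.6 < Y < 17.2 | X = 9.1) equals the unconditional probability in (a).
```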
5.13. Suppose X1, X2, …, Xn are iid N(μ, σ²), where −∞ < μ < ∞ and σ² > 0.
(a) Consider the statistic cS², where c is a constant and S² is the usual sample variance (denominator n − 1). Find the value of c that minimizes the mean squared error E[(cS² − σ²)²].
(b) Consider the normal subfamily where σ² = μ², with μ > 0. Let S denote the sample standard deviation. Find a linear combination c1·X̄ + c2·S whose expectation is equal to μ. Find the...
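For part (a), using E(S²) = σ² and Var(S²) = 2σ⁴/(n − 1) for normal samples, the MSE is a quadratic in c whose minimizer works out to c = (n − 1)/(n + 1). A numeric check over a grid (n and σ² are assumed for illustration):

```python
n, sigma2 = 10, 3.0  # illustrative values (assumed)

# For normal samples, E(S^2) = sigma^2 and Var(S^2) = 2*sigma^4/(n-1), so
# MSE(c) = E[(c*S^2 - sigma^2)^2] = (c^2*(2/(n-1) + 1) - 2*c + 1) * sigma^4.
def mse(c):
    return (c * c * (2 / (n - 1) + 1) - 2 * c + 1) * sigma2 ** 2

c_star = (n - 1) / (n + 1)  # the calculus answer: minimizer of the quadratic above
grid = [i / 1000 for i in range(1, 2001)]
best = min(grid, key=mse)
print(c_star, best)  # best grid point sits at (n-1)/(n+1) ~ 0.818
```

Note that minimizing var(cS²) alone would trivially give c = 0, which is why the MSE reading of the garbled original is the sensible one.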