Consider two sequences of random variables X1, X2, ... and Y1, Y2, ... . Suppose that Xn converges...
Let X1 and X2 be two random variables, and let Y = (X1 + X2)^2. Suppose that E[Y] = 25 and that the variances of X1 and X2 are 9 and 16, respectively. Suppose that both X1 and X2 have mean zero. Then the...
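Whatever the truncated question goes on to ask, the key step is the variance decomposition of the sum; with zero means it pins down the covariance:

```latex
% Since E[X_1] = E[X_2] = 0, we have E[Y] = E[(X_1 + X_2)^2] = \mathrm{Var}(X_1 + X_2), so
25 = \mathrm{Var}(X_1) + \mathrm{Var}(X_2) + 2\,\mathrm{Cov}(X_1, X_2)
   = 9 + 16 + 2\,\mathrm{Cov}(X_1, X_2)
\;\Longrightarrow\; \mathrm{Cov}(X_1, X_2) = 0.
```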
Let X1, X2, ... be a sequence of random variables. Suppose that Xn → a in probability for some a ∈ R. Show that (Xn) is Cauchy convergent in probability, that is, show that for all ε > 0 we have P(|Xn − Xm| > ε) → 0 as n, m → ∞. Is the converse true? (Prove if “yes”, find a counterexample if “no”.)
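For the forward direction, a one-line triangle-inequality bound does most of the work (a proof sketch, not the full argument):

```latex
% If |X_n - X_m| > \varepsilon, then at least one of X_n, X_m is more than
% \varepsilon/2 away from a, so by the union bound
P\bigl(|X_n - X_m| > \varepsilon\bigr)
  \le P\bigl(|X_n - a| > \varepsilon/2\bigr)
  + P\bigl(|X_m - a| > \varepsilon/2\bigr)
  \longrightarrow 0 \quad \text{as } n, m \to \infty.
```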
8. Let X1, X2, ..., Xn be independent U(0, 1) random variables and let Mn = max(X1, X2, ..., Xn).
- Show that Mn →p 1, that is, Mn converges in probability to 1 as n → ∞.
- Show that n(1 − Mn) →d Exp(1), that is, n(1 − Mn) converges in distribution to an exponential r.v. with mean 1 as n → ∞.
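Both claims are easy to see numerically; a quick Monte Carlo sketch (sample sizes and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 2000, 20000

# reps independent samples of size n from U(0, 1)
M = rng.random((reps, n)).max(axis=1)   # sample maxima M_n
T = n * (1.0 - M)                       # rescaled gaps n(1 - M_n)

# M_n -> 1 in probability: P(|M_n - 1| > 0.01) should be tiny for large n
print((np.abs(M - 1) > 0.01).mean())

# n(1 - M_n) -> Exp(1): mean near 1 and P(T > t) near exp(-t)
print(T.mean(), (T > 1.0).mean())
```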
Consider a random sample (X1, Y1), (X2, Y2), ..., (Xn, Yn) where Y | X = x is modeled by a N(β0 + β1 x, σ^2) distribution, where β0, β1 and σ^2 are unknown. (a) Prove that the MLE of β1 is an unbiased estimator of β1. (b) Prove that the MLE of β0 is an unbiased estimator of β0.
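Since the normal-model MLEs of β0 and β1 coincide with the least-squares estimates, a simulation can sanity-check unbiasedness (all numeric values below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
b0, b1, sigma = 2.0, -0.5, 1.5      # arbitrary "true" parameters
n, reps = 50, 5000

x = rng.uniform(0, 10, size=n)      # fixed design, reused every replicate
est = np.empty((reps, 2))
for r in range(reps):
    y = b0 + b1 * x + rng.normal(0, sigma, size=n)
    # least-squares (= normal-model MLE) slope and intercept
    b1_hat = np.cov(x, y, bias=True)[0, 1] / x.var()
    b0_hat = y.mean() - b1_hat * x.mean()
    est[r] = (b0_hat, b1_hat)

print(est.mean(axis=0))             # averages of the two MLEs over replicates
```

The replicate averages should sit close to (2.0, -0.5), consistent with unbiasedness.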
4. Let X1, X2, ... be independent random variables satisfying E[(Xn − E(Xn))^4] ≤ B for some finite B > 0.
(a) Show that Yn = Xn − E(Xn) are independent and satisfy E(Yn) = 0, E(Yn^4) ≤ B and, by Jensen's inequality, E(Yn^2) ≤ √B.
(b) Show that for Ȳn = (Y1 + ... + Yn)/n,
E(Ȳn^4) = (1/n^4) Σ_{i=1}^n E(Yi^4) + (6/n^4) Σ_{i<j} E(Yi^2) E(Yj^2) ≤ B/n^3 + 3B/n^2.
(c) Show that Σ_n P(|Ȳn| > ε) < ∞ and conclude Ȳn → 0 almost surely.
(d) Show that (X1 + ...
Suppose that X1, X2, ..., Xn and Y1, Y2, ..., Yn are independent random samples from populations with the same mean μ and variances σ1^2 and σ2^2, respectively. That is, Xi ~ N(μ, σ1^2) and Yi ~ N(μ, σ2^2). Show that (2X̄ + 3Ȳ)/5 is a consistent estimator of μ.
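Reading the damaged display as the convex combination (2X̄ + 3Ȳ)/5 (the denominator line appears lost, and /5 is what makes the estimator unbiased), a simulation shows the estimate tightening around μ as n grows:

```python
import numpy as np

rng = np.random.default_rng(2)
mu, s1, s2 = 4.0, 3.0, 5.0              # arbitrary true mean and sds

def estimate(n):
    # one draw of the (assumed) estimator (2*Xbar + 3*Ybar) / 5
    x = rng.normal(mu, s1, size=n)
    y = rng.normal(mu, s2, size=n)
    return (2 * x.mean() + 3 * y.mean()) / 5

for n in (10, 1000, 100000):
    ests = np.array([estimate(n) for _ in range(200)])
    print(n, ests.mean(), ests.std())   # spread shrinks like 1/sqrt(n)
```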
Consider a sequence of random variables X1, ..., Xn, ..., where for each n, Xn ~ t_n. We will use Slutsky's Theorem to show that as the degrees of freedom go to infinity, the t distribution converges to a standard normal. (a) Let V1, ..., Vn, ... be such that Vn ~ χ²_n. Find the value b such that Vn/n →p b. (b) Letting U ~ N(0, 1) be independent of each Vn, show that Tn = U/√(Vn/n) ~ t_n and that Tn →d N(0, 1).
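The χ² concentration in part (a) and the normal limit in part (b) are both easy to check numerically (the limit in (a) is the mean of a χ²_1 variable, by the law of large numbers):

```python
import numpy as np

rng = np.random.default_rng(3)

# V_n ~ chi-square with n degrees of freedom; V_n / n concentrates at
# E[chi2_1] = 1, which is the b asked for in part (a).
for n in (10, 1000, 100000):
    v = rng.chisquare(n, size=5000)
    print(n, (v / n).mean(), (v / n).std())

# With U ~ N(0,1) independent of V_n, T_n = U / sqrt(V_n / n) is t_n;
# for large n the sample looks standard normal (Slutsky's theorem).
n = 100000
t = rng.normal(size=5000) / np.sqrt(rng.chisquare(n, size=5000) / n)
print(t.std())
```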
Q3 Suppose X1, X2, ..., Xn are i.i.d. Poisson random variables with expected value λ. It is well-known that X̄ is an unbiased estimator for λ because λ = E(X̄).
1. Show that (X1 + Xn)/2 is also an unbiased estimator for λ.
2. Show that S² = Σ(Xi − X̄)²/(n − 1) is also an unbiased estimator for λ.
3. Find MSE(S²). (We will need two facts.)
Fact 1: Var(S²) = (1/n)(μ4 − σ⁴(n − 3)/(n − 1)), where μ4 = E[(X − μ)^4]. (See .../questions/2476527/variance-of-sample-variance.)
Fact 2: For the Poisson distribution, E[(X − μ)^4] = 3λ² + λ. (See for...
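The unbiasedness claims in parts 1 and 2, and the MSE in part 3, can be sanity-checked by simulation (λ and n below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(4)
lam, n, reps = 3.0, 20, 200000          # arbitrary lambda and sample size

x = rng.poisson(lam, size=(reps, n)).astype(float)
xbar = x.mean(axis=1)
pair = (x[:, 0] + x[:, -1]) / 2         # (X1 + Xn) / 2
s2 = x.var(axis=1, ddof=1)              # sample variance with n - 1 divisor

# All three estimators should average out to lambda (unbiasedness)
print(xbar.mean(), pair.mean(), s2.mean())

# Monte Carlo MSE of S^2 as an estimator of lambda; since S^2 is
# unbiased, this should agree with Var(S^2) from Fact 1.
print(((s2 - lam) ** 2).mean())
```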
8. Let {Xn, n = 1, 2, ...} and {Yn, n = 1, 2, ...} be two sequences of random variables, defined on the same sample space. Suppose that we know Xn → X a.s. and Yn → Y a.s. Prove that Xn + Yn → X + Y a.s.
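Reading the convergence as almost-sure, the standard argument is short (a sketch):

```latex
% Outside a union of two null sets, both limits hold simultaneously:
P\bigl(X_n \to X \text{ and } Y_n \to Y\bigr)
  \ge 1 - P(X_n \not\to X) - P(Y_n \not\to Y) = 1,
% and on that probability-one event the ordinary algebra of limits gives
X_n(\omega) + Y_n(\omega) \;\longrightarrow\; X(\omega) + Y(\omega).
```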
Let (X1, Y1) and (X2, Y2) be independent and identically distributed continuous bivariate random variables with joint probability density function:
fX,Y(x, y) = e^(−y), 0 < x < y < ∞; = 0, elsewhere.
Evaluate P(X2 > X1, Y2 > Y1) + P(X2 < X1, Y2 < Y1).
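A Monte Carlo sketch of the target probability. Sampling uses the marginal Y ~ Gamma(2, 1) and X | Y = y ~ U(0, y), both of which follow from f(x, y) = e^(−y) on 0 < x < y. By relabeling symmetry the target equals 1 − 2·P(X1 < X2, Y1 > Y2), and integrating over the ordering x1 < x2 < y2 < y1 suggests the value 3/4:

```python
import numpy as np

rng = np.random.default_rng(5)
reps = 400000

def sample_pairs(m):
    # Marginal: f_Y(y) = y * e^{-y}, i.e. Gamma(shape=2, scale=1);
    # conditionally, X | Y = y is Uniform(0, y).
    y = rng.gamma(2.0, 1.0, size=m)
    x = rng.uniform(0.0, y)
    return x, y

x1, y1 = sample_pairs(reps)
x2, y2 = sample_pairs(reps)

# P(X2 > X1, Y2 > Y1) + P(X2 < X1, Y2 < Y1): both coordinates agree in order
both = ((x2 > x1) & (y2 > y1)) | ((x2 < x1) & (y2 < y1))
print(both.mean())
```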