Let {Xn} be a sequence of RVs with Xn~G(n,β), where β>0 is a constant (independent of n). Find the limiting distribution of Xn/n.
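A quick simulation sketch of this exercise, assuming G(n, β) means a Gamma distribution with shape n and rate β (numpy uses a scale parameter, so scale = 1/β is passed below; the choice β = 2 is arbitrary). Under that reading, Xn is distributed as a sum of n i.i.d. Exp(β) variables, so Xn/n should concentrate at 1/β, i.e. the limit is degenerate:

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 2.0  # hypothetical rate parameter, chosen only for illustration

# numpy parameterises the Gamma by shape and *scale*; reading G(n, beta) as
# shape n and rate beta (an assumption) gives scale 1/beta, so E[X_n] = n/beta
# and X_n/n should concentrate at 1/beta.
tail = {}
for n in (10, 100, 10_000):
    x = rng.gamma(shape=n, scale=1.0 / beta, size=50_000)
    tail[n] = float(np.mean(np.abs(x / n - 1.0 / beta) > 0.05))
print(tail)  # the exceedance probabilities shrink as n grows
```

The shrinking tail probabilities are consistent with convergence in probability to the constant 1/β.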
Question 6. Let X1, . . . , Xn denote a sequence of independent and identically distributed (i.i.d.) N(μ, σ²) random variables, and let Y1, . . . , Ym denote an independent sequence of i.i.d. N(μ, σ²) random variables. 1. Show that λX̄ + (1 − λ)Ȳ is an unbiased estimator of μ for any value of λ in the unit interval, i.e. 0 < λ < 1. 2. Verify that the variance of this estimator is minimised when . . . and determine the...
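A numerical sketch of part 2: with a common variance σ², Var(λX̄ + (1 − λ)Ȳ) = λ²σ²/n + (1 − λ)²σ²/m, and a grid search locates its minimiser. The sample sizes and variance below are hypothetical, and n/(n + m) is printed only as a candidate closed form to compare against, not as the exercise's stated answer:

```python
import numpy as np

# Hypothetical sample sizes and a common variance, chosen only for illustration.
n, m, sigma2 = 20, 30, 4.0

# Var(lam*Xbar + (1 - lam)*Ybar) = lam^2 * sigma2/n + (1 - lam)^2 * sigma2/m
lam = np.linspace(0.0, 1.0, 100_001)
var = lam**2 * sigma2 / n + (1 - lam) ** 2 * sigma2 / m
lam_star = float(lam[np.argmin(var)])
print(lam_star, n / (n + m))  # grid minimiser vs. the candidate n/(n+m)
```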
Exercise 5.23. Let (Xn)n≥1 be a sequence of i.i.d. Bernoulli(p) RVs. Let Sn = X1 + · · · + Xn. (i) Let Zn = (Sn − np)/√(np(1 − p)). Show that as n → ∞, Zn converges to the standard normal RV Z ~ N(0, 1) in distribution. (ii) Conclude that if Yn ~ Binomial(n, p), then (Yn − np)/√(np(1 − p)) converges to Z in distribution. (iii) From (ii), deduce that we have the following approximation P(Yn ≤ x) ≈ Φ((x − np)/√(np(1 − p))), which becomes more accurate as n → ∞.
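The approximation in part (iii) can be checked directly against the exact binomial CDF; the particular n, p, and x below are arbitrary test values, not part of the exercise:

```python
import math
from statistics import NormalDist  # standard normal CDF from the stdlib

# Compare the exact Binomial(n, p) CDF with the normal approximation
# Phi((x - n p) / sqrt(n p (1 - p))).
def binom_cdf(n, p, x):
    return sum(math.comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(int(x) + 1))

n, p, x = 200, 0.3, 65
exact = binom_cdf(n, p, x)
approx = NormalDist().cdf((x - n * p) / math.sqrt(n * p * (1 - p)))
print(exact, approx)
```

The two printed values agree to about two decimal places at these parameters, and the gap narrows as n grows.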
Exercise 5.22. Let (Xn)n≥1 be a sequence of i.i.d. Poisson(λ) RVs. Let Sn = X1 + · · · + Xn. (i) Let Zn = (Sn − nλ)/√(nλ). Show that as n → ∞, Zn converges to the standard normal RV Z ~ N(0, 1) in distribution. (ii) Conclude that if Yn ~ Poisson(nλ), then (Yn − nλ)/√(nλ) converges to Z in distribution. (iii) From (ii), deduce that we have the following approximation P(Yn ≤ x) ≈ Φ((x − nλ)/√(nλ)), which becomes more accurate as n → ∞.
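As in the binomial case, the part (iii) approximation can be checked against the exact Poisson CDF; n, λ, and x below are arbitrary test inputs:

```python
import math
from statistics import NormalDist

# Compare the exact Poisson(n*lam) CDF with the normal approximation
# Phi((x - n*lam) / sqrt(n*lam)).
def poisson_cdf(mean, x):
    term = math.exp(-mean)  # P(Y = 0); later terms computed iteratively
    total = term
    for k in range(1, int(x) + 1):
        term *= mean / k    # avoids overflow from computing mean**k / k! directly
        total += term
    return total

n, lam, x = 100, 2.0, 210
mean = n * lam  # Y_n ~ Poisson(n*lam)
exact = poisson_cdf(mean, x)
approx = NormalDist().cdf((x - mean) / math.sqrt(mean))
print(exact, approx)
```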
For any sequence of RVs {Xn} with Sn = X1 + · · · + Xn, show that max1≤k≤n |Xk| → 0 in pr. ⟹ n⁻¹Sn → 0 in pr.
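The deterministic bound behind this implication is |Sn|/n ≤ (1/n)∑|Xk| ≤ max1≤k≤n |Xk|, which reduces the claim to the behaviour of the running maximum. A small numerical illustration of that bound:

```python
import numpy as np

rng = np.random.default_rng(1)

# Check the key inequality |S_n|/n <= max_{1<=k<=n} |X_k| pathwise
# on a simulated sequence.
x = rng.standard_normal(1000)
running_max = np.maximum.accumulate(np.abs(x))
s = np.cumsum(x)
n = np.arange(1, x.size + 1)
ok = bool(np.all(np.abs(s) / n <= running_max + 1e-12))
print(ok)
```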
Let {Xn} be a sequence of RVs with common finite variance σ². Suppose that the correlation coefficient between Xi and Xj is ρij < 0 for all i ≠ j. Show that the WLLN holds for the sequence {Xn}.
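The mechanism here is that negative cross-correlations can only reduce Var(Sn/n) = σ²/n + (2/n²)∑i<j Cov(Xi, Xj) below the uncorrelated value σ²/n, so Chebyshev's inequality still gives the WLLN. A sketch with an exchangeable Gaussian vector (the sample size and correlation value are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(2)
n, sigma2 = 400, 1.0
rho = -1.0 / (2 * (n - 1))  # a mildly negative common correlation, kept PSD-safe

# Exchangeable Gaussian vector with negative off-diagonal correlation.
cov = sigma2 * ((1 - rho) * np.eye(n) + rho * np.ones((n, n)))
samples = rng.multivariate_normal(np.zeros(n), cov, size=2000)
means = samples.mean(axis=1)

# Var(S_n/n) = (sigma2/n) * (1 + (n-1)*rho) <= sigma2/n when rho <= 0,
# so the sample mean concentrates at least as fast as in the i.i.d. case.
print(float(means.var()), sigma2 / n)
```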
Let Z1, Z2, . . . be a sequence of independent standard normal random variables. Define X0 = 0 and Xn+1 = (nXn + Zn+1)/(n + 1), n = 0, 1, 2, . . . . The stochastic process {Xn, n = 0, 1, 2, . . .} is a Markov chain, but with a continuous state space. (a) Find E(Xn) and Var(Xn). (b) Give the probability distribution of Xn. (c) Find limn→∞ P(Xn > ε) for any ε > 0.
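Unrolling the recursion shows Xn = (Z1 + · · · + Zn)/n, the running sample mean, which is the key to all three parts (Xn ~ N(0, 1/n)). A quick check that the recursion and the sample mean coincide:

```python
import numpy as np

rng = np.random.default_rng(3)

# X_{n+1} = (n X_n + Z_{n+1})/(n + 1) with X_0 = 0 unrolls to the running
# sample mean (Z_1 + ... + Z_n)/n, hence X_n ~ N(0, 1/n).
def run_chain(z):
    x = 0.0
    for n, z_next in enumerate(z):  # n = 0, 1, 2, ...
        x = (n * x + z_next) / (n + 1)
    return x

z = rng.standard_normal(1000)
print(run_chain(z), z.mean())  # the two agree up to rounding error
```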
2. Let Xn, n ≥ 1, be a sequence of independent r.v.'s, and let φn(t) = E(e^{itXn}), t ∈ ℝ, be their characteristic functions. Let Yn = ∑_{k=0}^{n} Xk, n ≥ 0, with X0 = 0, and gn(t) = ∏_{k=1}^{n} φk(t), t ∈ ℝ. (a) Let t be such that ∏_{k=1}^{n} |φk(t)| > 0 for all n. Show that Mn = exp{itYn}/gn(t), n ≥ 0, is a martingale with respect to Fn = σ(X0, . . . , Xn), n ≥ 0, and supn E(|Mn|²)...
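A martingale has constant expectation, so E[Mn] = E[M0] = 1 for every n; this consequence is easy to check by Monte Carlo. The sketch below assumes i.i.d. standard normal Xk, for which φk(t) = exp(−t²/2), with illustrative values of t and n:

```python
import numpy as np

rng = np.random.default_rng(4)
t, n, reps = 0.3, 10, 200_000

# For i.i.d. standard normal X_k, phi_k(t) = exp(-t^2/2).  Check that the
# (complex-valued) M_n = exp(i t Y_n) / g_n(t) has mean 1 at the final step.
x = rng.standard_normal((reps, n))
y = x.cumsum(axis=1)                      # partial sums Y_1, ..., Y_n per row
phi = np.exp(-t**2 / 2.0)
m = np.exp(1j * t * y) / phi ** np.arange(1, n + 1)
est = m[:, -1].mean()
print(est)  # close to 1 + 0j
```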
Let X1, . . . , Xn be a random sample from a Gamma(α, β) distribution, α > 0, β > 0. Show that T = (∑_{i=1}^{n} Xi, ∏_{i=1}^{n} Xi) is complete and sufficient for (α, β).
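The sufficiency half can be made concrete: the Gamma likelihood depends on the data only through T, so two different datasets sharing the same (sum, product) give identical likelihoods at every (α, β). The sketch below assumes the rate parameterisation of the density, βᵅ/Γ(α) · x^{α−1} e^{−βx}, and uses hand-picked datasets with matching T:

```python
import math

# Gamma(alpha, beta) log-likelihood (rate parameterisation, an assumption here);
# it depends on the data only through sum(x) and sum(log x) = log(prod x).
def gamma_loglik(x, alpha, beta):
    n = len(x)
    return (n * (alpha * math.log(beta) - math.lgamma(alpha))
            + (alpha - 1) * sum(math.log(v) for v in x)
            - beta * sum(x))

x1 = [1.0, 2.0, 3.0]                      # sum 6, product 6
r = math.sqrt(4.25)
x2 = [1.5, (4.5 + r) / 2, (4.5 - r) / 2]  # also sum 6, product 6
ll1 = gamma_loglik(x1, 2.0, 1.0)
ll2 = gamma_loglik(x2, 2.0, 1.0)
print(ll1, ll2)  # identical up to floating-point error
```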
Consider a sequence of random variables X1, . . . , Xn, . . . where for each n, Xn ∼ t distribution with n degrees of freedom. Apply Slutsky's Theorem to show that as the degrees of freedom go to infinity, the distribution converges to a standard normal. (a) Let V1, . . . , Vn, . . . be such that Vn ∼ chi-squared with n df. Find the value b such that Vn/n → b in probability. (b) Letting U ∼ N(0, 1),...
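The representation behind the exercise is Tn = U/√(Vn/n) with U ~ N(0, 1) and Vn ~ χ²ₙ independent, which is t-distributed with n df; Slutsky then applies once Vn/n converges in probability. A Monte Carlo comparison of the empirical CDF of Tn against the standard normal (n and the evaluation points are arbitrary):

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(5)
n, reps = 100, 100_000

# T_n = U / sqrt(V_n / n) with U ~ N(0,1), V_n ~ chi^2_n independent.
u = rng.standard_normal(reps)
v = rng.chisquare(n, size=reps)
t = u / np.sqrt(v / n)
for q in (-1.0, 0.0, 1.0):
    print(q, float(np.mean(t <= q)), NormalDist().cdf(q))
```

At 100 degrees of freedom the empirical and normal CDF values already agree to about two decimal places.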
(Stochastic process and probability theory) Let Xn, n ≥ 1, denote a sequence of independent random variables with E(Xn) = μ. Consider the sequence of random variables μ̂n = (1/(n(n − 1))) ∑_{i ≠ j} Xi Xj, which is an unbiased estimator of μ². (a) Does μ̂n → μ² in probability? (b) Does μ̂n → μ² almost surely? (c) Does μ̂n → μ² in mean square? (d) Does the estimator μ̂n follow a normal distribution as n → ∞?
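Reading the estimator as the U-statistic above, it can be computed cheaply via ∑_{i ≠ j} XiXj = Sn² − ∑Xi², and its concentration at μ² observed by simulation. The normal data and mean below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(6)
mu = 1.5  # hypothetical common mean, chosen for illustration

# mu_hat_n = (1/(n(n-1))) * sum_{i != j} X_i X_j = (S_n^2 - sum X_i^2)/(n(n-1));
# independence gives E[X_i X_j] = mu^2 for i != j, hence unbiasedness for mu^2.
def mu_hat(x):
    s = x.sum()
    return float((s * s - (x * x).sum()) / (x.size * (x.size - 1)))

est = {n: mu_hat(rng.normal(mu, 1.0, size=n)) for n in (10, 100, 10_000)}
print(est)  # values drift toward mu^2 = 2.25 as n grows
```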