We have S_N = X_1 + · · · + X_N, where N ∈ Po(λ) and X_1, X_2, . . . are i.i.d. with mean 0 and variance σ².
Now, using conditional expectation,
E(S_N) = E[E(S_N | N)] = E[N · E(X)] = 0.
And, using the conditional variance formula,
Var(S_N) = E[Var(S_N | N)] + Var[E(S_N | N)] = E(N) · σ² + 0 = λσ².
Thus the random variable
Z_λ = S_N / (σ√λ)
has mean 0 and variance 1.
According to the Central Limit Theorem, S_n/(σ√n) → N(0, 1) in distribution as n → ∞; since N/λ → 1 in probability as λ → ∞, this is equivalent to S_N/(σ√λ) → N(0, 1). Hence Z_λ is asymptotically standard normal.
The proof is complete.
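The limiting behavior above can be spot-checked numerically. A minimal Monte Carlo sketch, under illustrative assumptions (N ∈ Po(λ) with λ = 400 and summands X_i ~ Uniform(−1, 1), so mean 0 and σ² = 1/3): the standardized random sum S_N/(σ√λ) should have mean close to 0 and variance close to 1.

```python
import numpy as np

# Monte Carlo check that S_N / (sigma * sqrt(lambda)) has mean ~0 and
# variance ~1 when N ~ Po(lambda) and the X_i are i.i.d. with mean 0.
# X_i ~ Uniform(-1, 1) (sigma^2 = 1/3) and lambda = 400 are illustrative choices.
rng = np.random.default_rng(0)
lam, sigma = 400.0, np.sqrt(1.0 / 3.0)
trials = 20_000

N = rng.poisson(lam, size=trials)
S = np.array([rng.uniform(-1.0, 1.0, size=n).sum() for n in N])
Z = S / (sigma * np.sqrt(lam))

print(round(Z.mean(), 2), round(Z.var(), 2))
```

The loop draws a fresh Poisson count for each trial, so the randomness of N itself is part of the experiment, as the conditional variance formula requires.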
49. Suppose that N ∈ Po(λ) independent observations of a random variable, X, with mean 0...
3. Suppose X1, X2, . . . are independent identically distributed random variables with mean μ and variance σ². Let S0 = 0 and for n > 0 let Sn denote the partial sum Sn = X1 + · · · + Xn. Let Fn denote the information contained in X1, . . . , Xn. (1) Verify that Sn − nμ is a martingale. (2) Assuming that μ = 0, verify that Sn² − nσ² is a martingale.
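Part (1) can be sanity-checked numerically. A sketch under illustrative assumptions (Xi ~ Normal(μ, 1) with μ = 2): fixing one realized prefix X1, . . . , Xn, the conditional expectation E[S_{n+1} − (n+1)μ | Fn], estimated by averaging over fresh draws of X_{n+1}, should equal Sn − nμ.

```python
import numpy as np

# Martingale check for M_n = S_n - n*mu: E[M_{n+1} | F_n] should equal M_n.
# We fix one realized prefix X_1..X_n, then average M_{n+1} over many
# independent draws of X_{n+1}. X_i ~ Normal(mu, 1) is an illustrative assumption.
rng = np.random.default_rng(1)
mu, n = 2.0, 10

prefix = rng.normal(mu, 1.0, size=n)       # one fixed realization of X_1..X_n
M_n = prefix.sum() - n * mu                # current martingale value

draws = rng.normal(mu, 1.0, size=200_000)  # fresh copies of X_{n+1}
M_next = (prefix.sum() + draws) - (n + 1) * mu
print(round(M_next.mean() - M_n, 3))       # should be close to 0
```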
Suppose you have a sample of n independent observations X1, X2, . . . , Xn from a normal population with mean μ (known) and variance σ² (unknown). (a) Find the ML estimator of σ². (b) Show that the ML estimator in (a) is a consistent estimator of σ². (c) Find a sufficient statistic for σ². (d) Give a MVUE for σ² based on the sufficient statistic.
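For part (a), maximizing the normal log-likelihood with μ known gives the standard result σ̂² = (1/n) Σ (xi − μ)². The sketch below illustrates the consistency claim in (b) by simulation, with μ = 1 and σ² = 4 as illustrative values.

```python
import numpy as np

# With mu known, the ML estimator of sigma^2 is the average squared
# deviation from mu; by the law of large numbers it converges to sigma^2.
rng = np.random.default_rng(2)
mu, sigma2 = 1.0, 4.0

def sigma2_mle(x, mu):
    return np.mean((x - mu) ** 2)

est = sigma2_mle(rng.normal(mu, np.sqrt(sigma2), size=100_000), mu)
print(round(est, 2))  # should be close to sigma2 = 4
```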
Let X1, . . . , Xn be independent with common density f(x) = 2x · 1[0 < x < 1]. Set Vn = max(X1, . . . , Xn). (a) Verify that Vn → 1 in probability. (b) Show that n(1 − Vn) → W in distribution for some random variable W, and find the distribution function of W.
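For (b): since P(Vn ≤ v) = v^(2n) on [0, 1], we get P(n(1 − Vn) > w) = (1 − w/n)^(2n) → e^(−2w), so W is Exp(2) with distribution function F_W(w) = 1 − e^(−2w) for w ≥ 0. A simulation sketch (n = 500 is an illustrative choice):

```python
import numpy as np

# Simulate W_n = n * (1 - V_n), where V_n is the max of n draws from the
# density 2x on (0, 1). Inversion: F(x) = x^2, so X = sqrt(U) for U uniform.
# Compare with the Exp(2) limit: E[W] = 1/2 and P(W <= 1) = 1 - e^{-2}.
rng = np.random.default_rng(3)
n, trials = 500, 50_000

X = np.sqrt(rng.uniform(size=(trials, n)))
W = n * (1.0 - X.max(axis=1))
print(round(W.mean(), 2), round(np.mean(W <= 1.0), 3))
```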
4. Let X1, X2, . . . be independent random variables satisfying E(Xn⁴) ≤ B for all n, for some finite B > 0. (a) Show that Yn = Xn − E(Xn) are independent and satisfy E(Yn) = 0 and, for all n, E(Yn⁴) ≤ 16B. (b) Show that for Ȳn = (Y1 + · · · + Yn)/n,
E(Ȳn⁴) = (1/n⁴) Σᵢ E(Yi⁴) + (6/n⁴) Σ_{i<j} E(Yi²)E(Yj²) ≤ 16B/n³ + 48B/n².
(c) Show that Σn P(|Ȳn| > ε) < ∞ for every ε > 0 and conclude that Ȳn → 0 almost surely. (d) Show that (X1 + ...
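The fourth-moment expansion in (b) can be spot-checked against an exact case. For Rademacher variables Yi = ±1 with equal probability (an illustrative choice with E(Yi) = 0 and E(Yi²) = E(Yi⁴) = 1), expanding (Y1 + · · · + Yn)⁴ gives E(Ȳn⁴) = (n + 3n(n − 1))/n⁴ exactly, which is O(1/n²):

```python
import numpy as np

# Compare the Monte Carlo estimate of E[Ybar_n^4] with the exact value
# (n + 3n(n-1))/n^4 that follows from expanding (Y_1 + ... + Y_n)^4
# for i.i.d. Rademacher Y_i (only Y_i^4 and Y_i^2 Y_j^2 terms survive).
rng = np.random.default_rng(4)
n, trials = 100, 200_000

Y = rng.choice([-1.0, 1.0], size=(trials, n))
emp = np.mean(Y.mean(axis=1) ** 4)
exact = (n + 3 * n * (n - 1)) / n**4
print(round(emp / exact, 2))  # ratio should be close to 1
```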
A random variable X has probability density function f(x) = (a − 1)x^(−a), for x ≥ 1. (a) For independent observations x1, . . . , xn show that the log-likelihood is given by l(a; x1, . . . , xn) = n log(a − 1) − a Σᵢ log(xi). (b) Hence derive an expression for the maximum likelihood estimate for a. (c) Suppose we observe data such that n = 6 and Σᵢ log(xi) = 12. Show that the associated maximum likelihood estimate is â = 1.5.
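Setting the score n/(a − 1) − Σ log(xi) to zero gives â = 1 + n / Σ log(xi); with n = 6 and Σ log(xi) = 12 this is â = 1 + 6/12 = 1.5. A one-line check:

```python
# MLE from the score equation n/(a - 1) = sum(log x_i)  =>  a_hat = 1 + n/S.
def a_hat(n, sum_log_x):
    return 1.0 + n / sum_log_x

print(a_hat(6, 12.0))  # 1.5
```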
3.1 There is a random variable X with observations {X1, X2, ..., Xn}. It is known that these observations follow the normal distribution with mean μ and variance σ². Which of the following will lead to a standard normal distribution? (a) (X − μ)/σ (b) (X − μ)/σ² (c) (X + μ)/σ² (d) (X + μ)/σ 3.2 In a standard normal distribution, 99.7% of observations lie in the range between ___ and ___. 3.3 A cumulative distribution function of a random variable X is by definition the probability that...
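For 3.1, the standardization is Z = (X − μ)/σ, i.e. choice (a), and for 3.2 the 99.7% range is μ − 3σ to μ + 3σ. A quick simulation sketch, with μ = 5 and σ = 2 as illustrative values:

```python
import numpy as np

# Standardizing a Normal(mu, sigma^2) sample with (X - mu)/sigma should give
# mean ~0, std ~1, and ~99.7% of values within 3 standard deviations.
rng = np.random.default_rng(5)
mu, sigma = 5.0, 2.0

X = rng.normal(mu, sigma, size=200_000)
Z = (X - mu) / sigma
print(round(Z.mean(), 2), round(Z.std(), 2), round(np.mean(np.abs(Z) < 3), 3))
```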
1. Consider a random experiment that has as an outcome the number x. Let the associated random variable be X, with true (population) and unknown probability density function fX(x), mean μX, and variance σX². Assume that n = 2 independent, repeated trials of the random experiment are performed, resulting in the 2-sample of numerical outcomes x1 and x2. Let the estimate μ̂X of the true mean μX be μ̂X = (x1 + x2)/2. Then the random variable associated with the estimate μ̂X is the estimator μ̂X = (X1...
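The sampling properties of this estimator follow directly: E[(X1 + X2)/2] = μX (unbiased) and Var((X1 + X2)/2) = σX²/2. A sketch, taking fX as Normal(3, 4) purely for illustration:

```python
import numpy as np

# For the n = 2 estimator (X1 + X2)/2: mean mu_X, variance sigma_X^2 / 2.
# fX = Normal(mu_X = 3, sigma_X^2 = 4) is an illustrative assumption.
rng = np.random.default_rng(6)
mu_x, sigma_x = 3.0, 2.0
trials = 200_000

pairs = rng.normal(mu_x, sigma_x, size=(trials, 2))
est = pairs.mean(axis=1)
print(round(est.mean(), 2), round(est.var(), 2))  # ~mu_x and ~sigma_x^2 / 2
```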
Suppose the random variable X has probability density function (pdf) fX(x) = c for −1 < x < 1, and 0 otherwise, where c is a constant. (a) Show that c = 1/2; (b) Graph fX(x); (c) Given that all of the moments exist, why are all the odd moments of X zero? (d) What is the median of the distribution of X? (e) Find E(X²) and hence var X; (f) Let X1, ... What is the limiting...
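A symbolic check of (a) and (e), assuming the density is the constant c on (−1, 1): normalization forces c = 1/2, the odd moments vanish by symmetry, and E(X²) = 1/3 = var X since E(X) = 0.

```python
import sympy as sp

# Symbolic moments of the uniform density f(x) = 1/2 on (-1, 1):
# normalization integral, E[X], E[X^2], and var X = E[X^2] - E[X]^2.
x = sp.symbols('x')
f = sp.Rational(1, 2)

total = sp.integrate(f, (x, -1, 1))       # normalization: 1
ex = sp.integrate(x * f, (x, -1, 1))      # odd moment: 0
ex2 = sp.integrate(x**2 * f, (x, -1, 1))  # second moment: 1/3
print(total, ex, ex2 - ex**2)  # 1 0 1/3
```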
4. Suppose X1, . . . , Xn are independent, normally distributed with mean E(Xi) = μi and variance Var(Xi) = σi². Let Zi = (Xi − μi)/σi, so that Z1, . . . , Zn are independent and each has a N(0, 1) distribution. Show that Σᵢ Zi² has a χ² distribution with n degrees of freedom. Hint: use the fact that each Zi² has a χ₁² distribution.
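A simulation sketch of the conclusion: with n = 5 (an illustrative choice), the sum of squares of n independent standard normals should match the χ²₅ moments, mean n and variance 2n.

```python
import numpy as np

# Sum of squares of n independent standard normals ~ chi-square with n
# degrees of freedom: mean n, variance 2n.
rng = np.random.default_rng(7)
n, trials = 5, 200_000

Q = (rng.standard_normal((trials, n)) ** 2).sum(axis=1)
print(round(Q.mean(), 2), round(Q.var(), 2))  # ~n and ~2n
```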