This is a simple problem about convergence of random variables and their probabilities.
We are given the sequence of random variables
Xn ~ N(1/2, 1/n).
Each Xn is normal with mean (expected value) 1/2 and variance 1/n, which shrinks as n grows.
Our strategy will be to examine how the distribution functions Fn of the Xn behave at points close to the mean 1/2, on the left and on the right, and then generalize from that behaviour.
Let F be the distribution function of a point mass at 1/2, i.e. F(t) = 0 for t < 1/2 and F(t) = 1 for t ≥ 1/2.
Let us standardize Xn. Since Xn has standard deviation 1/√n, multiplying by √n gives standard deviation 1 (√n · 1/√n = 1), and centring at the mean gives
Z = √n (Xn − 1/2),
which is a standard normal random variable, Z ~ N(0, 1).
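The standardization can be sanity-checked empirically. A minimal stdlib-only sketch (the sample size n = 50, the number of draws, and the seed are arbitrary choices for illustration): draw from N(1/2, 1/n), apply the transform, and confirm the result looks standard normal.

```python
import math
import random
import statistics

random.seed(0)
n = 50
# Draw X_n ~ N(1/2, 1/n); random.gauss takes the standard deviation 1/sqrt(n).
samples = [random.gauss(0.5, 1 / math.sqrt(n)) for _ in range(100_000)]
# Transform: Z = sqrt(n) * (X_n - 1/2) should be standard normal.
z = [math.sqrt(n) * (x - 0.5) for x in samples]

print(round(statistics.mean(z), 2))   # close to 0
print(round(statistics.stdev(z), 2))  # close to 1
```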
Thus for t < 1/2, by the standard definition of the distribution function,
Fn(t) = P(Xn ≤ t)
or, Fn(t) = P(√n (Xn − 1/2) ≤ √n (t − 1/2))   (subtracting 1/2 and multiplying by √n > 0 preserves the inequality)
or, Fn(t) = P(Z ≤ √n (t − 1/2)).
Now think: what happens to this as n tends to infinity?
For t < 1/2 the factor (t − 1/2) is negative, so √n (t − 1/2) tends to −∞ as n becomes very large, and hence the probability that the standard normal Z falls below it tends to 0.
And hence Fn(t) → 0 = F(t) for every t < 1/2.
Now what does this mean?
It means that, in the limit, the distribution leaves ZERO probability mass anywhere to the left of 1/2 on the number line.
Now let us check the behaviour to the right of 1/2, by the same process.
For t > 1/2,
Fn(t) = P(Z ≤ √n (t − 1/2))
or, Fn(t) → 1 as n → ∞,
since √n (t − 1/2) → +∞ and the standard normal distribution function tends to 1 there.
Hence, combining the left- and right-hand behaviour, we can now say that
Fn(t) → F(t) for every t ≠ 1/2.
(At t = 1/2 itself, Fn(1/2) = 1/2 ≠ F(1/2) = 1, but 1/2 is a discontinuity point of F, and convergence in distribution only requires convergence at continuity points.)
Thinking in terms of Xn, which defines the distribution Fn, this says that
Xn converges in distribution to the constant 1/2.
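The two-sided convergence of the CDFs can be checked numerically. A short stdlib-only sketch (the test points 0.4 and 0.6 and the values of n are arbitrary choices), using the identity Φ(x) = (1 + erf(x/√2))/2 for the standard normal CDF:

```python
import math

def phi(x):
    # Standard normal CDF via the error function (stdlib only).
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2)))

def F_n(t, n):
    # CDF of X_n ~ N(1/2, 1/n): F_n(t) = P(Z <= sqrt(n) * (t - 1/2)).
    return phi(math.sqrt(n) * (t - 0.5))

for n in (10, 100, 10_000):
    # Left of 1/2 the CDF sinks toward 0; right of 1/2 it climbs toward 1.
    print(n, round(F_n(0.4, n), 4), round(F_n(0.6, n), 4))
```

As n grows, F_n(0.4) drops toward 0 and F_n(0.6) rises toward 1, matching the point-mass CDF F.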
Convergence in probability
We will take the help of Markov's inequality (in its Chebyshev form) to prove this.
For any ε > 0, let us check P(|Xn − 1/2| > ε).
By Markov's inequality applied to (Xn − 1/2)²,
P(|Xn − 1/2| > ε) ≤ E[(Xn − 1/2)²] / ε² = Var(Xn) / ε² = 1 / (n ε²).
Now, as n → ∞, the bound 1/(n ε²) → 0.
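Since Xn − 1/2 is exactly N(0, 1/n) here, the tail probability can also be computed exactly as 2Φ(−ε√n) and compared with the Chebyshev bound 1/(nε²). A small stdlib-only sketch (ε = 0.1 and the values of n are arbitrary choices for illustration):

```python
import math

def phi(x):
    # Standard normal CDF via the error function (stdlib only).
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2)))

eps = 0.1  # an arbitrary tolerance
for n in (10, 100, 1000):
    exact = 2 * phi(-eps * math.sqrt(n))  # P(|X_n - 1/2| > eps), exact for a normal
    bound = 1 / (n * eps ** 2)            # Chebyshev bound Var(X_n) / eps^2
    print(n, round(exact, 6), round(bound, 6))
```

Both columns shrink toward 0; the bound is loose (it even exceeds 1 for small n, where it is trivially true), but it is enough to force the probability to 0.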
So what do we infer?
We infer that the probability of any fixed deviation of Xn from 1/2 tends to 0:
P(|Xn − 1/2| > ε) → 0 for every ε > 0.
But this is exactly the definition of convergence in probability:
Xn converges to X in probability,
where X = 1/2 is the constant limit identified above.
Thus Xn → 1/2 in probability.
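The conclusion can also be seen by simulation. A minimal stdlib-only Monte Carlo sketch (the tolerance ε = 0.05, seed, trial count, and values of n are arbitrary choices): estimate P(|Xn − 1/2| > ε) by sampling and watch it shrink with n.

```python
import math
import random

random.seed(1)
eps = 0.05        # arbitrary tolerance
trials = 20_000   # arbitrary number of Monte Carlo draws

est = {}
for n in (10, 100, 10_000):
    sigma = 1 / math.sqrt(n)  # standard deviation of X_n ~ N(1/2, 1/n)
    hits = sum(abs(random.gauss(0.5, sigma) - 0.5) > eps for _ in range(trials))
    est[n] = hits / trials
    print(n, est[n])  # estimated P(|X_n - 1/2| > eps), shrinking with n
```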