8. Let X1, ..., Xn be i.i.d. ~ E(1) random variables (i.e., they are independent and identically distributed, ...
Let X1, X2, ..., Xn be an independent and identically distributed (i.i.d.) random sample from a Beta distribution with parameters α = 2 and β = 1, i.e., with probability density function fX(x) = 2x for x ∈ (0,1). Find the probability density functions of the first and last order statistics, Y1 = min(X1, ..., Xn) and Yn = max(X1, ..., Xn).
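Since F(x) = x² for this Beta(2,1) law, the standard order-statistic formulas give P(Yn ≤ t) = t^(2n) and P(Y1 ≤ t) = 1 − (1 − t²)^n. A minimal Monte Carlo sanity check of those CDFs (the sample size, trial count, and seed below are illustrative choices, not part of the problem):

```python
import random

def simulate_order_stats(n, trials, seed=0):
    """Estimate P(Y1 <= 0.5) and P(Yn <= 0.5) for a sample of size n
    from the density 2x on (0,1). A sanity check, not a proof."""
    rng = random.Random(seed)
    t = 0.5
    c1 = cn = 0
    for _ in range(trials):
        # Inverse-CDF sampling: F(x) = x^2, so X = sqrt(U) with U ~ U(0,1).
        xs = [rng.random() ** 0.5 for _ in range(n)]
        c1 += min(xs) <= t   # event {Y1 <= t}
        cn += max(xs) <= t   # event {Yn <= t}
    return c1 / trials, cn / trials

# Closed forms implied by F(x) = x^2:
#   P(Y1 <= t) = 1 - (1 - t^2)^n,   P(Yn <= t) = t^(2n)
```

The empirical frequencies should match the closed forms to within Monte Carlo error.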
3. (a) (5 points) Let X1, X2, ... be a sequence of independent identically distributed random variables uniform on the interval (0, 1], and let ... Compute the (almost sure) limit of Yn. (b) (5 points) Let X1, X2, ... be independent random variables such that Xn is a discrete random variable uniform on the set {1, 2, ..., n + 1}. Let Yn = min{X1, X2, ..., Xn} be the smallest value among X1, ..., Xn. Show...
(a) Suppose that X1, X2, ... are independent and identically distributed random variables, each taking the value 1 with probability p and the value −1 with probability 1 − p. For n = 1, 2, ..., define Yn = X1 + X2 + ... + Xn. Is {Yn} a Markov chain? If so, write down its state space and transition probability matrix. (b) Let X1, X2, ... be independent and identically distributed random variables taking values in {0, 1, 2, ...} with probabilities pi = P(X1 = i). Let Yn = min(X1, X2, ..., Xn). Is {Yn} a Markov chain? If so, write down its state space and transition probability matrix.
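For part (a), Yn is a simple random walk: from any state y it moves to y + 1 with probability p and to y − 1 with probability 1 − p, independently of the past. A small empirical sketch of that one-step behavior (the value of p, the step count, and the seed are arbitrary choices for illustration):

```python
import random

def walk_transition_freqs(p, steps, seed=3):
    """Run the walk Yn = X1 + ... + Xn with Xi = +1 w.p. p, -1 w.p. 1-p,
    and return the empirical frequency of up-steps, which should be
    close to p regardless of the states visited (Markov property)."""
    rng = random.Random(seed)
    y, ups = 0, 0
    for _ in range(steps):
        step = 1 if rng.random() < p else -1
        ups += step == 1
        y += step
    return ups / steps
```

The up-step frequency approximates p, matching the transition probabilities P(y, y+1) = p and P(y, y−1) = 1 − p.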
Let X1, ..., Xn be i.i.d. [Recall that i.i.d. stands for independent and identically distributed.] Since X1, ..., Xn all have the same distribution, they have the same expected value and variance. Let E(X1) = µ and Var(X1) = σ². Find the following in terms of µ and σ². (a) E(X1²). Note this is not µ²! (b) E((1/n) Σ_{i=1}^n Xi²). (c) Now, define W by W = 1...
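The key identity behind (a) is Var(X) = E(X²) − (E X)², so E(X1²) = σ² + µ². A quick numeric check on a small discrete distribution (the support points and probabilities below are just an example, not from the problem):

```python
# Verify E(X^2) = sigma^2 + mu^2 on a toy discrete distribution.
vals = [0, 1, 3]
probs = [0.2, 0.5, 0.3]
mu = sum(v * p for v, p in zip(vals, probs))                   # E(X)
sigma2 = sum(p * (v - mu) ** 2 for v, p in zip(vals, probs))   # Var(X)
ex2 = sum(p * v * v for v, p in zip(vals, probs))              # E(X^2)
```

By linearity, part (b) then equals σ² + µ² as well, since each Xi² has that same expectation.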
Let X1, X2, X3, ... be a sequence of i.i.d. Uniform(0,1) random variables. Define the sequence Yn as Yn = min(X1, X2, ..., Xn). Prove the following convergence results independently (i.e., do not conclude the weaker convergence modes from the stronger ones): (a) Yn → 0 in distribution; (b) Yn → 0 in probability; (c) Yn → 0 in Lr, for all r ≥ 1; (d) Yn → 0 almost surely.
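The starting point for every mode is the exact tail probability: Yn > ε exactly when all n draws exceed ε, so P(Yn > ε) = (1 − ε)^n, which decays geometrically. A one-line numeric illustration (ε = 0.1 is an arbitrary example):

```python
def tail_prob(eps, n):
    # P(min(X1,...,Xn) > eps) = (1 - eps)^n for iid Uniform(0,1):
    # the minimum exceeds eps only if every single draw does.
    return (1.0 - eps) ** n
```

The geometric decay also makes Σ_n P(Yn > ε) finite, which is the Borel–Cantelli route to the almost-sure convergence in (d).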
(a) Suppose that X1, X2, ... are independent and identically distributed random variables, each taking the value 1 with probability p and the value −1 with probability 1 − p. For n = 1, 2, ..., define Yn = X1 + X2 + ... + Xn. Is {Yn} a Markov chain? If so, write down its state space and transition probability matrix.
(5) Let X1, X2, ..., Xn be independent identically distributed (i.i.d.) random variables from U(0,1). Denote V = max{X1, ..., Xn} and W = min{X1, ..., Xn}. (a) Find the distributions and the densities of each of V and W. (b) Find E(V) and E(W).
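For U(0,1) the densities are fV(v) = n v^(n−1) and fW(w) = n(1 − w)^(n−1), giving E(V) = n/(n+1) and E(W) = 1/(n+1). A hedged Monte Carlo check of those two expectations (sample size, trial count, and seed are illustrative):

```python
import random

def mean_max_min_uniform(n, trials, seed=1):
    """Estimate E(V) and E(W) for V = max, W = min of n iid U(0,1)
    draws; the exact answers are n/(n+1) and 1/(n+1)."""
    rng = random.Random(seed)
    sv = sw = 0.0
    for _ in range(trials):
        xs = [rng.random() for _ in range(n)]
        sv += max(xs)
        sw += min(xs)
    return sv / trials, sw / trials
```

Note the symmetry E(W) = 1 − E(V), which follows from 1 − Xi being U(0,1) as well.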
74. Let X1, X2, ... be a sequence of independent identically distributed continuous random variables. We say that a record occurs at time n if Xn > max(X1, ..., Xn−1). That is, Xn is a record if it is larger than each of X1, ..., Xn−1. Show (i) P{a record occurs at time n} = 1/n; (ii) E[number of records by time n] = Σ_{i=1}^n 1/i; (iii) Var(number of records by time n) = Σ_{i=1}^n (i − 1)/i²; (iv) Let N = ...
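Part (i) follows from exchangeability: each of the n draws is equally likely to be the largest, so the last one is the maximum with probability 1/n. A small simulation sketch of that claim (n, trial count, and seed are arbitrary illustration choices):

```python
import random

def record_probability(n, trials, seed=2):
    """Estimate P(a record occurs at time n) for n iid continuous
    draws, i.e. the chance the last draw is the running maximum;
    by symmetry the exact value is 1/n."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        xs = [rng.random() for _ in range(n)]
        hits += xs[-1] == max(xs)   # ties have probability 0
    return hits / trials
```

Parts (ii) and (iii) then sum the indicator means 1/i and variances (1/i)(1 − 1/i) = (i − 1)/i², using that the record indicators are independent.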
Problem 7. Let X1, X2, ..., Xn be i.i.d. (independent and identically distributed) random variables with unknown mean μ and variance σ². In order to estimate μ and σ² from the data we consider the following estimates ... Show that both these estimates are unbiased. That is, show that E(A) = μ and ...
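The estimator formulas are not legible in the source, but the first is presumably the sample mean, whose unbiasedness E((X1 + ... + Xn)/n) = μ follows from linearity of expectation. A minimal simulation sketch of that fact only (distribution, μ, sample size, and seed are all illustrative assumptions):

```python
import random

def avg_sample_mean(n, trials, mu=2.0, seed=4):
    """Average many sample means of size-n samples with E(Xi) = mu
    (here Xi = mu + U - 0.5, U ~ U(0,1)); the grand average should
    be close to mu, illustrating unbiasedness of the sample mean."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        xs = [mu + (rng.random() - 0.5) for _ in range(n)]
        total += sum(xs) / n
    return total / trials
```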
2. Let X1, X2, ..., Xn denote independent and identically distributed random variables with variance σ². Which of the following is sufficient to conclude that the estimator T = f(X1, ..., Xn) of a parameter θ is consistent (fully justify your answer): (a) Var(T) → 0 as n → ∞. (b) E(T) = (n − 1)θ/n and Var(T) → 0 as n → ∞. (c) E(T) = θ. (d) E(T) = θ and Var(T) = σ².