Let X1, ..., Xn be i.i.d. [Recall that i.i.d. stands for independent and identically distributed.] Since X1, ..., Xn all have the same distribution, they have the same expected value and variance. Let E(X1) = µ and Var(X1) = σ². Find the following in terms of µ and σ².

(a) E(X1²). Note this is not µ²!

(b) E((1/n) Σ_{i=1}^{n} Xi²).

(c) Now, define W by

    W = (1/3)(X1 + X2 + X3) − X4.

Find E(W) and Var(W).
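A quick Monte Carlo sketch can sanity-check the answers to (a) and (c). The problem fixes only µ and σ², not a distribution, so the normal choice below is an illustration assumption; the quantities being checked depend only on µ and σ².

```python
import numpy as np

# Monte Carlo sketch of the identities above, assuming (for illustration only)
# X_i ~ Normal(mu, sigma^2); the checked values depend only on mu and sigma^2.
rng = np.random.default_rng(0)
mu, sigma, n_sim = 2.0, 3.0, 1_000_000

# (a) E(X1^2) = Var(X1) + E(X1)^2 = sigma^2 + mu^2, not mu^2.
x1 = rng.normal(mu, sigma, n_sim)
print(np.mean(x1**2))            # close to sigma**2 + mu**2 = 13

# (c) W = (1/3)(X1 + X2 + X3) - X4.
x = rng.normal(mu, sigma, (n_sim, 4))
w = x[:, :3].mean(axis=1) - x[:, 3]
print(w.mean())                  # E(W) = (1/3)(3*mu) - mu = 0
print(w.var())                   # Var(W) = sigma^2/3 + sigma^2 = 4*sigma^2/3 = 12
```

Note that Var(W) picks up (1/3)²·3σ² = σ²/3 from the averaged terms plus σ² from the independent X4, by independence.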
(5) Let X1, X2, ..., Xn be independent identically distributed (i.i.d.) random variables from U(0,1). Denote V = max{X1, ..., Xn} and W = min{X1, ..., Xn}. (a) Find the distributions and the densities of each of V and W. (b) Find E(V) and E(W).
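The closed forms for part (b) can be checked by simulation. For U(0,1), F_V(v) = vⁿ gives E(V) = n/(n+1), and F_W(w) = 1 − (1−w)ⁿ gives E(W) = 1/(n+1); the sketch below (with n = 5 as an arbitrary choice) confirms both.

```python
import numpy as np

# Monte Carlo check of the U(0,1) order-statistic means:
# E(V) = E(max) = n/(n+1) and E(W) = E(min) = 1/(n+1).
rng = np.random.default_rng(1)
n, n_sim = 5, 500_000

u = rng.uniform(0, 1, (n_sim, n))
v, w = u.max(axis=1), u.min(axis=1)

print(v.mean())  # close to n/(n+1) = 5/6
print(w.mean())  # close to 1/(n+1) = 1/6
```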
Suppose X1, X2, ..., Xn (n ≥ 5) are i.i.d. Exp(µ) with the density f(x) = (1/µ) e^{−x/µ} for x > 0. (a) Let µ̂1 = X̄. Show µ̂1 is a minimum variance unbiased estimator. (b) Let µ̂2 = (X1 + X2)/2. Show µ̂2 is unbiased. Calculate Var(µ̂2). Confirm Var(µ̂1) < Var(µ̂2). Calculate the efficiency of µ̂2 relative to µ̂1. (c) Show X̄ is consistent and sufficient. (d) Show µ̂2 is not consistent.
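The variance comparison in (b) can be sketched numerically. In this mean-µ parameterization Var(Xi) = µ², so Var(µ̂1) = µ²/n while Var(µ̂2) = µ²/2, giving relative efficiency 2/n; the values of µ and n below are arbitrary choices for illustration.

```python
import numpy as np

# Monte Carlo sketch comparing the two estimators of mu for Exp(mu):
# Var(mu1_hat) = mu^2/n versus Var(mu2_hat) = mu^2/2, so X-bar wins for n > 2.
rng = np.random.default_rng(2)
mu, n, n_rep = 4.0, 10, 200_000

x = rng.exponential(mu, (n_rep, n))
mu1_hat = x.mean(axis=1)           # X-bar
mu2_hat = (x[:, 0] + x[:, 1]) / 2  # (X1 + X2)/2

print(mu1_hat.var())   # close to mu**2 / n = 1.6
print(mu2_hat.var())   # close to mu**2 / 2 = 8.0
print(mu2_hat.mean())  # both estimators are unbiased: close to mu = 4
```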
Let X1, X2, ..., Xn be an independent and identically distributed (i.i.d.) random sample from a Beta distribution with parameters α = 2 and β = 1, i.e., with probability density function fX(x) = 2x for x ∈ (0,1). Find the probability density functions of the first and last order statistics Y1 and Yn.
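A simulation can corroborate the order-statistic densities. Since F_X(x) = x², the maximum Yn has CDF x^{2n}, hence density f_{Yn}(y) = 2n·y^{2n−1} and E(Yn) = 2n/(2n+1); similarly f_{Y1}(y) = 2ny(1−y²)^{n−1}. The choice n = 6 below is an arbitrary illustration.

```python
import numpy as np

# Monte Carlo check for the Beta(2,1) order statistics. With F(x) = x^2,
# X can be sampled by inverse CDF as sqrt(U), U ~ U(0,1), and the maximum
# Y_n has density 2n*y^(2n-1), so E(Y_n) = 2n/(2n+1).
rng = np.random.default_rng(3)
n, n_sim = 6, 500_000

x = np.sqrt(rng.uniform(0, 1, (n_sim, n)))  # inverse-CDF sampling of f(x) = 2x
y1, yn = x.min(axis=1), x.max(axis=1)

print(yn.mean())  # close to 2n/(2n+1) = 12/13
```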
Problem 7. Let X1, X2, ..., Xn be i.i.d. (independent and identically distributed) random variables with unknown mean µ and variance σ². In order to estimate µ and σ² from the data we consider the estimates µ̂ = X̄ = (1/n) Σ_{i=1}^{n} Xi and σ̂² = S² = (1/(n−1)) Σ_{i=1}^{n} (Xi − X̄)². Show that both of these estimates are unbiased; that is, show that E(µ̂) = µ and E(σ̂²) = σ².
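Assuming the two estimates in question are the usual sample mean and the (n−1)-denominator sample variance, their unbiasedness can be illustrated by averaging each estimate over many simulated samples. The normal distribution below is an illustration assumption; unbiasedness holds for any distribution with finite variance.

```python
import numpy as np

# Monte Carlo sketch of unbiasedness: averaged over many samples, the sample
# mean tracks mu and the (n-1)-denominator sample variance tracks sigma^2.
rng = np.random.default_rng(4)
mu, sigma, n, n_rep = 1.5, 2.0, 8, 300_000

x = rng.normal(mu, sigma, (n_rep, n))
mu_hat = x.mean(axis=1)
s2_hat = x.var(axis=1, ddof=1)  # divide by n-1, not n

print(mu_hat.mean())  # close to mu = 1.5
print(s2_hat.mean())  # close to sigma^2 = 4.0
```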
Let X1, ..., Xn be a random sample (i.i.d.) from a normal distribution with parameters µ and σ². (a) Find the maximum likelihood estimates of µ and σ². (b) Compare your MLEs of µ and σ² with the sample mean and sample variance. Are they the same?
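The comparison in (b) comes down to the denominator: the MLE of µ is X̄, but the MLE of σ² divides by n, so it differs from the sample variance S² (which divides by n−1) by the deterministic factor (n−1)/n. A short sketch of that identity on one simulated sample:

```python
import numpy as np

# The normal MLE of sigma^2 divides by n; the sample variance divides by n-1.
# Their ratio is exactly (n-1)/n for any data set.
rng = np.random.default_rng(5)
x = rng.normal(10.0, 2.0, 25)

mu_mle     = x.mean()                    # MLE of mu = X-bar = sample mean
sigma2_mle = np.mean((x - mu_mle) ** 2)  # divide by n: the MLE
s2         = x.var(ddof=1)               # divide by n-1: sample variance

print(np.isclose(sigma2_mle, (len(x) - 1) / len(x) * s2))  # True
```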
Readings: Review the 5 properties of expected value and variance.
X1, X2, ..., Xn i.i.d. ∼ N(µ, σ²). Assume µ is known; show that θ̂ = (1/n) Σ_{i=1}^{n} (Xi − µ)² is the MLE for σ² and show that it is unbiased. (Exercise 6.4-2.)
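The unbiasedness claim is easy to check numerically: each (Xi − µ)² has mean σ² when µ is the true known mean, so θ̂ averages to σ². The parameter values below are arbitrary illustration choices.

```python
import numpy as np

# Monte Carlo sketch: with mu known, theta_hat = (1/n) * sum((X_i - mu)^2)
# is unbiased for sigma^2, since each (X_i - mu)^2 has expectation sigma^2.
rng = np.random.default_rng(6)
mu, sigma, n, n_rep = 3.0, 1.5, 10, 300_000

x = rng.normal(mu, sigma, (n_rep, n))
theta_hat = np.mean((x - mu) ** 2, axis=1)

print(theta_hat.mean())  # close to sigma^2 = 2.25
```

Note the contrast with the unknown-µ case: replacing µ by X̄ would reintroduce the familiar n/(n−1) bias factor.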
3. Let {X1, X2, X3, X4} be independent, identically distributed random variables with p.d.f. f(x) = 2x if 0 < x < 1, and 0 otherwise. Find E[Y] where Y = min{X1, X2, X3, X4}.
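Reading the garbled density as f(x) = 2x on (0,1) (an assumption, but consistent with it being a p.d.f.), we get F(x) = x², so P(Y > y) = (1 − y²)⁴ and E[Y] = ∫₀¹ (1 − y²)⁴ dy = 1 − 4/3 + 6/5 − 4/7 + 1/9 = 128/315. A Monte Carlo check under that assumption:

```python
import numpy as np

# Assuming f(x) = 2x on (0,1): F(x) = x^2, so X = sqrt(U) with U ~ U(0,1),
# and the minimum of four i.i.d. draws has E[Y] = 128/315.
rng = np.random.default_rng(7)
n_sim = 500_000

x = np.sqrt(rng.uniform(0, 1, (n_sim, 4)))
y = x.min(axis=1)

print(y.mean())  # close to 128/315 ≈ 0.4063
```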
Let X1, X2, ..., Xn denote independent and identically distributed random variables with mean µ and variance σ². State whether each of the following statements is true or false, fully justifying your answer. (a) T = (n/(n−1))X̄ is a consistent estimator of µ. (b) T = is a consistent estimator of µ (assuming n7). (c) T = is an unbiased estimator of µ. (d) T = X1X2 is an unbiased estimator of µ².
Suppose we have 5 independent and identically distributed random variables X1, X2, X3, X4, X5, each with the same moment generating function. Let the random variable Y be defined as Y = Σ_{i=1}^{5} Xi. Find the joint probability that all Xi (i = 1, ..., 5) are larger than 9.