8.7-11. Let Y1, Y2, ..., Yn be n independent random variables with normal distributions N(βxi, σ²), where x1, x2, ..., xn are...
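The statement above is cut off, but for this model (regression through the origin with constant variance) the usual task is to find the MLEs of β and σ². A minimal sketch, assuming that task and illustrative parameter values, checks the standard closed forms β̂ = Σ xiYi / Σ xi² and σ̂² = (1/n) Σ (Yi − β̂xi)² on simulated data:

```python
import random

random.seed(1)

beta_true, sigma_true = 2.5, 0.7                        # illustrative values
x = [0.5 + 1.5 * random.random() for _ in range(50_000)]  # known constants x_i
y = [random.gauss(beta_true * xi, sigma_true) for xi in x]  # Y_i ~ N(beta*x_i, sigma^2)

# Setting the score equations of the log-likelihood to zero gives
#   beta_hat   = sum(x_i y_i) / sum(x_i^2)
#   sigma2_hat = (1/n) * sum((y_i - beta_hat * x_i)^2)
beta_hat = sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)
sigma2_hat = sum((yi - beta_hat * xi) ** 2 for xi, yi in zip(x, y)) / len(x)

print(beta_hat, sigma2_hat)
```

With 50,000 observations both estimates should land close to the true β = 2.5 and σ² = 0.49.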
Let the independent normal random variables Y1, Y2, ..., Yn have the respective distributions N(μ, γ²xi²), i = 1, 2, ..., n, where x1, x2, ..., xn are known but not all the same and no one of which is equal to zero. Find the maximum likelihood estimators for μ and γ².
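For this model the score equations yield a weighted mean for μ̂ and a weighted average of squared residuals for γ̂². A sketch with illustrative parameters, checking the closed forms against simulation:

```python
import random

random.seed(2)

mu_true, gamma_true = 3.0, 0.8                           # illustrative values
x = [0.5 + 1.5 * random.random() for _ in range(50_000)]  # known nonzero constants
y = [random.gauss(mu_true, gamma_true * xi) for xi in x]  # Var(Y_i) = gamma^2 x_i^2

# Setting the score equations of the log-likelihood to zero gives
#   mu_hat     = sum(y_i / x_i^2) / sum(1 / x_i^2)   (a weighted mean), and
#   gamma2_hat = (1/n) * sum((y_i - mu_hat)^2 / x_i^2).
w = [1 / (xi * xi) for xi in x]
mu_hat = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
gamma2_hat = sum(wi * (yi - mu_hat) ** 2 for wi, yi in zip(w, y)) / len(y)

print(mu_hat, gamma2_hat)
```

Observations with small |xi| get large weight in μ̂, exactly as the heteroscedastic variance γ²xi² suggests.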
Let Y1, Y2, ..., Yn be independent random variables, each exponentially distributed with mean β. Let Y(n) = max(Y1, Y2, ..., Yn) and Y(1) = min(Y1, Y2, ..., Yn). Find the probability P(Y(1) > y1, Y(n) < yn).
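The event {Y(1) > y1, Y(n) < yn} says every Yi lies in (y1, yn), so by independence the answer is (e^(−y1/β) − e^(−yn/β))^n for 0 ≤ y1 < yn. A quick Monte Carlo check with illustrative values:

```python
import math, random

random.seed(3)

beta, n = 2.0, 5          # illustrative parameters
y1, yn = 0.5, 4.0

# All n variables in (y1, yn):  P = (P(y1 < Y < yn))^n
closed_form = (math.exp(-y1 / beta) - math.exp(-yn / beta)) ** n

trials = 200_000
hits = 0
for _ in range(trials):
    ys = [random.expovariate(1 / beta) for _ in range(n)]  # mean beta
    if min(ys) > y1 and max(ys) < yn:
        hits += 1
mc = hits / trials
print(closed_form, mc)
```

The simulated frequency should agree with the closed form to Monte Carlo accuracy.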
4. Let X1, X2, ... be independent random variables satisfying E(Xn⁴) ≤ B for some finite B > 0.
(a) Show that the Yn = Xn − E(Xn) are independent and that E(Yn) = 0 and E(Yn⁴) ≤ 16B for all n.
(b) Show that for Ȳn = (Y1 + ... + Yn)/n,
E(Ȳn⁴) = (1/n⁴) Σ_{i=1}^n E(Yi⁴) + (6/n⁴) Σ_{i<j} E(Yi²)E(Yj²) ≤ 16B/n³ + 48B/n².
(c) Show that Σ_{n≥1} P(|Ȳn| > ε) < ∞ for every ε > 0 and conclude that Ȳn → 0 almost surely.
(d) Show that (X1 + ...
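The key step in (b) is that E(Ȳn⁴) decays like 1/n², which makes the probabilities in (c) summable (via P(|Ȳn| > ε) ≤ E(Ȳn⁴)/ε⁴) and Borel–Cantelli applicable. A sketch, using an illustrative bounded distribution, that estimates E(Ȳn⁴) and checks the 1/n² decay: quadrupling n should shrink the fourth moment by roughly 16.

```python
import random

random.seed(4)

# X_i ~ Uniform(-1, 1): mean 0, E(X_i^4) = 1/5, so B = 1 works (illustrative).
def ybar4_mean(n, reps=4000):
    """Monte Carlo estimate of E(Ybar_n^4)."""
    total = 0.0
    for _ in range(reps):
        s = sum(random.uniform(-1, 1) for _ in range(n))
        total += (s / n) ** 4
    return total / reps

m50, m200 = ybar4_mean(50), ybar4_mean(200)
# E(Ybar_n^4) = O(1/n^2), so m50 / m200 should be near (200/50)^2 = 16.
print(m50, m200, m50 / m200)
```

This is only a numerical illustration of the moment bound, not a substitute for the proof the exercise asks for.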
3. Let Y1, ..., Yn be independent normally distributed random variables with E(Yi) = βai and V(Yi) = 1. Recall that the normal density with mean μ and variance σ² is given by f(y) = (1/√(2πσ²)) exp(−(y − μ)²/(2σ²)). (a) Find the maximum likelihood estimator β̂ of β. (b) Show that β̂ is unbiased. (c) Determine the distribution of β̂. (d) Recall that the likelihood ratio test of H0: θ = θ0 against H1: θ ≠ θ0 is to reject H0 if L(θ0)...
Let X1, X2, ..., Xn be independent Normal(μ, σ²) random variables. Let Yn = (1/n) Σ_{i=1}^n Xi denote a sequence of random variables. (a) Find E(Yn) and Var(Yn) for all n in terms of μ and σ². (b) Find the PDF of Yn for all n. (c) Find the MGF of Yn for all n.
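Since Yn is a linear combination of independent normals, Yn ~ N(μ, σ²/n) with MGF M(t) = exp(μt + σ²t²/(2n)). A sketch with illustrative parameters that checks the mean, variance, and the MGF at one point t:

```python
import math, random, statistics

random.seed(6)

mu, sigma, n = 1.5, 2.0, 25     # illustrative parameters
reps = 40_000

# Ybar_n ~ N(mu, sigma^2 / n), so its MGF is exp(mu*t + sigma^2 t^2 / (2n)).
ybars = [statistics.fmean(random.gauss(mu, sigma) for _ in range(n))
         for _ in range(reps)]

t = 0.5
emp_mgf = statistics.fmean(math.exp(t * y) for y in ybars)
theory = math.exp(mu * t + sigma * sigma * t * t / (2 * n))
print(statistics.fmean(ybars), statistics.pvariance(ybars), emp_mgf, theory)
```

The empirical mean should be near μ = 1.5, the variance near σ²/n = 0.16, and the empirical MGF near the closed form.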
Consider a random sample (X1, Y1), (X2, Y2), ..., (Xn, Yn) where Y | X = x is modeled by a N(β0 + β1x, σ²) distribution, where β0, β1 and σ² are unknown. (a) Prove that the MLE of β1 is an unbiased estimator of β1. (b) Prove that the MLE of β0 is an unbiased estimator of β0.
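Under normal errors the MLEs coincide with the least-squares estimators β̂1 = Sxy/Sxx and β̂0 = Ȳ − β̂1X̄. A sketch with an illustrative fixed design that checks unbiasedness by replication:

```python
import random, statistics

random.seed(7)

x = list(range(1, 11))                     # fixed design points (illustrative)
b0, b1, sigma = 2.0, 0.5, 1.0
xbar = statistics.fmean(x)
sxx = sum((xi - xbar) ** 2 for xi in x)

# MLEs under normal errors: b1_hat = Sxy / Sxx, b0_hat = Ybar - b1_hat * xbar.
b0_hats, b1_hats = [], []
for _ in range(20_000):
    y = [random.gauss(b0 + b1 * xi, sigma) for xi in x]
    ybar = statistics.fmean(y)
    b1_hat = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
    b1_hats.append(b1_hat)
    b0_hats.append(ybar - b1_hat * xbar)

print(statistics.fmean(b1_hats), statistics.fmean(b0_hats))
```

The averages of the replicated estimates should land on the true β1 = 0.5 and β0 = 2.0, consistent with unbiasedness.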
Let Y1, Y2, ..., Yn be independent random variables each having uniform distribution on the interval (0, θ). (c) Find var(Y(j) − Y(i)).
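For i < j, Y(j) − Y(i) has the same law as θ(U(j) − U(i)) with the U's uniform on (0, 1), and U(j) − U(i) ~ Beta(j − i, n − j + i + 1), giving var = θ²(j − i)(n − j + i + 1)/((n + 1)²(n + 2)). A Monte Carlo check with illustrative choices of θ, n, i, j:

```python
import random

random.seed(8)

theta, n, i, j = 2.0, 10, 3, 7    # illustrative choices, 1 <= i < j <= n

# U_(j) - U_(i) ~ Beta(j - i, n - j + i + 1), scaled by theta, so
#   var(Y_(j) - Y_(i)) = theta^2 (j-i)(n-j+i+1) / ((n+1)^2 (n+2)).
a, b = j - i, n - j + i + 1
theory = theta * theta * a * b / ((n + 1) ** 2 * (n + 2))

reps = 100_000
diffs = []
for _ in range(reps):
    ys = sorted(random.uniform(0, theta) for _ in range(n))
    diffs.append(ys[j - 1] - ys[i - 1])   # order statistics (1-indexed)
m = sum(diffs) / reps
var_hat = sum((d - m) ** 2 for d in diffs) / reps
print(var_hat, theory)
```

The sample variance of the simulated differences should match the Beta-based closed form.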
Suppose that X1, X2, ..., Xn and Y1, Y2, ..., Yn are independent random samples from populations with the same mean μ and variances σ1² and σ2², respectively. That is, Xi ~ N(μ, σ1²) and Yi ~ N(μ, σ2²). Show that (2X̄ + 3Ȳ)/5 is a consistent estimator of μ.
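A sketch illustrating the consistency argument; it assumes the estimator is (2X̄n + 3Ȳn)/5, where the denominator 5 (apparently lost in extraction) is the natural choice since the weights must sum to 1 for the estimator to target μ. Its variance (4σ1² + 9σ2²)/(25n) → 0, so Chebyshev gives convergence in probability to μ:

```python
import random, statistics

random.seed(9)

mu, s1, s2 = 4.0, 1.0, 2.0      # illustrative parameters

def estimate(n):
    """(2*Xbar + 3*Ybar)/5 from samples of size n (weights sum to 1)."""
    xbar = statistics.fmean(random.gauss(mu, s1) for _ in range(n))
    ybar = statistics.fmean(random.gauss(mu, s2) for _ in range(n))
    return (2 * xbar + 3 * ybar) / 5

# Var = (4*s1^2 + 9*s2^2) / (25 n) -> 0, so the estimator concentrates at mu.
small, large = estimate(100), estimate(100_000)
print(abs(small - mu), abs(large - mu))
```

With n = 100,000 the estimate should sit within a few thousandths of μ.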
Let X1, X2, ..., Xn be independent Exp(2)-distributed random variables, and set Y1 = X(1) and Yk = X(k) − X(k−1) for 2 ≤ k ≤ n. Find the joint pdf of Y1, Y2, ..., Yn. Hint: note that (Y1, Y2, ..., Yn) = g(X(1), X(2), ..., X(n)), where g is invertible and differentiable. Use the change-of-variables formula to derive the joint pdf of Y1, Y2, ..., Yn.
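The change of variables (whose Jacobian is 1) shows the spacings are independent with Yk ~ Exp of rate (n − k + 1)λ, so the joint pdf factors as ∏k (n − k + 1)λ e^(−(n−k+1)λ yk). A Monte Carlo check of the implied means E(Yk) = 1/((n − k + 1)λ), reading Exp(2) as rate λ = 2 (an assumption; it could also mean mean 2):

```python
import random

random.seed(10)

lam, n = 2.0, 5        # reading Exp(2) as rate lambda = 2 (mean 1/2)
reps = 100_000

# Memorylessness makes the spacings independent with
#   Y_k ~ Exp(rate (n - k + 1) * lam),  so E(Y_k) = 1 / ((n - k + 1) * lam).
sums = [0.0] * n
for _ in range(reps):
    xs = sorted(random.expovariate(lam) for _ in range(n))
    prev = 0.0
    for k, xk in enumerate(xs):
        sums[k] += xk - prev
        prev = xk
means = [s / reps for s in sums]
theory = [1 / ((n - k) * lam) for k in range(n)]  # 0-indexed k: rate (n-k)*lam
print(means, theory)
```

Each empirical spacing mean should match its theoretical value 1/((n − k + 1)λ).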
Let Y1, Y2, ..., Yn be independent and identically distributed random variables such that for 0 < p < 1, P(Yi = 1) = p and P(Yi = 0) = q = 1 − p. (Such random variables are called Bernoulli random variables.) (a) Find the moment-generating function for the Bernoulli random variable Yi. (b) Find the moment-generating function for W = Y1 + Y2 + ⋯ + Yn. (c) What is the distribution of W?
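The standard answers: m_Y(t) = q + pe^t, and by independence m_W(t) = (q + pe^t)^n, which is the Binomial(n, p) MGF, so W ~ Binomial(n, p). An exact numerical check at illustrative values of p, n, t, computing the binomial MGF by direct summation:

```python
import math

p, n, t = 0.3, 8, 0.7     # illustrative values
q = 1 - p

# (a) m_Y(t) = q + p e^t;  (b) m_W(t) = (q + p e^t)^n by independence;
# (c) this equals sum_k C(n,k) p^k q^(n-k) e^(tk), the Binomial(n,p) MGF.
mgf_product = (q + p * math.exp(t)) ** n
mgf_binomial = sum(math.comb(n, k) * p**k * q**(n - k) * math.exp(t * k)
                   for k in range(n + 1))
print(mgf_product, mgf_binomial)
```

The two expressions agree to floating-point precision, illustrating the MGF uniqueness argument behind part (c).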