7. Show that E(X̄) = μ and Var(X̄) = σ²/n if X₁, . . . , Xₙ are independent...
2. Let X₁, X₂, . . . , Xₙ denote independent and identically distributed random variables with variance σ². Which of the following is sufficient to conclude that the estimator T = f(X₁, . . . , Xₙ) of a parameter θ is consistent (fully justify your answer): (a) Var(T) → 0; (b) E(T) = (n − 1)θ/n and Var(T) → 0; (c) E(T) = θ; (d) E(T) = θ and Var(T) = σ².
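A quick Monte Carlo sketch (my own illustration, not part of the problem): an estimator with E(Tₙ) = θ and Var(Tₙ) → 0 is consistent, since Chebyshev gives P(|Tₙ − θ| > ε) ≤ Var(Tₙ)/ε². Here Tₙ is taken to be the sample mean of N(θ, 1) draws, an assumed example.

```python
# Chebyshev-style consistency check: the miss probability
# P(|T_n - theta| > eps) should shrink toward 0 as n grows.
import numpy as np

rng = np.random.default_rng(0)
theta, eps, reps = 5.0, 0.2, 20_000

def miss_rate(n):
    # T_n = sample mean of n i.i.d. N(theta, 1) draws, repeated `reps` times
    T = rng.normal(theta, 1.0, size=(reps, n)).mean(axis=1)
    return np.mean(np.abs(T - theta) > eps)

rates = [miss_rate(n) for n in (10, 100, 1000)]
print(rates)  # decreasing toward 0
```

The decreasing miss rates illustrate why unbiasedness alone (options with no variance condition) is not enough: the variance must also vanish.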
8. Let X₁, X₂, . . . , Xₙ all be distributed Normal(μ, σ²), and let X₁, X₂, . . . , Xₙ be mutually independent. (a) Find the distribution of U = Σᵢ₌₁ᵐ Xᵢ for a positive integer m ≤ n. (b) Find the distribution of Z², where Z = (U − mμ)/(σ√m). Hint: can the solution from problem #2 be applied here for specific values of a and b?
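A numeric check of the two facts involved (my illustration, with assumed values μ = 2, σ = 3, m = 5): a sum of m i.i.d. N(μ, σ²) variables is N(mμ, mσ²), and a standardized normal squared is chi-square with 1 degree of freedom, so it has mean 1 and variance 2.

```python
# Simulate U = X_1 + ... + X_m and its standardization Z, then
# compare empirical moments with the theoretical ones.
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, m, reps = 2.0, 3.0, 5, 200_000
U = rng.normal(mu, sigma, size=(reps, m)).sum(axis=1)
print(U.mean(), U.var())            # ~ m*mu = 10 and m*sigma^2 = 45
Z = (U - m * mu) / (sigma * np.sqrt(m))
print((Z**2).mean(), (Z**2).var())  # ~ 1 and ~ 2 (chi-square, 1 df)
```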
Let X₁, X₂, . . . , Xₙ be independent random variables with E(Xᵢ) = μ and Var(Xᵢ) = σ² for all i = 1, 2, . . . , n. Let X̄ = (1/n)(X₁ + X₂ + · · · + Xₙ) be the average of those random variables. Find E(X̄) and Var(X̄).
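For reference, linearity gives E(X̄) = (1/n)ΣE(Xᵢ) = μ, and independence gives Var(X̄) = (1/n²)ΣVar(Xᵢ) = σ²/n. A simulation sketch with assumed example values μ = 3, σ = 2, n = 50:

```python
# Empirical mean and variance of X_bar over many replications
# should match mu and sigma^2 / n.
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, n, reps = 3.0, 2.0, 50, 100_000
X_bar = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
print(X_bar.mean())  # ~ mu = 3
print(X_bar.var())   # ~ sigma^2 / n = 4/50 = 0.08
```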
Observations X₁, . . . , Xₙ are independent and identically distributed, following the PDF f(xᵢ) = θxᵢ^(θ−1) with 0 < xᵢ < 1 for all i. The parameter θ is an unknown positive number. Find the ML estimator of θ.
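Outline of the derivation: the log-likelihood is ℓ(θ) = n ln θ + (θ − 1)Σ ln xᵢ, and setting ℓ′(θ) = 0 gives the candidate MLE θ̂ = −n / Σ ln Xᵢ. A simulation sketch checking this (sampling via the inverse-CDF U^(1/θ), since the CDF here is x^θ):

```python
# Draw from the density theta * x^(theta-1) on (0,1) by inverse
# transform, then evaluate the candidate MLE.
import numpy as np

rng = np.random.default_rng(3)
theta, n = 2.5, 100_000
X = rng.uniform(size=n) ** (1.0 / theta)  # CDF of X is x**theta
theta_hat = -n / np.log(X).sum()
print(theta_hat)  # should be close to theta = 2.5
```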
6.4-2. X₁, X₂, . . . , Xₙ i.i.d. ∼ N(μ, σ²). Assume μ is known; show that θ̂ = (1/n) Σᵢ₌₁ⁿ (Xᵢ − μ)² is the MLE for σ² and show that it is unbiased.
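A numeric check of the unbiasedness claim (illustrative values μ = 1, σ = 2): with μ known, each (Xᵢ − μ)² has mean σ², so the 1/n average is unbiased for σ² and no 1/(n − 1) correction is needed here.

```python
# Average of (X_i - mu)^2 with known mu, repeated many times;
# its mean should match sigma^2.
import numpy as np

rng = np.random.default_rng(4)
mu, sigma, n, reps = 1.0, 2.0, 10, 200_000
X = rng.normal(mu, sigma, size=(reps, n))
theta_hat = ((X - mu) ** 2).mean(axis=1)
print(theta_hat.mean())  # ~ sigma^2 = 4
```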
Problem 7. Let X₁, X₂, . . . , Xₙ be i.i.d. (independent and identically distributed) random variables with unknown mean μ and variance σ². In order to estimate μ and σ² from the data we consider the following estimates. Show that both these estimates are unbiased; that is, show that E(μ̂) = μ and E(σ̂²) = σ².
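The estimates themselves are garbled in the transcription; assuming they are the usual pair, the sample mean μ̂ = (1/n)ΣXᵢ and the sample variance with the 1/(n − 1) factor, both are unbiased, which this simulation sketch illustrates:

```python
# Empirical check of unbiasedness for the (assumed) standard estimators.
import numpy as np

rng = np.random.default_rng(5)
mu, sigma, n, reps = 0.5, 1.5, 8, 200_000
X = rng.normal(mu, sigma, size=(reps, n))
mu_hat = X.mean(axis=1)
s2_hat = X.var(axis=1, ddof=1)  # ddof=1 gives the 1/(n-1) normalization
print(mu_hat.mean())  # ~ mu = 0.5
print(s2_hat.mean())  # ~ sigma^2 = 2.25
```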
6. Suppose X₁, Y₁, X₂, Y₂, . . . , Xₙ, Yₙ are independent rv's, with Xᵢ and Yᵢ both N(μᵢ, σ²). All parameters μᵢ, i = 1, . . . , n, and σ² are unknown. For example, Xᵢ and Yᵢ may be repeated measurements on a laboratory specimen from the ith individual, with μᵢ representing the amount of some antigen in the specimen; the measuring instrument is inaccurate, with normally distributed errors of constant variability. Let Zᵢ = (Xᵢ − Yᵢ)/√2. (a) Consider the estimate σ̂² = ... (b) Show that the...
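A sketch of the key fact behind the construction (my illustration, with arbitrary μᵢ values): Zᵢ = (Xᵢ − Yᵢ)/√2 is N(0, σ²) no matter what μᵢ is, since μᵢ cancels in the difference and Var(Xᵢ − Yᵢ) = 2σ².

```python
# The Z_i pool into a single N(0, sigma^2) sample even though
# each pair (X_i, Y_i) has its own unknown mean mu_i.
import numpy as np

rng = np.random.default_rng(6)
sigma, n, reps = 1.8, 20, 50_000
mu = rng.uniform(-5, 5, size=n)          # arbitrary unknown means
X = rng.normal(mu, sigma, size=(reps, n))
Y = rng.normal(mu, sigma, size=(reps, n))
Z = (X - Y) / np.sqrt(2)
print(Z.mean())  # ~ 0
print(Z.var())   # ~ sigma^2 = 3.24
```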
3. Suppose X₁, X₂, . . . are independent identically distributed random variables with mean μ and variance σ². Let S₀ = 0 and for n > 0 let Sₙ denote the partial sum Sₙ = X₁ + · · · + Xₙ. Let Fₙ denote the information contained in X₁, . . . , Xₙ. (1) Verify that Sₙ − nμ is a martingale. (2) Assuming that μ = 0, verify that Sₙ² − nσ² is a martingale.
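A numeric illustration of the unconditional consequences of these two martingale facts: E(Sₙ − nμ) = 0 and, when μ = 0, E(Sₙ² − nσ²) = 0. The Uniform(−1, 1) steps below are an assumed example, so μ = 0 and σ² = 1/3.

```python
# Simulate many length-n random walks and check E(S_n) = 0
# and E(S_n^2) = n * sigma^2 = 40/3.
import numpy as np

rng = np.random.default_rng(7)
n, reps = 40, 200_000
steps = rng.uniform(-1, 1, size=(reps, n))
S_n = steps.sum(axis=1)
print(S_n.mean())       # ~ n*mu = 0
print((S_n**2).mean())  # ~ n*sigma^2 = 40/3
```

The full martingale proof conditions on Fₙ, which the simulation cannot show directly; this only checks the resulting constant expectations.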
4. Let X₁, X₂, . . . be independent random variables satisfying E(Xₙ⁴) ≤ B for some finite B > 0. (a) Show that Yₙ = Xₙ − E(Xₙ) are independent with E(Yₙ) = 0 and E(Yₙ⁴) ≤ 16B. (b) Show that for Ȳₙ = (Y₁ + · · · + Yₙ)/n,
E(Ȳₙ⁴) = (1/n⁴) Σᵢ₌₁ⁿ E(Yᵢ⁴) + (6/n⁴) Σᵢ<ⱼ E(Yᵢ²)E(Yⱼ²) ≤ 16B/n³ + 48B/n².
(c) Show that Σₙ P(|Ȳₙ| > ε) < ∞ and conclude Ȳₙ → 0 almost surely. (d) Show that (X₁ + . . .
NB: Please do it for X₁, X₂, . . . , Xₙ independent and NOT identically distributed, instead of identically distributed. Let X₁, X₂, . . . , Xₙ be independent and identically distributed standard uniform random variables. Find the following expectations: (a) E[max(X₁, X₂, . . . , Xₙ)] (b) E[min(X₁, X₂, . . . , Xₙ)]
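For the i.i.d. Uniform(0, 1) version, the closed forms are standard: P(max ≤ x) = xⁿ gives E[max] = n/(n + 1), and by the symmetry min(Xᵢ) = 1 − max(1 − Xᵢ), E[min] = 1/(n + 1). A simulation sketch checking this with n = 5:

```python
# Empirical means of the sample max and min of 5 standard uniforms.
import numpy as np

rng = np.random.default_rng(8)
n, reps = 5, 200_000
X = rng.uniform(size=(reps, n))
print(X.max(axis=1).mean())  # ~ n/(n+1) = 5/6
print(X.min(axis=1).mean())  # ~ 1/(n+1) = 1/6
```

In the non-identically-distributed variant the request asks for, these closed forms no longer apply; one instead works from P(max ≤ x) = Πᵢ Fᵢ(x) with the individual CDFs Fᵢ.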