Say we have data x1, ..., xn, which are independent and identically distributed normal random variables with mean μ and variance 100. How often does this interval cover μ?
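The definition of the interval is lost in the source; a minimal simulation sketch, assuming the usual 95% interval x̄ ± 1.96·(10/√n) built from the known standard deviation √100 = 10 (the values of μ, n, and the trial count below are arbitrary choices, not from the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, trials = 5.0, 10.0, 25, 10_000   # sigma = sqrt(100) is known

x = rng.normal(mu, sigma, size=(trials, n))
xbar = x.mean(axis=1)
half = 1.96 * sigma / np.sqrt(n)               # half-width of the assumed 95% interval

# Fraction of simulated intervals that contain the true mean
coverage = np.mean((xbar - half <= mu) & (mu <= xbar + half))
```

With this interval the empirical coverage comes out close to the nominal 95%.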
Problem 7. Let X1, X2, ..., Xn be i.i.d. (independent and identically distributed) random variables with unknown mean μ and variance σ². In order to estimate μ and σ² from the data we consider the following estimates: μ̂ = (1/n) Σ Xi and σ̂² = (1/(n − 1)) Σ (Xi − μ̂)². Show that both these estimates are unbiased; that is, show that E(μ̂) = μ and E(σ̂²) = σ².
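A quick numerical check of the unbiasedness claim, averaging the sample mean and the 1/(n − 1) sample variance over many replications (the normal distribution and the specific values of μ, σ², n, and the replication count are arbitrary choices for the sketch):

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma2, n, reps = 3.0, 4.0, 10, 200_000

x = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))
mu_hat = x.mean(axis=1)           # sample mean, claimed unbiased for mu
s2_hat = x.var(axis=1, ddof=1)    # sample variance with 1/(n-1), claimed unbiased for sigma^2

mean_of_mu_hat = mu_hat.mean()    # should be close to mu = 3.0
mean_of_s2_hat = s2_hat.mean()    # should be close to sigma^2 = 4.0
```

Note `ddof=1` is what gives the 1/(n − 1) divisor; with the default `ddof=0` the average would sit near σ²(n − 1)/n instead.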
Suppose we have 5 independent and identically distributed random variables X1, X2, X3, X4, X5, each with the moment generating function M(t) = (1 − t/8)^(−2). Let the random variable Y be defined as Y = X1 + X2 + X3 + X4 + X5. The density function of Y is (a) Poisson with λ = 40 (b) Gamma with α = 10 and λ = 8 (c) Normal with μ = 40 and σ = 3.162 (d) Exponential with λ = 50 (e) Normal with μ = 50 and σ² = 15
If X1 and X2 are independent and identically distributed normal random variables with mean m and variance s², find the probability distribution of U = X1 − 3X2/2.
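A simulation sketch, reading the expression as U = X1 − (3/2)X2 (the grouping is ambiguous in the source; the values of m, s, and the replication count are arbitrary). A linear combination of independent normals is again normal, here with mean m − (3/2)m = −m/2 and variance s²(1 + 9/4) = (13/4)s²:

```python
import numpy as np

rng = np.random.default_rng(2)
m, s, reps = 2.0, 1.5, 500_000

x1 = rng.normal(m, s, reps)
x2 = rng.normal(m, s, reps)
u = x1 - 1.5 * x2          # U = X1 - (3/2) X2 under the assumed grouping

emp_mean = u.mean()        # theory: -m/2 = -1.0
emp_var = u.var()          # theory: (13/4) s^2 = 3.25 * 2.25 = 7.3125
```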
Let X1, ..., Xn be independent and identically distributed random variables with unknown mean μ and unknown variance σ². It is given that the sample variance S² is an unbiased estimator of σ². Suggest why the estimator X̄² − S²/n might be proposed for estimating μ²; justify your answer.
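The motivation is that E[X̄²] = Var(X̄) + (E[X̄])² = σ²/n + μ², so subtracting the unbiased estimate S²/n of σ²/n removes the bias. A simulation sketch (the normal distribution and parameter values are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)
mu, sigma, n, reps = 2.0, 3.0, 8, 400_000

x = rng.normal(mu, sigma, size=(reps, n))
xbar = x.mean(axis=1)
s2 = x.var(axis=1, ddof=1)

naive = (xbar**2).mean()                 # biased upward by sigma^2/n = 9/8
corrected = (xbar**2 - s2 / n).mean()    # should be close to mu^2 = 4.0
```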
Exercise 7. Let X1, X2, ... be independent, identically distributed random variables with E(X) = μ and Var(X) = 9, and let Yi = Xi/2. We also define Tn and An to be the sum and the sample mean, respectively, of the random variables Y1, ..., Yn. (1) Evaluate the mean and variance of Yn, Tn, and An. (2) Does Yn converge in probability? If so, to what value? (3) Does Tn converge in probability? If so, to what value? (4) Does An converge...
Let X1, X2, ..., Xn denote independent and identically distributed uniform random variables on the interval [0, 3β]. Obtain the maximum likelihood estimator β̂ for β. Use this estimator to provide an estimate of Var[X] when x1 = 1.3, x2 = 3.9, x3 = 2.2.
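Since the likelihood (1/(3β))^n is decreasing in β and requires 3β ≥ max xi, it is maximized at β̂ = max(xi)/3, and Var[X] for a Uniform[0, 3β] variable is (3β)²/12. A worked computation for the given data:

```python
# MLE for Uniform[0, 3*beta]: beta_hat = max(x) / 3,
# and the plug-in variance estimate is (3*beta_hat)^2 / 12.
x = [1.3, 3.9, 2.2]

beta_hat = max(x) / 3                 # 3.9 / 3 = 1.3
var_hat = (3 * beta_hat) ** 2 / 12    # 3.9^2 / 12 = 1.2675
```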
Let X1, X2, ..., Xn be independent random variables with mean μ and variance σ1². Let Y1, Y2, ..., Yn be independent random variables with mean μ and variance σ2². Let W = aX̄ + (1 − a)Ȳ, where 0 < a < 1. (a) Compute the expected value of W. (b) For what value of a is the variance of W a minimum?
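A numerical sketch of part (b): with Var(W) = a²σ1²/n + (1 − a)²σ2²/n, setting the derivative in a to zero gives the minimizer a* = σ2²/(σ1² + σ2²), independent of n. A grid search confirms this (the parameter values below are arbitrary):

```python
import numpy as np

# Assumed setup for the sketch: X's have variance s1sq, Y's have variance s2sq,
# both samples of size n, and Var(W) = a^2*s1sq/n + (1-a)^2*s2sq/n.
s1sq, s2sq, n = 4.0, 9.0, 20

a_grid = np.linspace(0.001, 0.999, 9999)
var_w = a_grid**2 * s1sq / n + (1 - a_grid)**2 * s2sq / n
a_best = a_grid[np.argmin(var_w)]

a_theory = s2sq / (s1sq + s2sq)   # 9/13, about 0.692
```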
3. Suppose X1, X2, ... are independent identically distributed random variables with mean 0 and variance 1. Let Sn denote the partial sum Sn = X1 + ... + Xn. Let Fn denote the information contained in X1, ..., Xn. Suppose m < n. (1) Compute E[(Sn − Sm) | Fm]. (2) Compute E[Sm(Sn − Sm) | Fm]. (3) Compute E[Sn² | Fm]. (Hint: write Sn² = (Sm + (Sn − Sm))² = Sm² + 2Sm(Sn − Sm) + (Sn − Sm)².) (4) Verify that Sn² − n is a martingale.
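A simulation sketch of part (4)'s martingale property, E[Sn² − n | Fm] = Sm² − m: fix one realization of X1, ..., Xm and average Sn² − n over many independent continuations (the ±1 coin-flip distribution and the values of m and n are arbitrary mean-0, variance-1 choices):

```python
import numpy as np

rng = np.random.default_rng(4)
m, n, reps = 5, 20, 400_000

# One fixed realization of X_1..X_m (the information in F_m)
x_past = rng.choice([-1.0, 1.0], size=m)
s_m = x_past.sum()

# Many independent continuations X_{m+1}..X_n
increments = rng.choice([-1.0, 1.0], size=(reps, n - m)).sum(axis=1)
s_n = s_m + increments

cond_mean = (s_n**2 - n).mean()   # should be close to s_m^2 - m
target = s_m**2 - m
```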
3. Suppose X1, X2, ... are independent identically distributed random variables with mean μ and variance σ². Let S0 = 0 and for n > 0 let Sn denote the partial sum Sn = X1 + ... + Xn. Let Fn denote the information contained in X1, ..., Xn. (1) Verify that Sn − nμ is a martingale. (2) Assuming μ = 0, verify that Sn² − nσ² is a martingale.
1. (40) Suppose that X1, X2, ..., Xn forms an independent and identically distributed sample from a normal distribution with mean μ and variance σ², both unknown. (a) Derive the sample variance, S², for this random sample. (b) Derive the maximum likelihood estimators (MLE) of μ and σ², denoted μ̂ and σ̂², respectively. (c) Find the MLE of μ³. (d) Derive the method of moments estimators of μ and σ², denoted μ̂_MOME and σ̂²_MOME, respectively. (e) Show that μ̂ and...