1. Let $Y_1, Y_2, \ldots, Y_n$ be independent and identically distributed $N(\mu, \sigma^2)$ random variables. Show that
$$P(\bar{Y}_n \le y) \;=\; P\!\left(\frac{\sqrt{n}\,(\bar{Y}_n - \mu)}{\sigma} \le \frac{\sqrt{n}\,(y - \mu)}{\sigma}\right) \;=\; \Phi\!\left(\frac{\sqrt{n}\,(y - \mu)}{\sigma}\right),$$
where $\Phi(\cdot)$ denotes the cumulative distribution function of the standard normal. [You need to show both the equalities.]
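Both equalities above can be sanity-checked by simulation. The sketch below compares the empirical distribution of the sample mean with $\Phi(\sqrt{n}(y-\mu)/\sigma)$; the values of $\mu$, $\sigma$, $n$, and $y$ are illustrative assumptions, not part of the problem.

```python
import math
import random

random.seed(0)
mu, sigma, n, y = 2.0, 3.0, 25, 2.5   # illustrative values (assumptions)
trials = 20000

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Empirical P(Ybar_n <= y) from many simulated sample means.
hits = 0
for _ in range(trials):
    ybar = sum(random.gauss(mu, sigma) for _ in range(n)) / n
    if ybar <= y:
        hits += 1
empirical = hits / trials

# Theoretical value Phi(sqrt(n) * (y - mu) / sigma) from the identity.
theoretical = phi(math.sqrt(n) * (y - mu) / sigma)
print(empirical, theoretical)
```

The two printed numbers should agree up to Monte Carlo noise.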
1. Let $X_1, X_2, \ldots, X_n$ be independent Normal$(\mu, \sigma^2)$ random variables. Let $\bar{Y}_n = n^{-1}\sum_{i=1}^{n} X_i$ denote a sequence of random variables. (a) Find $E(\bar{Y}_n)$ and $\mathrm{Var}(\bar{Y}_n)$ for all $n$ in terms of $\mu$ and $\sigma^2$. (b) Find the PDF of $\bar{Y}_n$ for all $n$. (c) Find the MGF of $\bar{Y}_n$ for all $n$.
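Part (a)'s answers, $E(\bar{Y}_n) = \mu$ and $\mathrm{Var}(\bar{Y}_n) = \sigma^2/n$, can be previewed with a quick Monte Carlo sketch; the parameter values below are illustrative assumptions.

```python
import random

random.seed(1)
mu, sigma, n = 5.0, 2.0, 10   # illustrative values (assumptions)
trials = 40000

# Simulate many sample means Ybar_n.
means = []
for _ in range(trials):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    means.append(sum(xs) / n)

emp_mean = sum(means) / trials
emp_var = sum((m - emp_mean) ** 2 for m in means) / (trials - 1)
print(emp_mean, emp_var)   # should be close to mu and sigma**2 / n
```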
Problem 7. Let $X_1, X_2, \ldots, X_n$ be i.i.d. (independent and identically distributed) random variables with unknown mean $\mu$ and variance $\sigma^2$. In order to estimate $\mu$ and $\sigma^2$ from the data we consider the following estimates:
$$\hat{\mu} = \frac{1}{n}\sum_{i=1}^{n} X_i, \qquad S^2 = \frac{1}{n-1}\sum_{i=1}^{n} (X_i - \hat{\mu})^2.$$
Show that both these estimates are unbiased. That is, show that $E(\hat{\mu}) = \mu$ and $E(S^2) = \sigma^2$.
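The unbiasedness claims can be checked numerically. The sketch below assumes a normal population for concreteness (any distribution with the stated moments would do) and uses illustrative parameter values.

```python
import random

random.seed(2)
mu, sigma, n = 1.0, 2.0, 8   # illustrative values (assumptions)
trials = 30000

# Average the two estimators over many independent samples.
sum_muhat, sum_s2 = 0.0, 0.0
for _ in range(trials):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    muhat = sum(xs) / n
    s2 = sum((x - muhat) ** 2 for x in xs) / (n - 1)   # note the n - 1
    sum_muhat += muhat
    sum_s2 += s2

avg_muhat = sum_muhat / trials   # should be close to mu
avg_s2 = sum_s2 / trials         # should be close to sigma**2
print(avg_muhat, avg_s2)
```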
3. Suppose $X_1, X_2, \ldots$ are independent identically distributed random variables with mean $\mu$ and variance $\sigma^2$. Let $S_0 = 0$ and for $n > 0$ let $S_n = X_1 + \cdots + X_n$ denote the partial sum. Let $\mathcal{F}_n$ denote the information contained in $X_1, \ldots, X_n$. (1) Verify that $S_n - n\mu$ is a martingale. (2) Assuming that $\mu = 0$, verify that $S_n^2 - n\sigma^2$ is a martingale.
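A necessary consequence of the martingale property is that $E[M_n]$ is constant in $n$ (here $0$, since $M_0 = 0$). The sketch below checks this for both $S_n - n\mu$ and $S_n^2 - n\sigma^2$ by averaging over simulated paths; the standard normal step distribution is an illustrative assumption.

```python
import random

random.seed(3)
mu, sigma = 0.0, 1.0        # mu = 0 so both martingale claims apply
trials, horizon = 50000, 20

# Accumulate M1_n = S_n - n*mu and M2_n = S_n^2 - n*sigma^2 over many paths.
m1 = [0.0] * (horizon + 1)
m2 = [0.0] * (horizon + 1)
for _ in range(trials):
    s = 0.0
    for n in range(1, horizon + 1):
        s += random.gauss(mu, sigma)
        m1[n] += s - n * mu
        m2[n] += s * s - n * sigma ** 2

avg_m1 = [v / trials for v in m1]   # each entry should be near 0
avg_m2 = [v / trials for v in m2]   # each entry should be near 0
print(avg_m1[horizon], avg_m2[horizon])
```

This only tests the constant-expectation consequence, not the full conditional-expectation property asked for in the problem.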
(10 marks) Let $X_1, X_2, \ldots$ be a sequence of independent and identically distributed random variables with mean $EX_1 = \mu$ and $\mathrm{Var}\,X_1 = \sigma^2$. Let $Y_1, Y_2, \ldots$ be another sequence of independent and identically distributed random variables with mean $EY_1 = \mu$ and $\mathrm{Var}\,Y_1 = \sigma^2$. Define the random variable
$$D_n = \frac{1}{\sqrt{2n\sigma^2}}\left(\sum_{i=1}^{n} X_i - \sum_{i=1}^{n} Y_i\right).$$
Prove that $D_n$ converges in distribution to a standard normal distribution, i.e., prove that
$$P(D_n \le x) \;\to\; \int_{-\infty}^{x} \frac{1}{\sqrt{2\pi}}\, e^{-t^2/2}\, dt$$
as $n \to \infty$ for every $x$.
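The claim can be previewed numerically by estimating $P(D_n \le x)$ for one $x$ and comparing with $\Phi(x)$. Normal samples and the parameter values are illustrative assumptions (for normal data $D_n$ is exactly standard normal, so this mainly checks the normalization $\sqrt{2n\sigma^2}$).

```python
import math
import random

random.seed(4)
mu, sigma, n = 3.0, 1.5, 100   # illustrative values (assumptions)
trials = 10000

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

x = 1.0
hits = 0
for _ in range(trials):
    sx = sum(random.gauss(mu, sigma) for _ in range(n))
    sy = sum(random.gauss(mu, sigma) for _ in range(n))
    d = (sx - sy) / math.sqrt(2 * n * sigma ** 2)
    if d <= x:
        hits += 1
empirical = hits / trials
print(empirical, phi(x))   # should agree up to Monte Carlo noise
```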
Let $X_1, \ldots, X_n$ be independent and identically distributed random variables with unknown mean $\mu$ and unknown variance $\sigma^2$. It is given that the sample variance $S^2$ is an unbiased estimator of $\sigma^2$. Suggest why the estimator $\bar{X}^2 - S^2/n$ might be proposed for estimating $\mu^2$; justify your answer.
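An estimator of the form $\bar{X}^2 - S^2/n$ is unbiased for $\mu^2$, since $E(\bar{X}^2) = \mu^2 + \sigma^2/n$ and $E(S^2) = \sigma^2$. A simulation sketch with illustrative (assumed) parameters:

```python
import random

random.seed(5)
mu, sigma, n = 2.0, 1.0, 10   # illustrative values (assumptions)
trials = 40000

# Average the estimator Xbar^2 - S^2/n over many independent samples.
total = 0.0
for _ in range(trials):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    s2 = sum((x - xbar) ** 2 for x in xs) / (n - 1)
    total += xbar ** 2 - s2 / n
avg = total / trials
print(avg, mu ** 2)   # the average should be close to mu**2
```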
Suppose we have 5 independent and identically distributed random variables $X_1, X_2, X_3, X_4, X_5$, each with the moment generating function $M_X(t) = e^{10t + 3t^2/2}$. Let the random variable $Y$ be defined as $Y = \sum_{i=1}^{5} X_i$. The density function of $Y$ is (a) Poisson with $\lambda = 40$ (b) Gamma with $\alpha = 10$ and $\lambda = 8$ (c) Normal with $\mu = 40$ and $\sigma = 3.162$ (d) Exponential with $\lambda = 50$ (e) Normal with $\mu = 50$ and $\sigma^2 = 15$
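Whatever the exact individual parameters, a sum of independent normal random variables is normal with the means and variances added. The sketch below assumes each $X_i \sim N(10, 3)$ (an assumption, chosen so that $Y$ would be $N(50, 15)$) and checks the mean and variance of the sum by simulation.

```python
import random

random.seed(6)
trials, k = 40000, 5
mu_i, var_i = 10.0, 3.0   # assumed per-variable parameters for illustration

# Simulate Y = X_1 + ... + X_5 many times.
ys = []
for _ in range(trials):
    ys.append(sum(random.gauss(mu_i, var_i ** 0.5) for _ in range(k)))

mean_y = sum(ys) / trials
var_y = sum((y - mean_y) ** 2 for y in ys) / (trials - 1)
print(mean_y, var_y)   # should be near k*mu_i = 50 and k*var_i = 15
```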
Let $Y_1, Y_2, \ldots, Y_n$ be independent and identically distributed random variables such that for $0 < p < 1$, $P(Y_i = 1) = p$ and $P(Y_i = 0) = q = 1 - p$. (Such random variables are called Bernoulli random variables.) (a) Find the moment-generating function for the Bernoulli random variable $Y_i$. (b) Find the moment-generating function for $W = Y_1 + Y_2 + \cdots + Y_n$. (c) What is the distribution of $W$?
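Part (b)'s answer, $M_W(t) = (q + pe^t)^n$ (the MGF of a Binomial$(n, p)$ variable), can be checked empirically at a single $t$; the values of $p$, $n$, and $t$ below are illustrative assumptions.

```python
import math
import random

random.seed(7)
p, n, t = 0.3, 12, 0.4   # illustrative values (assumptions)
q = 1 - p
trials = 50000

# Empirical MGF: average of exp(t * W) over simulated W.
emp_mgf = 0.0
for _ in range(trials):
    w = sum(1 if random.random() < p else 0 for _ in range(n))
    emp_mgf += math.exp(t * w)
emp_mgf /= trials

theory = (q + p * math.exp(t)) ** n   # (q + p e^t)^n, the Binomial(n, p) MGF
print(emp_mgf, theory)
```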
1. (40) Suppose that $X_1, X_2, \ldots, X_n$ forms an independent and identically distributed sample from a normal distribution with mean $\mu$ and variance $\sigma^2$, both unknown. (a) Derive the sample variance, $S^2$, for this random sample. (b) Derive the maximum likelihood estimators (MLEs) of $\mu$ and $\sigma^2$, denoted $\hat{\mu}$ and $\hat{\sigma}^2$, respectively. (c) Find the MLE of $\mu^3$. (d) Derive the method of moments estimators of $\mu$ and $\sigma^2$, denoted $\hat{\mu}_{MOME}$ and $\hat{\sigma}^2_{MOME}$, respectively. (e) Show that $\hat{\mu}$ and ...
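For a normal sample the MLEs in part (b) have the closed forms $\hat{\mu} = \bar{X}$ and $\hat{\sigma}^2 = n^{-1}\sum_{i=1}^n (X_i - \bar{X})^2$, and by the invariance property the MLE of $\mu^3$ in part (c) is $\hat{\mu}^3$. The sketch below (true parameters are illustrative assumptions) computes these and confirms numerically that they beat nearby parameter values in log-likelihood.

```python
import math
import random

random.seed(8)
mu, sigma, n = 2.0, 1.5, 500   # illustrative true parameters (assumptions)
xs = [random.gauss(mu, sigma) for _ in range(n)]

# Closed-form MLEs for a normal sample: note the 1/n, not 1/(n-1).
mu_hat = sum(xs) / n
sigma2_hat = sum((x - mu_hat) ** 2 for x in xs) / n

def loglik(m, s2):
    """Normal log-likelihood of the sample at mean m, variance s2."""
    return sum(-0.5 * math.log(2 * math.pi * s2) - (x - m) ** 2 / (2 * s2)
               for x in xs)

# The MLE should maximize the log-likelihood over nearby perturbations.
best = loglik(mu_hat, sigma2_hat)
assert all(loglik(mu_hat + dm, sigma2_hat + ds) <= best
           for dm in (-0.1, 0.1) for ds in (-0.1, 0.1))

mle_mu_cubed = mu_hat ** 3   # MLE of mu^3 by invariance
print(mu_hat, sigma2_hat, mle_mu_cubed)
```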