a) The cumulative distribution function of ... is given by ...
b) The pdf of ... is given by ...
c) ...
So it is an unbiased estimator, ...
But ..., so it is an unbiased estimator of ...
2. Let $Y_1, \dots, Y_n$ be a random sample from an Exponential distribution with density function ...
5. Let $Y_1, Y_2, \dots, Y_n$ be a random sample of size $n$ from the pdf ... (a) Show that $\hat{\theta} = \bar{Y}$ is an unbiased estimator for $\theta$. (b) Show that $\hat{\theta} = \bar{Y}$ is a minimum-variance estimator for $\theta$.
4. Let $Y_1, Y_2, \dots, Y_n$ be a random sample from some pdf/pmf $f(y;\theta)$. Let $W$ be a point estimator $h(Y_1, Y_2, \dots, Y_n)$ for $\theta$. The bias of $W$ as a point estimator for $\theta$ is defined as $\mathrm{Bias}_\theta(W) = E_\theta(W) - \theta$. The mean square error of $W$ is defined as $\mathrm{MSE}_\theta(W) = E_\theta\!\left[(W - \theta)^2\right]$. (a) Using properties of expected values, and the definition of variance from PSTAT 120A/B, show that $\mathrm{MSE}_\theta(W) = \mathrm{Var}_\theta(W) + \left[\mathrm{Bias}_\theta(W)\right]^2$ ...
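For reference, part (a) follows by adding and subtracting $E_\theta(W)$ inside the square:

```latex
\begin{aligned}
\mathrm{MSE}_\theta(W) &= E_\theta\!\left[(W-\theta)^2\right]
  = E_\theta\!\left[\big(W - E_\theta(W) + E_\theta(W) - \theta\big)^2\right] \\
&= E_\theta\!\left[\big(W - E_\theta(W)\big)^2\right]
  + 2\big(E_\theta(W)-\theta\big)\,E_\theta\!\left[W - E_\theta(W)\right]
  + \big(E_\theta(W)-\theta\big)^2 \\
&= \mathrm{Var}_\theta(W) + \left[\mathrm{Bias}_\theta(W)\right]^2,
\end{aligned}
```

since the cross term vanishes: $E_\theta\!\left[W - E_\theta(W)\right] = 0$.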
Let Yı, Y2, ...,Yn be an iid sample from a population distribution described by the pdf fy(y|0) = (@+ 1) yº, o<y<1 for 0> - -1. (a) Find the MOM estimator of 0. (b) Find the maximum likelihood estimator (MLE) of 0. (c) Find the MLE of the population mean E(Y) = 0 +1 0 + 2 You do not need to prove that the above is true. Just find its MLE.
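A minimal numerical sketch of (a) and (b), not part of the original problem (the true $\theta = 2$ and sample size are my choices): simulate from $f_Y(y\mid\theta) = (\theta+1)y^\theta$ via the inverse CDF $F(y) = y^{\theta+1}$ and compare the MOM and MLE estimates.

```python
import math
import random

random.seed(0)
theta_true = 2.0
n = 200_000
# Inverse-CDF sampling: F(y) = y^(theta+1) on (0,1), so Y = U^(1/(theta+1)).
ys = [random.random() ** (1.0 / (theta_true + 1.0)) for _ in range(n)]

# MOM: solve ybar = (theta+1)/(theta+2) for theta.
ybar = sum(ys) / n
theta_mom = (2.0 * ybar - 1.0) / (1.0 - ybar)

# MLE: maximize n*log(theta+1) + theta*sum(log y_i).
theta_mle = -n / sum(math.log(y) for y in ys) - 1.0

print(theta_mom, theta_mle)  # both land close to theta_true = 2
```

Both estimators settle near the true $\theta$ as $n$ grows, which is a useful sanity check on the algebra.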
8. Let $Y_1, \dots, Y_n$ denote a random sample of size $n$ from an exponential distribution with density function given by $f_Y(y) = \frac{1}{\theta}\, e^{-y/\theta}$, $y > 0$. (a) Show that $\hat{\theta}_1 = nY_{(1)}$ is an unbiased estimator for $\theta$ and find $\mathrm{MSE}(\hat{\theta}_1)$. (Hint: What is the distribution of $Y_{(1)}$?) (b) Show that $\hat{\theta}_2 = \bar{Y}$ is an unbiased estimator for $\theta$ and find $\mathrm{MSE}(\hat{\theta}_2)$. (c) Find the efficiency of $\hat{\theta}_1$ relative to $\hat{\theta}_2$. Which estimator is "better" (i.e. more efficient)?
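A hedged simulation sketch of the comparison (the values $\theta = 3$, $n = 10$ are my choices): $Y_{(1)} \sim$ Exponential with mean $\theta/n$, so $nY_{(1)}$ is unbiased but keeps variance $\theta^2$, while $\bar{Y}$ has variance $\theta^2/n$.

```python
import random

random.seed(1)
theta, n, reps = 3.0, 10, 50_000
est1, est2 = [], []
for _ in range(reps):
    # random.expovariate takes a rate, so 1/theta gives mean-theta draws.
    ys = [random.expovariate(1.0 / theta) for _ in range(n)]
    est1.append(n * min(ys))      # theta_hat_1 = n * Y_(1)
    est2.append(sum(ys) / n)      # theta_hat_2 = Ybar

mean1 = sum(est1) / reps
mean2 = sum(est2) / reps
var1 = sum((e - mean1) ** 2 for e in est1) / reps
var2 = sum((e - mean2) ** 2 for e in est2) / reps
print(mean1, mean2)  # both near theta = 3 (unbiasedness)
print(var1 / var2)   # near n = 10: Ybar is about n times as efficient
```

The variance ratio concentrating near $n$ matches the theoretical relative efficiency in part (c).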
(1 point) Let $Y_1, Y_2, \dots, Y_n$ be a random sample from the probability density function $f(y \mid \alpha) = \frac{(\alpha+1)\,y^{\alpha}}{5^{\alpha+1}}$ for $0 < y < 5$, and $0$ otherwise, for $\alpha > -1$. Find an estimator for $\alpha$ using the method of moments.
Let $Y_1, Y_2, \dots, Y_n$ be iid from a population following the shifted exponential distribution with scale parameter $\beta = 1$. The pdf of the population distribution is given by $f_Y(y \mid \theta) = e^{-(y-\theta)}\, I(y > \theta)$. The "shift" $\theta > 0$ is the only unknown parameter. (a) Find $L(\theta \mid \mathbf{y})$, the likelihood function of $\theta$. (b) Find a sufficient statistic for $\theta$ using the Factorization Theorem. (Hint: $\theta$ is bounded above by $Y_{(1)} = \min\{Y_1, Y_2, \dots, Y_n\}$.) (c) ...
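A small illustration of the hint, with my own choice of $\theta = 1.5$ and a grid search (an assumption-laden sketch, not the required analytic work): on $\theta \le y_{(1)}$ the log-likelihood $\ell(\theta) = n\theta - \sum y_i$ is increasing in $\theta$, and the likelihood is $0$ beyond $y_{(1)}$, so it peaks at the sufficient statistic $y_{(1)}$.

```python
import math
import random

random.seed(2)
theta_true = 1.5
ys = [theta_true + random.expovariate(1.0) for _ in range(100)]
y1 = min(ys)

def log_lik(theta, ys):
    if theta > min(ys):               # support constraint: every y_i must exceed theta
        return -math.inf
    return sum(theta - y for y in ys)  # log of exp(-(y_i - theta)) summed

grid = [i * 0.001 for i in range(0, 3001)]  # candidate thetas in [0, 3]
best = max(grid, key=lambda t: log_lik(t, ys))
print(best, y1)  # the best grid point sits just below y_(1)
```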
3. Consider a random sample $Y_1, \dots, Y_n$ from a Uniform$[0, \theta]$. In class we discussed the method of moments estimator $\hat{\theta} = 2\bar{Y}$ and the maximum likelihood estimator $\hat{\theta} = \max\{Y_1, \dots, Y_n\}$. We derived the Bias and MSE for both estimators. With the intent to correct the bias of the MLE $\hat{\theta}$ we proposed the following new estimator $\hat{\theta}_u = \frac{n+1}{n}\max\{Y_1, \dots, Y_n\}$, where the subscript $u$ stands for "unbiased." (a) Find the MSE of $\hat{\theta}_u$. (b) Compare the MSE of $\hat{\theta}_u$ to the MSE of $\hat{\theta}$, the original ...
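A hedged Monte Carlo sketch comparing the three Uniform$[0,\theta]$ estimators (the values $\theta = 1$, $n = 10$ are my choices, not from the problem):

```python
import random

random.seed(3)
theta, n, reps = 1.0, 10, 100_000
mse = {"mom": 0.0, "mle": 0.0, "unbiased": 0.0}
for _ in range(reps):
    ys = [random.uniform(0, theta) for _ in range(n)]
    m = max(ys)
    for name, est in (("mom", 2 * sum(ys) / n),
                      ("mle", m),
                      ("unbiased", (n + 1) / n * m)):
        mse[name] += (est - theta) ** 2 / reps

print(mse)
# Theory to check against: MSE(2*Ybar) = theta^2/(3n),
# MSE(max) = 2*theta^2/((n+1)(n+2)), MSE(theta_u) = theta^2/(n(n+2)).
```

The simulated MSEs reproduce the ordering $\hat{\theta}_u < \hat{\theta}_{\mathrm{mle}} < \hat{\theta}_{\mathrm{mom}}$, which is the point of part (b).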
1. Let $Y_1, \dots, Y_n$ be a random sample from a population with density function ... $0$, otherwise. (a) Find the method of moments estimator of $\theta$. (b) Show that $Y_{(n)} = \max(Y_1, \dots, Y_n)$ is sufficient for $\theta$. (Hint: Recall the indicator function given by $I(A) = 1$ if $A$ is true and $0$ otherwise.) (c) Determine the density function of $Y_{(n)}$ and hence find a function of $Y_{(n)}$ that is an unbiased estimator of $\theta$. (d) Find $c$ so ...
Let $Y_1, \dots, Y_n$ be an independent and identically distributed sample from the distribution with density function $f(y) = 3y^2$, for $0 \le y \le 1$. (a) Show that the sample mean $\bar{Y}$ converges in probability to some constant $c$. Find $c$. (b) Find a function that converges in probability to $\log(c)$.
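A numerical illustration of both parts (sample size is my choice): draw from $f(y) = 3y^2$ via the inverse CDF ($F(y) = y^3$, so $Y = U^{1/3}$); $\bar{Y}$ should settle near $c = E(Y) = 3/4$, and by the continuous mapping theorem $\log(\bar{Y})$ near $\log(3/4)$.

```python
import math
import random

random.seed(4)
n = 500_000
# Inverse-CDF sampling from f(y) = 3y^2 on (0,1).
ybar = sum(random.random() ** (1.0 / 3.0) for _ in range(n)) / n
print(ybar, math.log(ybar))  # near 0.75 and log(0.75)
```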
1. (a) Let $Y_1, \dots, Y_n$ be a random sample from a distribution with mean $\theta$ and finite variance $\sigma^2$. Find the BLUE of $\theta$ and justify that it is, in fact, the Best Linear Unbiased Estimator. ... sample variance.
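One standard route for part (a), sketched here for reference (the weight notation $a_i$ is mine): any linear unbiased estimator has the form $\sum_{i=1}^n a_i Y_i$ with $\sum_{i=1}^n a_i = 1$, and

```latex
\mathrm{Var}\Big(\sum_{i=1}^n a_i Y_i\Big)
= \sigma^2 \sum_{i=1}^n a_i^2
= \sigma^2 \sum_{i=1}^n \Big(a_i - \tfrac{1}{n}\Big)^2 + \frac{\sigma^2}{n}
\;\ge\; \frac{\sigma^2}{n},
```

with equality iff $a_i = 1/n$ for all $i$, so $\bar{Y}$ attains the minimum variance among linear unbiased estimators.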