The mean squared error (MSE) of an estimator θ̂ of θ is defined as

MSE(θ̂) = E[(θ̂ − θ)²].

Prove that

MSE(θ̂) = Var(θ̂) + [bias(θ̂)]²,

where

bias(θ̂) = E(θ̂) − θ.
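The identity can be sanity-checked numerically before proving it. A minimal Monte Carlo sketch, using the sample maximum of a Uniform(0, θ) sample as a deliberately biased estimator (the distribution and parameter values are illustrative assumptions, not part of the problem):

```python
# Check numerically that MSE = Var(theta_hat) + bias(theta_hat)^2.
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 10, 200_000

# theta_hat = max(X_1, ..., X_n), a biased estimator of theta
est = rng.uniform(0.0, theta, size=(reps, n)).max(axis=1)

mse = np.mean((est - theta) ** 2)
var = np.var(est)              # ddof=0, matching the in-sample identity
bias = np.mean(est) - theta

print(mse, var + bias ** 2)    # the two numbers agree
```

In fact the identity holds exactly for the empirical moments (not just in expectation), which is a useful hint for the proof: expand E[(θ̂ − θ)²] around E(θ̂).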
2. (a) Define the bias of θ̂ as an estimator for the parameter θ. [2 marks]
(b) For independent random variables X1, X2, ..., Xn, assume that E(Xi) = µ and Var(Xi) = σ², i = 1, ..., n.
(i) Show that µ̂1 = (X1 + Xn)/2 is an unbiased estimator for µ and determine its variance. [3 marks]
(ii) Find the relative efficiency of µ̂1 to the unbiased estimator µ̂2 = X̄, the sample mean. [2 marks]
(iii) Is µ̂1 a consistent...
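A simulation sketch for parts (b)(i)–(ii): it checks that µ̂1 = (X1 + Xn)/2 is unbiased with variance σ²/2, and that the relative efficiency Var(X̄)/Var(µ̂1) comes out as 2/n. The normal distribution and the values of µ, σ, n are illustrative assumptions:

```python
# Compare mu_hat_1 = (X_1 + X_n)/2 with the sample mean X_bar.
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, reps = 5.0, 2.0, 20, 400_000

x = rng.normal(mu, sigma, size=(reps, n))
mu1 = (x[:, 0] + x[:, -1]) / 2       # average of first and last observation
xbar = x.mean(axis=1)                # sample mean

print(mu1.mean())                    # ≈ mu (unbiased)
print(mu1.var())                     # ≈ sigma^2 / 2 = 2.0
print(xbar.var() / mu1.var())        # relative efficiency ≈ 2/n = 0.1
```

Note that Var(µ̂1) = σ²/2 does not shrink as n grows, which is the key observation for part (iii).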
2. Suppose that X | θ ~ U(0, θ), the uniform distribution on the interval (0, θ). Assuming squared error loss, derive the Bayes estimator of θ with respect to the prior distribution Pa(α, θ0), the two-parameter Pareto model specified in (3.36): first by explicitly deriving the marginal density of X, obtaining an expression for the posterior density of θ, and evaluating E(θ | x); and secondly by identifying g(θ | x) by inspection and noting that it is a familiar distribution with a known mean.
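The two routes can be cross-checked numerically. The sketch below assumes the Pareto parameterization π(θ) = α θ0^α / θ^(α+1) on θ > θ0 (parameterizations vary, so compare with the book's (3.36)); the values of α, θ0 and x are made up for illustration:

```python
# Numerically integrate likelihood * prior and compare E(theta | x) with the
# mean of the conjugate posterior Pa(alpha + 1, max(theta0, x)).
import numpy as np

alpha, theta0, x = 3.0, 1.0, 2.5
m = max(theta0, x)                        # posterior support is theta > m

t = np.linspace(m, 1_000.0, 2_000_000)    # fine grid; the theta^-(alpha+2)
dt = t[1] - t[0]                          # tail beyond 1000 is negligible

# likelihood (1/theta on theta > x) times prior density
w = (1.0 / t) * (alpha * theta0**alpha / t ** (alpha + 1))

trap = lambda f: (f.sum() - 0.5 * (f[0] + f[-1])) * dt   # trapezoid rule
marg = trap(w)                            # marginal density of X at x
post_mean = trap(t * w) / marg            # E(theta | x)

# By inspection the posterior is Pa(alpha + 1, m), with mean (alpha+1)*m/alpha:
print(post_mean)                          # ≈ 4 * 2.5 / 3 ≈ 3.3333
```

The "by inspection" route: the posterior kernel θ^−(α+2) on θ > max(θ0, x) is again a two-parameter Pareto, with updated shape α + 1 and updated scale max(θ0, x).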
Problem 4. The standard error of an estimator is defined as the standard deviation of that estimator. In class we introduced the sample mean X̄n = (1/n) Σi Xi as an estimator of E[X], where the Xi are i.i.d. samples of the random variable X. What is the standard error of the estimator X̄n? Assume that the standard deviation of X is σ.
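A quick simulation sketch of the answer SE(X̄n) = σ/√n (the normal distribution and the values of σ and n are illustrative assumptions):

```python
# The empirical standard deviation of X_bar_n across many replications
# should match sigma / sqrt(n).
import numpy as np

rng = np.random.default_rng(2)
sigma, n, reps = 3.0, 25, 300_000

xbar = rng.normal(0.0, sigma, size=(reps, n)).mean(axis=1)
print(xbar.std())                  # ≈ sigma / sqrt(n) = 3 / 5 = 0.6
```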
Q1 and Q2 (please also show the steps):
Q1. Prove that MSE(θ̂) = Var(θ̂) + [Bias(θ̂)]², i.e., E[(θ̂ − θ)²] = E[(θ̂ − E(θ̂))²] + [E(θ̂) − θ]².
Q2. Suppose X1, X2, ..., Xn are i.i.d. Bernoulli random variables with probability of success p. It is known that p̂ = X̄ = (1/n) Σi Xi is an unbiased estimator for p.
1. Find E(p̂²) and show that p̂² is a biased estimator for p². (Hint: make use of the distribution of Σi Xi and the fact...)
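For Q2, the target identity is E(p̂²) = Var(p̂) + [E(p̂)]² = p(1 − p)/n + p², so p̂² overshoots p² by p(1 − p)/n. A simulation sketch (the values of p and n are assumptions):

```python
# Check that E(p_hat^2) = p^2 + p(1-p)/n for the Bernoulli sample proportion.
import numpy as np

rng = np.random.default_rng(3)
p, n, reps = 0.3, 10, 500_000

p_hat = rng.binomial(1, p, size=(reps, n)).mean(axis=1)

print(np.mean(p_hat ** 2))     # ≈ p^2 + p(1-p)/n = 0.09 + 0.021 = 0.111
```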
Given the estimator Θ̂ = X1 − 3X2 of the population mean μ, where X1 and X2 are from the same distribution with mean μ and standard deviation σ.
1. Find the following:
a. E(Θ̂)
b. Bias(Θ̂)
c. Is Θ̂ a biased estimator of μ? Please briefly explain your answer.
d. V(Θ̂)
e. MSE(Θ̂)
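Whatever the exact coefficients, parts (a)–(e) for a linear estimator of this form reduce to routine moment bookkeeping. A symbolic sketch for a generic Θ̂ = a·X1 + b·X2, into which the problem's coefficients can be substituted (independence of X1 and X2 is an added assumption for the variance):

```python
# Symbolic E, Bias, Var and MSE for Theta_hat = a*X1 + b*X2.
import sympy as sp

a, b, mu, sigma = sp.symbols("a b mu sigma")

E = (a + b) * mu                       # E(Theta_hat), since E(X1) = E(X2) = mu
bias = sp.simplify(E - mu)             # Bias(Theta_hat) = (a + b - 1)*mu
V = (a**2 + b**2) * sigma**2           # Var, assuming X1 and X2 independent
MSE = sp.expand(V + bias**2)           # MSE = Var + Bias^2

print(bias)                            # zero, i.e. unbiased, iff a + b = 1
print(MSE)
```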
Please give detailed steps. Thank you.
5. Let {X1, X2, ..., Xn} denote a random sample of size n from a population described by a random variable X. Denote the population mean of X by E(X) = µ and its variance by Var(X) = σ². Consider the following four estimators of the population mean µ:
(Xn−2 + Xn−1 + Xn)/3 (this is an example of an average using only part of the sample, the last 3 observations)
(this is an example of a weighted average)...
Suppose that E[θ̂1] = E[θ̂2] = θ, Var[θ̂1] = σ1², Var[θ̂2] = σ2², and Cov[θ̂1, θ̂2] = σ12. Consider the unbiased estimator θ̂3 = aθ̂1 + (1 − a)θ̂2. What value should be chosen for the constant a in order to minimize the variance, and thus the mean squared error, of θ̂3 as an estimator of θ?
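Since θ̂3 is unbiased for every a, minimizing MSE is the same as minimizing Var(θ̂3) = a²σ1² + (1 − a)²σ2² + 2a(1 − a)σ12. A symbolic sketch of setting the derivative to zero (s1, s2, s12 stand for σ1², σ2², σ12):

```python
# Minimize the variance of theta3 = a*theta1 + (1 - a)*theta2 over a.
import sympy as sp

a, s1, s2, s12 = sp.symbols("a s1 s2 s12")
var = a**2 * s1 + (1 - a) ** 2 * s2 + 2 * a * (1 - a) * s12

a_star = sp.solve(sp.diff(var, a), a)[0]
print(sp.simplify(a_star))   # (s2 - s12) / (s1 + s2 - 2*s12)
```

So the variance-minimizing choice is a* = (σ2² − σ12)/(σ1² + σ2² − 2σ12); for uncorrelated estimators (σ12 = 0) this reduces to the familiar inverse-variance weighting a* = σ2²/(σ1² + σ2²).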
How do I choose the answers to these two questions?
(MC, 1 point) To compute the MSE (mean squared error) we need:
○ All the Y's
○ All the X's
○ All the Y's and X's
○ No Y's or X's
○ We need more information.
Q4 (MC, 1 point) To compute the SSTO (total sum of squares) we need:
○ All the Y's
○ All the X's
○ All the Y's and X's
○ No Y's or X's
○ We need more information.
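The distinction the two questions are probing: SSTO = Σ(Yi − Ȳ)² involves only the Y's, while the regression MSE = SSE/(n − 2) is built from residuals, which require fitted values and hence the X's as well. A toy illustration with made-up simple-linear-regression data:

```python
# SSTO uses only Y; MSE needs the fit (and therefore X too).
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
n = len(y)

ssto = np.sum((y - y.mean()) ** 2)      # needs only the Y's

b1, b0 = np.polyfit(x, y, 1)            # slope, intercept: needs the X's too
sse = np.sum((y - (b0 + b1 * x)) ** 2)  # residual sum of squares
mse = sse / (n - 2)                     # regression mean squared error

print(ssto, mse)
```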
3. Consider a random sample Y1, ..., Yn from a Uniform[0, θ]. In class we discussed the method of moments estimator θ̃ = 2Ȳ and the maximum likelihood estimator θ̂ = max(Y1, ..., Yn), and we derived the bias and MSE of both estimators. With the intent to correct the bias of the MLE θ̂ we proposed the new estimator θ̂u = ((n + 1)/n) max(Y1, ..., Yn), where the subscript u stands for "unbiased."
(a) Find the MSE of θ̂u.
(b) Compare the MSE of θ̂u to the MSE of θ̂, the original...
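A Monte Carlo sketch for checking the answers, assuming the standard bias correction θ̂u = ((n + 1)/n) max Yi and illustrative values of θ and n. The simulated MSEs can be compared with the closed forms MSE(θ̂u) = θ²/(n(n + 2)), MSE(θ̂) = 2θ²/((n + 1)(n + 2)) and MSE(θ̃) = θ²/(3n):

```python
# Compare the MSEs of the MoM estimator, the MLE, and the bias-corrected MLE.
import numpy as np

rng = np.random.default_rng(4)
theta, n, reps = 3.0, 10, 300_000

y = rng.uniform(0.0, theta, size=(reps, n))
mle = y.max(axis=1)                  # theta_hat = max(Y_i)
u = (n + 1) / n * mle                # bias-corrected MLE
mom = 2 * y.mean(axis=1)             # method of moments: 2 * Y_bar

mse = lambda est: np.mean((est - theta) ** 2)
print(mse(u))     # ≈ theta^2 / (n(n+2))        = 9/120  = 0.075
print(mse(mle))   # ≈ 2 theta^2 / ((n+1)(n+2))  = 18/132 ≈ 0.136
print(mse(mom))   # ≈ theta^2 / (3n)            = 9/30   = 0.3
```

Note that for this n the biased MLE already beats the MoM estimator, and the bias-corrected version does better still.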
Let δ be a minimax estimator of g(θ) under squared error loss. Show that aδ + b is minimax for ag(θ) + b.
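Under squared error loss the risk transforms by a clean scaling; a sketch of the central step (assuming a ≠ 0):

```latex
% Risk of a*delta + b as an estimator of a*g(theta) + b:
\begin{align*}
R(\theta,\, a\delta + b)
  &= \mathbb{E}_\theta\!\left[\bigl((a\delta + b) - (a\,g(\theta) + b)\bigr)^2\right] \\
  &= a^2\,\mathbb{E}_\theta\!\left[(\delta - g(\theta))^2\right]
   = a^2\, R(\theta, \delta).
\end{align*}
```

Maximum risks are therefore scaled by the common factor a², and any estimator δ* of ag(θ) + b corresponds to the estimator (δ* − b)/a of g(θ); minimaxity of δ for g(θ) then forces aδ + b to be minimax for ag(θ) + b.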