Let $f(\vec{X})$ be some estimator, and let $y$ be the “true” value that $f(\vec{X})$...
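For reference, the standard definitions these problems use, stated in this notation (a minimal summary, not taken from the truncated source text):
\[
\operatorname{Bias}\big(f(\vec{X})\big) = E\big[f(\vec{X})\big] - y, \qquad
\operatorname{MSE}\big(f(\vec{X})\big) = E\big[\big(f(\vec{X}) - y\big)^2\big]
= \operatorname{Var}\big(f(\vec{X})\big) + \operatorname{Bias}^2\big(f(\vec{X})\big).
\]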
Problem 3. (06.31) Let $X_1, \ldots, X_n$ be iid $N(\mu, \sigma^2)$, and let $\hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}(X_i - \bar{X})^2$ denote an estimator of $\sigma^2$. Find the bias, variance, and mean-squared error of this estimator.
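A sketch of the computation, assuming the estimator is the $1/n$ plug-in variance written above: since $\sum_{i=1}^{n}(X_i - \bar{X})^2 / \sigma^2 \sim \chi^2_{n-1}$,
\[
E[\hat{\sigma}^2] = \frac{n-1}{n}\sigma^2, \qquad
\operatorname{Bias}(\hat{\sigma}^2) = -\frac{\sigma^2}{n}, \qquad
\operatorname{Var}(\hat{\sigma}^2) = \frac{2(n-1)}{n^2}\sigma^4, \qquad
\operatorname{MSE}(\hat{\sigma}^2) = \frac{2n-1}{n^2}\sigma^4 .
\]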
Problem 3. Consider extending the linear MMSE estimator to the case where our estimate of a random variable $Y$ is based on observations of multiple random variables, say $X_1, X_2, \ldots, X_N$. Then our linear MMSE estimator can be written in the following form: $\hat{Y} = \sum_{k=1}^{N} a_k X_k$. (a) Show that the optimal values of $a_1, a_2, \ldots, a_N$ for the linear MMSE estimator are given by $\mathbf{a} = \mathbf{C}_{XX}^{-1}\,\mathbf{c}_{XY}$, where $\mathbf{C}_{XX}$ is the covariance matrix of $X_1, X_2, \ldots, X_N$ and $\mathbf{c}_{XY}$ is a cross-correlation vector, defined as $\mathbf{c}_{XY} = \big(E[X_1 Y], \ldots, E[X_N Y]\big)^{T}$. (b)...
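A numerical sanity check of (a) in R (an illustration only; the choice N = 3, the true weights, and the noise level are made up). The weights solve the normal equations $\mathbf{C}_{XX}\mathbf{a} = \mathbf{c}_{XY}$:

set.seed(1)
N <- 3; m <- 100000
# made-up zero-mean example: Y is a fixed linear blend of the X's plus noise
X <- matrix(rnorm(m * N), m, N)
Y <- X %*% c(0.5, -1, 2) + rnorm(m)
Cxx <- cov(X)                          # covariance matrix of X_1, ..., X_N
cxy <- as.vector(crossprod(X, Y) / m)  # cross-correlation vector E[X_k Y]
a <- solve(Cxx, cxy)                   # optimal weights a = Cxx^{-1} cxy
a                                      # should be close to c(0.5, -1, 2)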
Problem 3. Let r be an unknown constant. Let W be an exponential random variable with parameter $\lambda = 1/3$. Let $X = r + W$. (a) What is the maximum likelihood estimator of r based on a single observation X? (b) What is the mean-squared error of the estimator from part (a)? (c) Is the estimator from part (a) biased or unbiased?
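A Monte Carlo check of (b) and (c) in R (r = 4 is made up; it assumes the answer to (a) is $\hat{r} = X$, since the likelihood $\lambda e^{-\lambda(x - r)}$ increases in r on the feasible set $r \le x$):

set.seed(1)
r <- 4; lambda <- 1/3; reps <- 100000
X <- r + rexp(reps, rate = lambda)  # X = r + W, W ~ Exp(lambda)
rhat <- X                           # assumed MLE from part (a)
mean(rhat) - r                      # empirical bias, near E[W] = 1/lambda = 3
mean((rhat - r)^2)                  # empirical MSE, near E[W^2] = 2/lambda^2 = 18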
SOLVE the following in R code: Let $X_1, \ldots, X_n$ iid $\sim U(0, \theta)$. We are going to compare two estimators for $\theta$: $\hat{\theta}_1 = 2\bar{X}$, the method of moments estimator, and $\hat{\theta}_2 = \max(X_1, \ldots, X_n)$, the maximum likelihood estimator. I. Generate 50,000 samples of size n = 50 from U(0, 5). For each sample compute both $\hat{\theta}_1$ and $\hat{\theta}_2$ (Hint: You can use the R command max(v) to find the maximum entry of a vector v). The results should be collected in two vectors of length...
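A sketch of part I under the stated settings (variable names are my own; the last two lines preview the bias/MSE comparison the truncated later parts presumably ask for):

set.seed(1)
n <- 50; theta <- 5; reps <- 50000
theta1 <- numeric(reps)   # method of moments: 2 * sample mean
theta2 <- numeric(reps)   # MLE: sample maximum
for (i in seq_len(reps)) {
  x <- runif(n, 0, theta)
  theta1[i] <- 2 * mean(x)
  theta2[i] <- max(x)
}
c(bias1 = mean(theta1) - theta, bias2 = mean(theta2) - theta)
c(mse1 = mean((theta1 - theta)^2), mse2 = mean((theta2 - theta)^2))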
Let $X_1, X_2, \ldots, X_n$ be iid exponential random variables with unknown mean β. (b) Find the maximum likelihood estimator of β. (c) Determine whether the maximum likelihood estimator is unbiased for β. (d) Find the mean squared error of the maximum likelihood estimator of β. (e) Find the Cramér-Rao lower bound for the variances of unbiased estimators of β. (f) What is the UMVUE (uniformly minimum variance unbiased estimator) of β? What is your reason? (g) Determine the asymptotic distribution of the...
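A hedged sketch of the standard results for this model (writing the density as $f(x; \beta) = \beta^{-1} e^{-x/\beta}$, $x > 0$):
\[
\hat{\beta}_{\mathrm{MLE}} = \bar{X}, \qquad E[\bar{X}] = \beta \ \text{(unbiased)}, \qquad \operatorname{MSE}(\bar{X}) = \operatorname{Var}(\bar{X}) = \frac{\beta^2}{n} = \mathrm{CRLB},
\]
so $\bar{X}$ attains the Cramér-Rao bound and is the UMVUE; by the CLT, $\sqrt{n}(\bar{X} - \beta) \xrightarrow{d} N(0, \beta^2)$.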
4. Let $Y_1, Y_2, \ldots, Y_n$ be a random sample from some pdf/pmf $f(y; \theta)$. Let $W$ be a point estimator $h(Y_1, Y_2, \ldots, Y_n)$ for $\theta$. The bias of $W$ as a point estimator for $\theta$ is defined as $\operatorname{Bias}_\theta(W) = E_\theta(W) - \theta$. The mean square error of $W$ is defined as $\operatorname{MSE}_\theta(W) = E_\theta\big[(W - \theta)^2\big]$. (a) Using properties of expected values, and the definition of variance from PSTAT 120A/B, show that $\operatorname{MSE}_\theta(W) = \operatorname{Var}_\theta(W) + \big[\operatorname{Bias}_\theta(W)\big]^2$...
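A sketch of (a): add and subtract $E_\theta(W)$ inside the square; the cross term vanishes because $E_\theta[W - E_\theta(W)] = 0$:
\[
E_\theta\big[(W - \theta)^2\big] = E_\theta\big[\big(W - E_\theta(W) + E_\theta(W) - \theta\big)^2\big]
= E_\theta\big[(W - E_\theta(W))^2\big] + \big(E_\theta(W) - \theta\big)^2
= \operatorname{Var}_\theta(W) + \big[\operatorname{Bias}_\theta(W)\big]^2 .
\]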
10. Let $Y_1, \ldots, Y_n$ be a random sample from a distribution with pdf $f(y) = \ldots$ for $0 < y < \theta$, and $f(y) = 0$ elsewhere. a) Find $E(Y)$. b) Find the method of moments estimator for $\theta$. c) Let $\hat{\theta}$ denote the estimator from part b). Is it an unbiased estimator? Find the mean square error of $\hat{\theta}$. Show work.
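For b), the method of moments sets the first sample moment equal to the first population moment and solves for $\theta$:
\[
\bar{Y} = E_\theta(Y) \quad \Longrightarrow \quad \hat{\theta}_{\mathrm{MoM}} = \text{(the solution, expressed in terms of } \bar{Y}\text{)}.
\]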
$Y = f(X) + \epsilon$, $\epsilon \sim N(0, \sigma^2)$. Let $\hat{f}(X)$ be the estimate (or predicted model) of $f(X)$. The mean squared error (MSE) at a data point $x$ is
\[
E\big[(Y - \hat{f}(x))^2\big] = \big(E[\hat{f}(x)] - f(x)\big)^2 + E\big[(\hat{f}(x) - E[\hat{f}(x)])^2\big] + \sigma^2 = \text{Bias}^2 + \text{Variance} + \text{Irreducible Error}.
\]
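To make the decomposition concrete, a Monte Carlo illustration in R (everything here is made up for the demo: the truth $f(x) = \sin 2x$, $\sigma = 0.3$, and a deliberately misspecified linear fit playing the role of $\hat{f}$):

set.seed(1)
f <- function(x) sin(2 * x)
sigma <- 0.3; x0 <- 0.8; reps <- 5000
preds <- numeric(reps)
for (i in seq_len(reps)) {
  x <- runif(30, 0, 2)
  y <- f(x) + rnorm(30, 0, sigma)
  fit <- lm(y ~ x)                     # misspecified model: linear fit to a sine
  preds[i] <- predict(fit, data.frame(x = x0))
}
bias2    <- (mean(preds) - f(x0))^2
variance <- var(preds)
c(bias2 = bias2, variance = variance, irreducible = sigma^2,
  total = bias2 + variance + sigma^2)  # approximates E[(Y - fhat(x0))^2]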
PROBLEM 3. Let $X_1, X_2, \ldots, X_n$ be iid observations from a distribution with pdf given by $f(x \mid \theta) = \theta x^{\theta - 1}$, $0 < x < 1$, $0 < \theta < \infty$. a) Find the maximum likelihood estimator of $\theta$. b) Find the moment estimator of $\theta$. c) (Extra credit) Compare the mean squared error of the two estimators in (a) and (b). Which one is better? (5 points)
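A sketch for c) (the settings $\theta = 2$, $n = 20$ are made up; it assumes a) yields $\hat{\theta} = -n / \sum_i \ln X_i$ and b) yields $\tilde{\theta} = \bar{X}/(1 - \bar{X})$ from $E(X) = \theta/(\theta + 1)$; this pdf is the Beta($\theta$, 1) density, so rbeta() can sample it):

set.seed(1)
theta <- 2; n <- 20; reps <- 20000
mle <- mom <- numeric(reps)
for (i in seq_len(reps)) {
  x <- rbeta(n, theta, 1)              # samples from f(x|theta) = theta * x^(theta-1)
  mle[i] <- -n / sum(log(x))           # assumed MLE from a)
  mom[i] <- mean(x) / (1 - mean(x))    # assumed moment estimator from b)
}
c(mse_mle = mean((mle - theta)^2), mse_mom = mean((mom - theta)^2))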
Let Y and X be two random variables. Let $g(X)$ be any function of X used to predict Y. Finally, let the Minimum Mean Squared Error Prediction (MMSE) problem be defined as:
\[
\min_{g(X)} \; E\big[(Y - g(X))^2\big].
\]
Prove that $E(Y \mid X)$ is the solution to the MMSE problem, that is to say:
\[
E\big[(Y - E(Y \mid X))^2\big] \le E\big[(Y - g(X))^2\big].
\]
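A sketch of the standard argument: write $Y - g(X) = \big(Y - E(Y \mid X)\big) + \big(E(Y \mid X) - g(X)\big)$ and expand the square; the cross term is zero by the tower property, since conditional on X the factor $E(Y \mid X) - g(X)$ is a constant and $E\big[Y - E(Y \mid X) \,\big|\, X\big] = 0$. Hence
\[
E\big[(Y - g(X))^2\big] = E\big[(Y - E(Y \mid X))^2\big] + E\big[(E(Y \mid X) - g(X))^2\big] \ \ge\ E\big[(Y - E(Y \mid X))^2\big].
\]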