Problem 3. Consider extending the linear MMSE estimator to the case where our estimate of a random variable Y is based on ob...
Problem 1: Random variables Y1 and Y2 are uncorrelated. We want a linear minimum mean-square error (MMSE) non-homogeneous estimate X̂_L of the value of random variable X in terms of Y1 and Y2. The estimate has the form X̂_L = g(Y1, Y2) = aY1 + bY2 + c. Find the values of a, b, and c that minimize the expected squared error E[(X − (aY1 + bY2 + c))²]. Express your answer in terms of the means and variances of Y1 and...
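For Problem 1, a quick Monte Carlo sketch can confirm the closed-form coefficients (a = Cov(X, Y1)/Var(Y1), b = Cov(X, Y2)/Var(Y2), c = E[X] − aE[Y1] − bE[Y2], valid when Y1 and Y2 are uncorrelated) against an unconstrained least-squares fit. The distributions below are hypothetical, chosen only for this check:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical setup: Y1 and Y2 independent (hence uncorrelated),
# X correlated with both through a linear relation plus noise.
Y1 = rng.normal(2.0, 1.5, n)
Y2 = rng.normal(-1.0, 0.8, n)
X = 3.0 + 0.7 * Y1 - 1.2 * Y2 + rng.normal(0.0, 1.0, n)

# Closed-form linear MMSE coefficients for uncorrelated Y1, Y2.
a = np.cov(X, Y1)[0, 1] / np.var(Y1, ddof=1)
b = np.cov(X, Y2)[0, 1] / np.var(Y2, ddof=1)
c = X.mean() - a * Y1.mean() - b * Y2.mean()

# Compare with an unconstrained least-squares fit of X on [Y1, Y2, 1].
A = np.column_stack([Y1, Y2, np.ones(n)])
coef, *_ = np.linalg.lstsq(A, X, rcond=None)

print(np.round([a, b, c], 3), np.round(coef, 3))
```

The two sets of coefficients agree to Monte Carlo accuracy, as the linear MMSE estimator is exactly the best linear fit in mean square.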
3. Let y = (y1, ..., yn)′ be a set of responses, and consider the linear model y = μu + ε, where u = (1, ..., 1)′ and ε is a vector of zero-mean, uncorrelated errors with variance σ². This is a linear model in which the responses have a constant but unknown mean μ. We will call this model the location model. (a) If we write the location model in the usual form of the linear model y = Xβ + ...
1. Consider a variable y = θ + ε, where θ is an unknown parameter and ε is a random variable with mean zero. (a) What is the expected value of y? (b) Suppose you draw a sample y1, ..., yn. Derive the least squares estimator for θ. For full credit you must check the second-order condition. (c) Can this estimator (θ̂) be described as a method of moments estimator? (d) Now suppose ε is independently normally distributed with mean 0 and...
6.72 Let Y = X + N, where X and N are independent Gaussian random variables with different variances and N is zero mean. (a) Plot the correlation coefficient between the “observed signal” Y and the “desired signal” X as a function of the signal-to-noise ratio. (b) Find the minimum mean square error estimator for X given Y. (c) Find the MAP and ML estimators for X given Y. (d) Compare the mean square errors of the estimators in parts (a), (b), and (c).
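A numerical check for parts (a) and (b), assuming a zero-mean X for simplicity (the problem only requires N to be zero mean; the variances below are hypothetical). With SNR = σ_X²/σ_N², the correlation coefficient is ρ = √(SNR/(1+SNR)), and since X and Y are jointly Gaussian the MMSE estimator is the linear one, X̂ = (σ_X²/(σ_X²+σ_N²))Y, with MSE σ_X²σ_N²/(σ_X²+σ_N²):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500_000
sx, sn = 2.0, 1.0            # hypothetical std devs; SNR = sx**2 / sn**2
snr = sx**2 / sn**2

X = rng.normal(0.0, sx, n)   # zero-mean X assumed for this sketch
N = rng.normal(0.0, sn, n)
Y = X + N

# Part (a): rho(X, Y) = sx / sqrt(sx^2 + sn^2) = sqrt(SNR / (1 + SNR))
rho_formula = np.sqrt(snr / (1.0 + snr))
rho_sample = np.corrcoef(X, Y)[0, 1]

# Part (b): jointly Gaussian => MMSE estimator is linear in Y.
Xhat = (sx**2 / (sx**2 + sn**2)) * Y
mse = np.mean((X - Xhat)**2)
mse_formula = sx**2 * sn**2 / (sx**2 + sn**2)

print(round(rho_formula, 4), round(rho_sample, 4))
print(round(mse, 4), round(mse_formula, 4))
```

Sweeping `snr` over a grid and plotting `rho_formula` reproduces the curve asked for in part (a): it rises from 0 toward 1 as the SNR grows.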
Problem 2.1 Consider a process consisting of a linear trend with an additive noise term consisting of independent random variables Z_t with zero means and variances σ², that is, X_t = β0 + β1 t + Z_t, where β0, β1 are fixed constants. (a) Prove that X_t is non-stationary. (b) Prove that the first difference series ∇X_t = X_t − X_{t−1} is stationary by finding its mean and autocovariance function. (c) Repeat part (b) if Z_t is replaced by a general stationary process, say Y_t, with mean function μ_Y and autocovariance function...
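A simulation sketch of part (b), with hypothetical values of β0, β1, σ: differencing removes the trend, so ∇X_t has constant mean β1 and an autocovariance γ(0) = 2σ², γ(±1) = −σ², γ(h) = 0 for |h| ≥ 2 that does not depend on t, which is exactly what stationarity requires.

```python
import numpy as np

rng = np.random.default_rng(3)
b0, b1, sigma = 1.0, 0.5, 1.0    # hypothetical trend and noise scale
reps, n = 2000, 200

t = np.arange(n)
Z = rng.normal(0.0, sigma, (reps, n))
X = b0 + b1 * t + Z              # linear trend plus white noise

D = np.diff(X, axis=1)           # first difference X_t - X_{t-1}
Dc = D - D.mean()                # center before estimating covariances

# Trend drops out: E[grad X_t] = b1; gamma(0) = 2*sigma^2,
# gamma(1) = -sigma^2, gamma(h) = 0 for |h| >= 2.
print(round(D.mean(), 3))                         # close to b1
print(round(np.mean(Dc * Dc), 3))                 # close to 2 * sigma**2
print(round(np.mean(Dc[:, 1:] * Dc[:, :-1]), 3))  # close to -sigma**2
```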
Problem 3: Assume that 'nature' behaves according to the following linear additive model: Y = β0 + β1X + ε, where ε is a Gaussian random variable N(0, σ²). Using this model, nature generates the following training dataset: D = {(x_i, y_i)}_{i=1}^{5} = {(−2, −7/2), (−1, −3), (0, 0), (1, 3), (2, 7/2)}. Please answer the questions below without the help of any computer software: a. Compute the estimates of β0 and β1 for a linear estimator ŷ = β̂0 + β̂1x using the data...
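The problem asks for a hand calculation, but the arithmetic can be sanity-checked numerically. Note one assumption: the first response is read as −7/2 (the printed "47/2" appears to be a scanning artifact, given the symmetric pattern of the remaining points).

```python
import numpy as np

# Training data; the first response -7/2 is an assumed correction
# of the garbled "47/2" in the problem statement.
x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = np.array([-3.5, -3.0, 0.0, 3.0, 3.5])

# Textbook least-squares formulas:
#   b1 = sum((x - xbar)(y - ybar)) / sum((x - xbar)^2),  b0 = ybar - b1 * xbar
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean())**2)
b0 = y.mean() - b1 * x.mean()
print(b0, b1)   # 0.0 2.0
```

With both sample means equal to zero, Sxy = 20 and Sxx = 10, so the fitted line is ŷ = 2x.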
Let X be uniformly distributed in the unit interval [0, 1]. Consider the random variable Y = g(X), where g(x) = 1 if x ≤ 1/3 and g(x) = 2 if x > 1/3. (a) Compute the PMF of Y. (b) Compute the mean of Y using its PMF. (c) Compute the mean of Y by using the formula E[g(X)] = ∫ g(x) f_X(x) dx, where f_X is the PDF of X.
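A Monte Carlo check of parts (a) and (b). The first branch of g is garbled in the statement, so the value 1 on {x ≤ 1/3} is an assumption; under it, P(Y=1) = 1/3, P(Y=2) = 2/3, and E[Y] = 5/3.

```python
import numpy as np

# Assumed step function: g(x) = 1 for x <= 1/3, g(x) = 2 otherwise
# (the first branch is unreadable in the problem statement).
rng = np.random.default_rng(4)
X = rng.uniform(0.0, 1.0, 1_000_000)
Y = np.where(X <= 1/3, 1, 2)

# PMF: P(Y=1) = P(X <= 1/3) = 1/3, P(Y=2) = 2/3; hence E[Y] = 5/3.
p1 = np.mean(Y == 1)
print(round(p1, 3), round(Y.mean(), 3))
```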
(Do this problem without using R) Consider the simple linear regression model y = β0 + β1x + ε, where the errors are independent and normally distributed, with mean zero and constant variance σ². Suppose we observe 4 observations x = (1, 1, −1, −1) and y = (5, 3, 4, 0). (a) Fit the simple linear regression model to this data and report the fitted regression line. (b) Carry out a test of hypotheses using α = 0.05 to determine...
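The problem is meant to be worked by hand, but the arithmetic can be verified numerically:

```python
import numpy as np
from math import sqrt

# Quick numerical check (the problem asks for a hand calculation).
x = np.array([1.0, 1.0, -1.0, -1.0])
y = np.array([5.0, 3.0, 4.0, 0.0])
n = len(x)

Sxx = np.sum((x - x.mean())**2)                 # = 4
b1 = np.sum((x - x.mean()) * (y - y.mean())) / Sxx
b0 = y.mean() - b1 * x.mean()                   # fitted line: yhat = 3 + x

resid = y - (b0 + b1 * x)
s2 = np.sum(resid**2) / (n - 2)                 # sigma^2 hat = SSE / (n - 2)
t_stat = b1 / sqrt(s2 / Sxx)                    # t statistic for H0: beta1 = 0

print(b0, b1, round(t_stat, 3))
```

The fitted line is ŷ = 3 + x, and the t statistic (≈0.894) is far below the critical value t_{0.025, 2} ≈ 4.303, so at α = 0.05 the slope is not statistically significant.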