6.72 Let Y = X + N where X and N are independent Gaussian random variables with different variances and N is zero mean.
(a) Plot the correlation coefficient between the “observed signal” Y and the “desired signal” X as a function of the signal-to-noise ratio.
(b) Find the minimum mean square error estimator for X given Y.
(c) Find the MAP and ML estimators for X given Y.
(d) Compare the mean square error of the estimators in parts a, b, and c.
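The estimators in this problem can be checked numerically. A minimal Python sketch, assuming for concreteness that σ_X² = 2 and σ_N² = 1 (the problem leaves the variances unspecified):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
var_x, var_n = 2.0, 1.0          # assumed variances (not specified in the problem)

x = rng.normal(0.0, np.sqrt(var_x), n)     # desired signal X
noise = rng.normal(0.0, np.sqrt(var_n), n) # noise N
y = x + noise                              # observed signal Y

# (a) correlation coefficient: rho = sigma_X / sqrt(sigma_X^2 + sigma_N^2)
rho_emp = np.corrcoef(x, y)[0, 1]
rho_theory = np.sqrt(var_x / (var_x + var_n))

# (b) MMSE estimator: E[X|Y] = (var_x / (var_x + var_n)) * Y
#     (also the MAP estimator, since the posterior is Gaussian)
x_mmse = var_x / (var_x + var_n) * y
# (c) ML estimator maximizes f(y|x) = N(x, var_n) over x, giving x_hat = y
x_ml = y

# (d) mean square errors: theory gives var_x*(1 - rho^2) = 2/3 and var_n = 1
mse_mmse = np.mean((x - x_mmse) ** 2)
mse_ml = np.mean((x - x_ml) ** 2)
```

Here MMSE and MAP coincide, while the ML estimate ignores the prior on X and pays the larger error σ_N².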
Section 6.5: Mean Square Estimation

6.68 Let X and Y be discrete random variables with three possible joint pmfs:

Let X and Y have joint pdf f_X,Y(x, y) = k(x + y) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1.
(a) Find the minimum mean square error linear estimator for Y given X.
(b) Find the minimum mean square error estimator for Y given X.
(c) Find the MAP and ML estimators for Y given X.
(d) Compare the mean square error of the estimators in parts a, b, and c.
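For the continuous version of this problem, normalization gives k = 1, the conditional mean is E[Y | X = x] = (x/2 + 1/3)/(x + 1/2), and the best linear estimator works out to slope −1/11 and intercept 7/11. A Monte Carlo sketch of those results (the rejection-sampling scheme is my own choice, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(1)

# Rejection sampling from f(x,y) = x + y on the unit square (max value is 2)
n = 400_000
xs = rng.uniform(0, 1, n)
ys = rng.uniform(0, 1, n)
u = rng.uniform(0, 2, n)
keep = u < xs + ys
x, y = xs[keep], ys[keep]

# Best linear estimator a*x + b; with k = 1 the moments give a = -1/11, b = 7/11
a = np.cov(x, y)[0, 1] / np.var(x)
b = y.mean() - a * x.mean()
y_lin = a * x + b

# Conditional-mean (MMSE) estimator E[Y|X=x] = (x/2 + 1/3)/(x + 1/2)
y_mmse = (x / 2 + 1/3) / (x + 1/2)

mse_lin = np.mean((y - y_lin) ** 2)    # theory: (11/144)*(120/121)
mse_mmse = np.mean((y - y_mmse) ** 2)  # slightly smaller than mse_lin
```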
Let X and Y be zero-mean, unit-variance Gaussian random variables with correlation coefficient ρ. Suppose we form two new random variables using the linear transformations V = aX + bY and W = cX + dY. Find constraints on the constants a, b, c, and d such that V and W are independent.
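Since V and W are jointly Gaussian, independence is equivalent to Cov(V, W) = ac + bd + ρ(ad + bc) = 0. A quick numerical check, with an assumed ρ = 0.5 and one coefficient choice that satisfies the constraint:

```python
import numpy as np

rng = np.random.default_rng(2)
rho = 0.5                        # assumed correlation for the check
n = 300_000

# Correlated unit-variance Gaussians
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# Independence (= uncorrelatedness for jointly Gaussian variables) requires
#   Cov(V, W) = a*c + b*d + rho*(a*d + b*c) = 0
a, b, c, d = 1.0, 0.0, 1.0, -2.0  # satisfies 1*1 + 0 + 0.5*(-2 + 0) = 0
v = a * x + b * y
w = c * x + d * y

corr_vw = np.corrcoef(v, w)[0, 1]  # should be near zero
```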
Let X and Y be jointly Gaussian, zero-mean random variables with common variance σ² and correlation coefficient ρ ≠ 0. Find the variance of Z = X² + XY.
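By Isserlis' theorem, Var(X²) = 2σ⁴, Var(XY) = σ⁴(1 + ρ²), and Cov(X², XY) = 2ρσ⁴, so the answer works out to Var(Z) = σ⁴(3 + 4ρ + ρ²). A Monte Carlo sanity check with assumed values σ = 1.5 and ρ = 0.4:

```python
import numpy as np

rng = np.random.default_rng(3)
sigma, rho = 1.5, 0.4            # assumed values for the check
n = 2_000_000

# Jointly Gaussian, zero-mean, common variance sigma^2, correlation rho
x = sigma * rng.standard_normal(n)
y = rho * x + sigma * np.sqrt(1 - rho**2) * rng.standard_normal(n)

z = x**2 + x * y
var_emp = z.var()

# Var(Z) = Var(X^2) + Var(XY) + 2*Cov(X^2, XY) = sigma^4 (3 + 4 rho + rho^2)
var_theory = sigma**4 * (3 + 4 * rho + rho**2)
```

A quick consistency check: at ρ = 1, Z = 2X² and the formula gives 8σ⁴ = Var(2X²), as expected.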
Let X1 and X2 be independent random variables with mean μ and variance σ². Suppose that we have two estimators of μ:
Θ̂1 = (X1 + X2)/2 and Θ̂2 = (X1 + 3X2)/2
(a) Are both estimators unbiased estimators of μ? (b) What is the variance of each estimator?
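As transcribed, Θ̂2 = (X1 + 3X2)/2 has expectation 2μ, so only Θ̂1 is unbiased; the variances are σ²/2 and 10σ²/4. A simulation sketch, assuming Gaussian samples with μ = 3 and σ = 2 (the problem fixes neither a distribution nor these values):

```python
import numpy as np

rng = np.random.default_rng(4)
mu, sigma = 3.0, 2.0             # assumed values for the check
trials = 500_000

x1 = rng.normal(mu, sigma, trials)
x2 = rng.normal(mu, sigma, trials)

t1 = (x1 + x2) / 2               # E = mu,  Var = sigma^2 / 2
t2 = (x1 + 3 * x2) / 2           # E = 2*mu (biased!), Var = 10 sigma^2 / 4

mean1, var1 = t1.mean(), t1.var()
mean2, var2 = t2.mean(), t2.var()
```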
Let X and Y be two independent Gaussian random variables with common variance σ². The mean of X is m and Y is a zero-mean random variable. We define the random variable V as V = √(X² + Y²). Show that:

f_V(v) = (v/σ²) exp(−(v² + m²)/(2σ²)) I₀(mv/σ²),  v ≥ 0

where I₀(x) = (1/2π) ∫₀^{2π} e^{x cos u} du is called the modified Bessel function of the first kind and zero order. The distribution of V is known as the Ricean distribution. Show that, in the special case of m = 0, the Ricean distribution simplifies to the Rayleigh distribution.
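The claimed density can be checked against a histogram; numpy's np.i0 provides I₀. A sketch with assumed values m = 1, σ = 1:

```python
import numpy as np

rng = np.random.default_rng(5)
m, sigma = 1.0, 1.0              # assumed values for the check
n = 1_000_000

x = rng.normal(m, sigma, n)
y = rng.normal(0.0, sigma, n)
v = np.sqrt(x**2 + y**2)

# Empirical density of V over (0, 4)
hist, edges = np.histogram(v, bins=80, range=(0.0, 4.0), density=True)
centers = (edges[:-1] + edges[1:]) / 2

# Ricean density f_V(v) = (v/sigma^2) exp(-(v^2+m^2)/(2 sigma^2)) I0(m v / sigma^2);
# with m = 0 the I0 factor is 1 and this reduces to the Rayleigh density
pdf = (centers / sigma**2
       * np.exp(-(centers**2 + m**2) / (2 * sigma**2))
       * np.i0(m * centers / sigma**2))
```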
Let X and Y be independent Gaussian(0,1) random variables. Define the random variables R and Θ by R² = X² + Y² and Θ = tan⁻¹(Y/X). You can think of X and Y as the real and imaginary parts of a signal. Similarly, R² is its power, Θ is the phase, and R is the magnitude of that signal. (a) Find the joint probability density function of R and Θ, i.e., f_R,Θ(r, θ).
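The expected answer is f_R,Θ(r, θ) = (r/2π) e^(−r²/2): R and Θ are independent, with R Rayleigh and Θ uniform. A simulation check (np.arctan2 is used here to recover the full (−π, π] phase rather than the principal value of tan⁻¹):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 500_000
x = rng.standard_normal(n)
y = rng.standard_normal(n)

r = np.sqrt(x**2 + y**2)
theta = np.arctan2(y, x)         # phase in (-pi, pi]

# R^2 = X^2 + Y^2 is chi-square with 2 dof, i.e. Exponential with mean 2;
# Theta is uniform on (-pi, pi] (variance pi^2/3) and independent of R
mean_r2 = np.mean(r**2)
var_theta = np.var(theta)
corr_r_theta = np.corrcoef(r, theta)[0, 1]
```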
Suppose that X is a Gaussian random variable with zero mean and unit variance. Let Y = aX³ + b, a > 0. Determine and plot the PDF of Y.
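Since a > 0, Y = aX³ + b is strictly increasing in X, so F_Y(y) = Φ(signed cube root of (y − b)/a) and the pdf follows by differentiation (it diverges at y = b). A check of the CDF at a few points, with assumed constants a = 2, b = 1:

```python
import numpy as np
from math import erf

rng = np.random.default_rng(7)
a, b = 2.0, 1.0                  # assumed constants (a > 0)
n = 500_000

x = rng.standard_normal(n)
ysamp = a * x**3 + b

# F_Y(y) = Phi(sign(u) |u|^(1/3)) with u = (y - b)/a; differentiating gives
# f_Y(y) = phi(x(y)) / (3a |u|^(2/3)), which blows up at y = b
ypts = np.array([-3.0, 0.0, 1.0, 2.0, 5.0])
u = (ypts - b) / a
roots = np.sign(u) * np.abs(u) ** (1 / 3)
F_theory = np.array([0.5 * (1 + erf(t / np.sqrt(2))) for t in roots])
F_emp = np.array([(ysamp <= yy).mean() for yy in ypts])
```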
7. Let X and Y be independent Gaussian random variables with identical densities N(0,1). Compute the conditional density of X given that the sum Z = X + Y is known (i.e., X | X + Y).
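The standard result is X | (X + Y = z) ~ N(z/2, 1/2), obtained from f_{X|Z}(x|z) = f_X(x) f_Y(z − x) / f_Z(z) with Z ~ N(0, 2). A simulation check that conditions on a thin slice around a fixed z:

```python
import numpy as np

rng = np.random.default_rng(8)
n = 3_000_000
x = rng.standard_normal(n)
y = rng.standard_normal(n)
z = x + y

# Condition on Z close to z0 by keeping samples in a thin slice;
# theory predicts conditional mean z0/2 and conditional variance 1/2
z0, eps = 1.0, 0.05
sel = np.abs(z - z0) < eps
cond_mean = x[sel].mean()
cond_var = x[sel].var()
```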
Problem 3 Consider the linear MMSE estimator for the case where our estimate of a random variable Y is based on observations of multiple random variables, say X1, X2, …, XN. Then our linear MMSE estimator can be written in the following form:

Ŷ = a1X1 + a2X2 + … + aN XN

(a) Show that the optimal values of a1, a2, …, aN for the linear MMSE estimator are given by a = C_XX⁻¹ c_XY, where a = [a1, …, aN]ᵀ, C_XX is the covariance matrix of X1, X2, …, XN, and c_XY is the cross-correlation vector defined as c_XY = [E[X1Y], …, E[XN Y]]ᵀ. (b) ...
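The normal-equations solution a = C_XX⁻¹ c_XY can be verified against ordinary least squares, which solves the same problem for zero-mean data. A sketch with synthetic correlated data (the generation scheme below is my own, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(9)
n, N = 200_000, 3                # samples, number of observed variables

# Build zero-mean observations X_1..X_N correlated with a target Y
g = rng.standard_normal((n, N + 1))
x_obs = g[:, :N] + 0.5 * g[:, [N]]  # shared component induces correlation
y = g[:, N] + 0.2 * g[:, 0]

# Normal equations: a = C_XX^{-1} c_XY
C_xx = (x_obs.T @ x_obs) / n     # covariance matrix of the X's (zero mean)
c_xy = (x_obs.T @ y) / n         # cross-correlation vector E[X_k Y]
a_opt = np.linalg.solve(C_xx, c_xy)

# Sanity check: least squares gives (X^T X)^{-1} X^T y, the same optimum
a_ls, *_ = np.linalg.lstsq(x_obs, y, rcond=None)
```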
Implement the following problems in MATLAB and document your work using the report guidelines on the next page.
1. Use the subplot function to place the relevant graphical results in a single figure. Generate a sequence of 1000 pairs of independent zero-mean Gaussian random variables, where X has variance 2 and N has variance 1. Let Y = X + N be the equation for a noisy signal. Plot X, N, and Y in a 3D scatter plot using Y...
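The assignment calls for MATLAB; as a cross-check, here is the data-generation step sketched in Python/numpy, with the equivalent MATLAB calls noted in comments (the figure itself is left to MATLAB's subplot and scatter3, as the assignment requires):

```python
import numpy as np

rng = np.random.default_rng(10)
n = 1000

x = rng.normal(0.0, np.sqrt(2.0), n)  # X: zero mean, variance 2
noise = rng.normal(0.0, 1.0, n)       # N: zero mean, variance 1
y = x + noise                         # noisy signal Y = X + N

# Equivalent MATLAB generation:
#   X = sqrt(2)*randn(1000,1); N = randn(1000,1); Y = X + N;
# followed by subplot(...) and scatter3(X, N, Y) for the single figure
```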