This problem involves the use of crosscorrelation to detect a signal in noise and to estimate the time delay of the signal. A signal x(n) consists of a pulsed sinusoid corrupted by a stationary zero-mean white noise sequence. That is,

$$x(n) = y(n - n_0) + w(n), \qquad 0 \le n \le N - 1$$

where $w(n)$ is the noise, with variance $\sigma_w^2$, and the signal is

$$y(n) = \begin{cases} A \cos \omega_0 n, & 0 \le n \le M - 1 \\ 0, & \text{otherwise} \end{cases}$$
The frequency $\omega_0$ is known, but the delay $n_0$, which is a positive integer, is unknown and is to be determined by crosscorrelating $x(n)$ with $y(n)$. Assume that $N > M + n_0$.
Let

$$r_{xy}(m) = \sum_{n=0}^{N-1} x(n)\, y(n - m)$$

denote the crosscorrelation sequence between $x(n)$ and $y(n)$. In the absence of noise this function exhibits a peak at delay $m = n_0$, so $n_0$ is determined without error. The presence of noise can lead to errors in determining the unknown delay.
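The detection procedure above can be sketched numerically. The parameter values below ($A$, $\omega_0$, $M$, $n_0$, $N$, $\sigma_w$) are illustrative assumptions, not given in the problem; the sketch builds the pulsed sinusoid, delays it, adds white noise, and locates the peak of $r_{xy}(m)$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed, not from the problem statement):
A, omega0 = 1.0, 0.4 * np.pi    # amplitude and known frequency
M, n0, N = 64, 20, 128          # pulse length, true delay, record length (N > M + n0)
sigma_w = 0.5                   # noise standard deviation

# Pulsed sinusoid y(n): A*cos(omega0*n) for 0 <= n <= M-1, zero elsewhere.
y = np.zeros(N)
y[:M] = A * np.cos(omega0 * np.arange(M))

# Received signal x(n) = y(n - n0) + w(n).
x = np.roll(y, n0) + sigma_w * rng.standard_normal(N)

# Crosscorrelation r_xy(m) = sum_n x(n) y(n - m), evaluated for m = 0..N-M.
lags = np.arange(N - M + 1)
rxy = np.array([np.dot(x[m:m + M], y[:M]) for m in lags])

# The delay estimate is the lag at which r_xy(m) peaks.
n0_hat = lags[np.argmax(rxy)]
print(n0_hat)  # should recover the true delay n0
```

At this noise level the peak value of $r_{xy}(n_0)$, roughly $A^2 M / 2$, stands well above the fluctuations, so the estimate is reliable; increasing $\sigma_w$ makes detection errors more likely, which is exactly the effect parts (a) and (b) quantify.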
(a) For $m = n_0$, determine $E[r_{xy}(n_0)]$. Also determine the variance, $\operatorname{var}[r_{xy}(n_0)]$, due to the presence of the noise. In both calculations, assume that the double-frequency term averages to zero; that is, $M \gg 2\pi/\omega_0$.
(b) Determine the signal-to-noise ratio, defined as

$$\mathrm{SNR} = \frac{\{E[r_{xy}(n_0)]\}^2}{\operatorname{var}[r_{xy}(n_0)]}$$
(c) What is the effect of the pulse duration M on the SNR?
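The dependence on $M$ can be checked empirically. Under the same assumed illustrative parameters as above (not given in the problem), the Monte Carlo sketch below estimates the mean and variance of $r_{xy}(n_0)$ over many noise realizations for two pulse lengths and compares the resulting SNRs:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters (assumed): unit amplitude, unit-variance noise.
A, sigma_w, trials = 1.0, 1.0, 20000

def snr_at_peak(M, omega0=0.4 * np.pi):
    # Pulse samples y(0), ..., y(M-1).
    y = A * np.cos(omega0 * np.arange(M))
    # At m = n0 the correlation reduces to sum_n [y(n) + w(n)] y(n)
    # over the pulse support; draw many independent noise realizations.
    w = sigma_w * rng.standard_normal((trials, M))
    r = (y + w) @ y                     # r_xy(n0) for each trial
    return r.mean() ** 2 / r.var()      # SNR = {E[r]}^2 / var[r]

s64, s256 = snr_at_peak(64), snr_at_peak(256)
print(s256 / s64)  # quadrupling M roughly quadruples the SNR
```

The ratio comes out close to 4 when $M$ goes from 64 to 256, consistent with an SNR that grows linearly with the pulse duration $M$.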