Let X be discrete with probability mass function p(1) = 0.05, p(2) = 0.05, p(3) = 0.1, p(4) = 0.1, p(5) = 0.6, and p(6) = 0.1, and for i = 1, 2, …, 6, let q(i) = p(1) + p(2) + … + p(i). Convince yourself that the following algorithm is precisely the discrete inverse-transform method with a simple left-to-right search:
1. Generate U ~ U(0, 1) and set i = 1.
2. If U ≤ q(i), return X = i. Otherwise, go to step 3.
3. Replace i by i + 1 and go back to step 2.
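Steps 1–3 can be sketched as follows; this is an illustrative Python rendering (the function name and the optional u argument, added so the search can be exercised with fixed values, are not part of the problem statement):

```python
import random

# p(i) for i = 1, ..., 6 and the cumulative probabilities q(i) = p(1) + ... + p(i)
p = [0.05, 0.05, 0.1, 0.1, 0.6, 0.1]
q = [sum(p[:i + 1]) for i in range(6)]  # approximately [0.05, 0.1, 0.2, 0.3, 0.9, 1.0]

def sample_x(u=None):
    """Left-to-right inverse-transform search: return the smallest i with U <= q(i)."""
    if u is None:
        u = random.random()  # step 1: U ~ U(0, 1)
    i = 0
    while u > q[i]:          # step 2 fails, so ...
        i += 1               # ... step 3: advance i and try again
    return i + 1             # X takes the values 1, ..., 6
```

For example, any u in (0.3, 0.9] falls in the fifth interval, so sample_x returns 5 with probability 0.6.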
FIGURE 8.18 Representation of the two algorithms in Prob. 8.2.
Let N be the number of times step 2 is executed (so that N is also the number of comparisons). Show that N has the same distribution as X, so E(N) = E(X) = 4.45. This algorithm can be represented as in Fig. 8.18a, where the circled numbers are the values to which X is set if U falls in the interval directly below them and the search is left-to-right.
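As a quick arithmetic check (not a substitute for the requested proof): since N = i exactly when U falls in the ith interval, an event of probability p(i), N has the same distribution as X, and E(N) can be computed directly:

```python
# probabilities of X, indexed in the original order 1, ..., 6
p = {1: 0.05, 2: 0.05, 3: 0.1, 4: 0.1, 5: 0.6, 6: 0.1}

# N = i with probability p(i), so E(N) = E(X) = sum over i of i * p(i)
e_n = sum(i * prob for i, prob in p.items())
assert abs(e_n - 4.45) < 1e-9  # E(N) = E(X) = 4.45, up to floating-point rounding
```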
Alternatively, we could first sort the p(i)’s into decreasing order and form a coding vector iʹ(i), as follows. Let qʹ(1) = 0.6, qʹ(2) = 0.7, qʹ(3) = 0.8, qʹ(4) = 0.9, qʹ(5) = 0.95, and qʹ(6) = 1; also let iʹ(1) = 5, iʹ(2) = 3, iʹ(3) = 4, iʹ(4) = 6, iʹ(5) = 1, and iʹ(6) = 2. Show that the following algorithm is valid:
1ʹ. Generate U ~ U(0, 1) and set i = 1.
2ʹ. If U ≤ qʹ(i), return X = iʹ(i). Otherwise, go to step 3ʹ.
3ʹ. Replace i by i + 1 and go back to step 2ʹ.
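Steps 1ʹ–3ʹ are the same search over the sorted cumulative probabilities, decoded through iʹ(i). A sketch under the same conventions as before (the function name and the optional u argument are illustrative additions):

```python
import random

# cumulative probabilities q'(i) of the p(i)'s sorted into decreasing order,
# and the coding vector i'(i) mapping search position back to the value of X
q_prime = [0.6, 0.7, 0.8, 0.9, 0.95, 1.0]
i_prime = [5, 3, 4, 6, 1, 2]

def sample_x_sorted(u=None):
    """Left-to-right search over q'(i); return X = i'(i) for the first i with U <= q'(i)."""
    if u is None:
        u = random.random()   # step 1': U ~ U(0, 1)
    i = 0
    while u > q_prime[i]:     # step 2' fails, so ...
        i += 1                # ... step 3': advance i
    return i_prime[i]         # decode via the coding vector
```

Validity can be seen interval by interval: for instance, U ≤ 0.6 occurs with probability 0.6 and returns X = iʹ(1) = 5 = the value with p(5) = 0.6.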
If Nʹ is the number of comparisons for this second algorithm, show that E(Nʹ) = 2.05, which is less than half of E(N). This saving in marginal execution time will depend on the particular distribution and must be weighed against the extra setup time and storage for the coding vector iʹ(i). This second algorithm can be represented as in Fig. 8.18b.
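A quick arithmetic check of E(Nʹ) = 2.05 (again, not a substitute for the requested proof): Nʹ = i with probability pʹ(i) = p(iʹ(i)), the ith largest of the original probabilities.

```python
# the p(i)'s sorted into decreasing order: p'(i) = p(i'(i))
p_sorted = [0.6, 0.1, 0.1, 0.1, 0.05, 0.05]

# N' = i with probability p'(i), so E(N') = sum over i of i * p'(i)
e_n_prime = sum((i + 1) * prob for i, prob in enumerate(p_sorted))
assert abs(e_n_prime - 2.05) < 1e-9  # E(N') = 2.05 < E(N)/2 = 2.225
```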