Given two strings x = x1x2 ∙ ∙ ∙ xn and y = y1y2 ∙ ∙ ∙ ym, we wish to find the length of their longest common substring, that is, the largest k for which there are indices i and j with xi xi+1 ∙ ∙ ∙ xi+k-1 = yj yj+1 ∙ ∙ ∙ yj+k-1. Show how to do this in time O(mn).
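One way to meet the O(mn) bound is dynamic programming on common suffixes: let L(i, j) be the length of the longest common suffix of x1 ∙ ∙ ∙ xi and y1 ∙ ∙ ∙ yj. Then L(i, j) = L(i-1, j-1) + 1 when xi = yj, and 0 otherwise; the answer is the maximum entry of the table. A minimal sketch (the function name and test strings are illustrative, not from the exercise):

```python
def longest_common_substring(x, y):
    """Return the length of the longest common substring of x and y in O(mn)."""
    n, m = len(x), len(y)
    # L[i][j] = length of the longest common suffix of x[:i] and y[:j]
    L = [[0] * (m + 1) for _ in range(n + 1)]
    best = 0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if x[i - 1] == y[j - 1]:
                # Extend the common suffix ending at the previous characters
                L[i][j] = L[i - 1][j - 1] + 1
                best = max(best, L[i][j])
            # else: L[i][j] stays 0 (a mismatch breaks the suffix)
    return best

print(longest_common_substring("xabcay", "zabcy"))  # → 3 (the substring "abc")
```

Each of the mn table cells is filled in O(1) time, giving O(mn) overall; the space can be reduced to O(m) by keeping only the previous row.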