Let X and Y be strings of length n and m, respectively. Define B(j,k) to be the length of the longest common substring of the suffix X[n−j..n−1] and the suffix Y[m−k..m−1]. Design an O(nm)-time algorithm for computing all the values of B(j,k) for j = 1, …, n and k = 1, …, m.
Sorry, the answer is not available at the moment…
If you are able to find the answer, please make sure to post it here, so that your juniors have a smile on their lips and feel happy.
Spread the 'tradition of sharing'.
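In that spirit, here is one possible O(nm) dynamic-programming sketch (an unofficial attempt, not a posted solution; the function name `all_B` and the helper table `P` are my own naming). The idea: let P(j,k) be the length of the longest common *prefix* of the two suffixes, which satisfies P(j,k) = 1 + P(j−1,k−1) when X[n−j] = Y[m−k] and 0 otherwise. A longest common substring of the two suffixes either starts at the first character of both (giving P(j,k)) or lies entirely inside one shorter suffix, so B(j,k) = max(P(j,k), B(j−1,k), B(j,k−1)). Filling both tables row by row takes O(nm) time.

```python
def all_B(X, Y):
    """Return (n+1) x (m+1) table B where B[j][k] is the length of the
    longest common substring of X[n-j:] and Y[m-k:]."""
    n, m = len(X), len(Y)
    # P[j][k]: longest common prefix of the suffixes X[n-j:] and Y[m-k:]
    P = [[0] * (m + 1) for _ in range(n + 1)]
    B = [[0] * (m + 1) for _ in range(n + 1)]
    for j in range(1, n + 1):
        for k in range(1, m + 1):
            # Extend the common prefix when the first characters match.
            if X[n - j] == Y[m - k]:
                P[j][k] = P[j - 1][k - 1] + 1
            # Best substring: starts at the front of both suffixes,
            # or avoids the first character of one of them.
            B[j][k] = max(P[j][k], B[j - 1][k], B[j][k - 1])
    return B

# Example: suffixes of length 4 of "xaby" and "zabw" share "ab".
# all_B("xaby", "zabw")[4][4] == 2
```

Two nested loops over an O(nm)-size table with O(1) work per cell gives the required bound. If you spot a flaw, please post a correction here.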