This question concerns the case of two input sequences. Prove that every algorithm that outputs all longest common subsequences of two input sequences has a worst-case running time that is exponential. To do so, show how to define, for every positive integer n, two length-n sequences Xn, Yn with at least c^n different longest common subsequences, where c > 1 is a constant. You are allowed to use an alphabet of size n, i.e., the symbols in Xn, Yn can come from a set {a1, a2, ..., an}.
The longest common subsequence (LCS) problem is the problem of finding the longest sequence that appears as a subsequence of both given sequences, i.e., with its symbols in the same relative order in each.
We must show that for every positive integer n there are two length-n sequences Xn, Yn with at least c^n different longest common subsequences, where c > 1 is a constant; any algorithm that outputs all LCSs of such a pair must then take exponential time in the worst case.
A common subsequence is obtained by deleting some items from the first sequence and some items from the second; the remaining symbols do not need to occupy consecutive positions within the original sequences.
For example, consider the two sequences
X : ABCBDAB
Y : BDCABA
The length of the LCS is 4.
The LCSs are BDAB, BCAB, and BCBA.
A naive way of solving this problem is to check whether every subsequence of X is also a subsequence of Y.
Since X has 2^m subsequences (where m is the length of X), and each check against Y takes O(n) time, this brute-force solution takes O(n * 2^m) time.
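As a sketch of both points — the brute-force check just described and the exponential number of LCSs claimed at the start — the following Python snippet enumerates the subsequences of X, longest first, and keeps those that also occur in Y. The pair-swap construction it is applied to (Xn = a1 a2 a3 a4 ... and Yn with each adjacent pair of symbols swapped) is one standard choice, assumed here for illustration; the question does not prescribe a particular pair of sequences.

```python
from itertools import combinations

def is_subsequence(s, t):
    """True if s appears in t in the same relative order."""
    it = iter(t)
    return all(c in it for c in s)

def all_lcs(x, y):
    """All distinct longest common subsequences of x and y.

    Brute force: try every subsequence of x, longest first; the first
    length at which some candidate is also a subsequence of y is the
    LCS length. Only usable for very small inputs (2^m candidates).
    """
    for k in range(len(x), -1, -1):
        found = set()
        for idx in combinations(range(len(x)), k):
            cand = tuple(x[i] for i in idx)
            if is_subsequence(cand, y):
                found.add(cand)
        if found:                      # k is the LCS length, stop here
            return found
    return {()}

# Pair-swap construction (assumed for illustration):
#   Xn = a1 a2 a3 a4 ... a(n-1) an
#   Yn = a2 a1 a4 a3 ... an a(n-1)
for n in (2, 4, 6, 8, 10):
    x = list(range(1, n + 1))                             # a1 .. an encoded as 1 .. n
    y = [s for i in range(1, n, 2) for s in (i + 1, i)]   # a2 a1 a4 a3 ...
    lcss = all_lcs(x, y)
    print(f"n = {n:2d}: LCS length = {len(next(iter(lcss)))}, "
          f"distinct LCSs = {len(lcss)}, 2^(n/2) = {2 ** (n // 2)}")
```

For this pair no common subsequence can contain both symbols of a pair {a(2i-1), a(2i)}, because they appear in opposite orders in Xn and Yn, while picking exactly one symbol from each pair always gives a common subsequence. So the LCS length is n/2 and there are exactly 2^(n/2) = (sqrt 2)^n distinct LCSs, which is what the small-n output above verifies; any algorithm that outputs them all must write down at least that many sequences, so c = sqrt 2 works.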
The LCS problem can be solved efficiently by dynamic programming because it has two properties:
Optimal substructure
Overlapping subproblems
Optimal substructure
The problem is broken into smaller subproblems, which can in turn be broken into still smaller subproblems.
Consider prefixes X[1..i] and Y[1..j] of the two sequences. If they end in the same symbol, then an LCS of X[1..i] and Y[1..j] is an LCS of X[1..i-1] and Y[1..j-1] followed by that symbol.
Otherwise, it is the longer of an LCS of X[1..i-1], Y[1..j] and an LCS of X[1..i], Y[1..j-1].
Evaluated naively as a recursion, this has worst-case time complexity O(2^(m+n)).
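A minimal Python sketch of this recurrence (the function name lcs_len is mine), with no caching, so it shows exactly this exponential blow-up:

```python
def lcs_len(x, y, i=None, j=None):
    """Length of an LCS of x[:i] and y[:j], by the plain recurrence (no memoization)."""
    if i is None:
        i, j = len(x), len(y)
    if i == 0 or j == 0:                 # an empty prefix has an empty LCS
        return 0
    if x[i - 1] == y[j - 1]:             # last symbols match: keep them
        return 1 + lcs_len(x, y, i - 1, j - 1)
    # last symbols differ: drop one of them and take the better option
    return max(lcs_len(x, y, i - 1, j), lcs_len(x, y, i, j - 1))

print(lcs_len("ABCBDAB", "BDCABA"))      # 4, matching the example above
```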
Overlapping subproblems
A problem is said to have overlapping subproblems if a recursive algorithm for it solves the same subproblems over and over rather than always generating new subproblems.
The problem can be solved with a top-down or a bottom-up approach.
In the top-down approach a recursion tree is built for the two sequences; while forming this tree we can see that the same subproblems are solved again and again.
A problem having both optimal substructure and overlapping subproblems can be solved by dynamic programming, in which the subproblem solutions are memoized rather than computed again and again.
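A top-down sketch of the same recurrence with memoization, here via functools.lru_cache (the helper names are mine); every (i, j) subproblem is now solved only once:

```python
from functools import lru_cache

def lcs_len_memo(x, y):
    """Top-down LCS length: same recurrence, but each (i, j) subproblem is solved once."""
    @lru_cache(maxsize=None)
    def solve(i, j):
        if i == 0 or j == 0:
            return 0
        if x[i - 1] == y[j - 1]:
            return 1 + solve(i - 1, j - 1)
        return max(solve(i - 1, j), solve(i, j - 1))
    return solve(len(x), len(y))

print(lcs_len_memo("ABCBDAB", "BDCABA"))   # 4, now in O(m*n) time
```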
In the bottom-up approach the smaller subproblems are solved first, and the larger ones are then built from them.
In both of these approaches the time complexity is O(mn), i.e., O(n^2) when both sequences have length n.
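A bottom-up sketch that fills an (m+1) x (n+1) table, making the quadratic bound explicit:

```python
def lcs_bottom_up(x, y):
    """Bottom-up LCS: dp[i][j] = LCS length of x[:i] and y[:j]."""
    m, n = len(x), len(y)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]

print(lcs_bottom_up("ABCBDAB", "BDCABA"))   # 4
```

Note that O(mn) is the cost of computing the length of an LCS (or of recovering a single LCS by tracing back through the table). Outputting all longest common subsequences is a different matter: by the construction sketched earlier there can be (sqrt 2)^n of them, so any algorithm that lists them all needs exponential worst-case time, which is exactly the statement the question asks us to prove.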