In: Computer Science
In computer science, when is the right time to find the Upper bound, Lower bound, and Tight bound? And what does Tight Bound show us?
Upper bound (big-O) - This notation bounds the growth of an algorithm's running time from above only. The right time to find an upper bound is when we want to bound the running time from above only, most commonly when we are interested in the worst-case running time.
It is the notation usually used to state the worst-case running time.
It provides an asymptotic upper bound, so we can say that an algorithm takes at most a certain amount of time (up to a constant factor, for large inputs).
Formal Definition :
O(g(n)) = { f(n): there exist positive constants c and n0 such that 0 <= f(n) <= c*g(n) for all n >= n0}
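As a quick sanity check of the definition, here is a small sketch (not from the original answer) that numerically spot-checks the inequality for a hypothetical running time f(n) = 3n + 5 against g(n) = n, with the witness constants c = 4 and n0 = 5:

```python
def f(n):
    # Hypothetical running-time function, assumed for illustration only.
    return 3 * n + 5

def g(n):
    # Candidate upper-bound function: g(n) = n.
    return n

# Witness constants: 3n + 5 <= 4n holds exactly when n >= 5.
c, n0 = 4, 5

# The definition requires 0 <= f(n) <= c*g(n) for ALL n >= n0.
# A program can only spot-check a finite range, not prove it for all n.
assert all(0 <= f(n) <= c * g(n) for n in range(n0, 10_000))
print("f(n) = 3n + 5 is O(n) with c = 4, n0 = 5 (checked up to n = 9999)")
```

Note that the check passes for every n we try at or beyond n0, which is what the definition demands; a full proof would argue the inequality algebraically for all n >= n0.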
Lower bound (big-Ω) - This notation bounds the growth of an algorithm's running time from below only. The right time to find a lower bound is when we want to bound the running time from below only, most commonly when we are interested in the best-case running time.
It is the notation usually used to state the best-case running time.
It provides an asymptotic lower bound, so we can say that an algorithm takes at least a certain amount of time (up to a constant factor, for large inputs).
Formal Definition :
Ω(g(n)) = { f(n): there exist positive constants c and n0 such that 0 <= c*g(n) <= f(n) for all n >= n0 }
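Mirroring the big-O check above, this sketch (again with an assumed example function) spot-checks the Ω inequality for f(n) = 3n + 5 against g(n) = n, using c = 3 and n0 = 1:

```python
def f(n):
    # Hypothetical running-time function, assumed for illustration only.
    return 3 * n + 5

def g(n):
    # Candidate lower-bound function: g(n) = n.
    return n

# Witness constants: 3n <= 3n + 5 holds for every n >= 1.
c, n0 = 3, 1

# The Ω definition flips the inequality: c*g(n) must sit BELOW f(n).
assert all(0 <= c * g(n) <= f(n) for n in range(n0, 10_000))
print("f(n) = 3n + 5 is Ω(n) with c = 3, n0 = 1 (checked up to n = 9999)")
```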
Tight bound (big-Θ) - This notation bounds the growth of an algorithm's running time from both above and below. The right time to find this bound is when we can bound the running time from above and below by the same function, i.e. when we know the growth rate exactly up to constant factors.
It is often quoted for the average-case running time, though strictly it applies to any running time whose growth rate is known exactly.
Formal Definition :
Θ(g(n)) = {f(n): there exist positive constants c1, c2 and n0 such that 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n >= n0}
Here,
The running time must be sandwiched between c1*g(n) and c2*g(n) for large values of n (n >= n0).
Θ gives an asymptotically tight bound on the running time:
"Asymptotically" because it matters only for large values of n.
"Tight bound" because we have sandwiched the running time to within a constant factor from above and below.