The main idea of asymptotic analysis is to have a measure of the efficiency of algorithms, in terms of time and space complexity, that does not depend on machine-specific constants (no matter how big they are) and does not require algorithms to be implemented and the running times of programs to be compared. We have worst case, best case and average case time complexity; here we will talk about the average case, which is commonly expressed using Θ (theta) notation.
Θ Notation: The theta notation bounds a function from above and below, so it defines exact asymptotic behavior.
Θ(g(n)) = {f(n): there exist positive constants c1, c2 and n0 such that 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n >= n0}
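As a quick worked instance of this definition (the constants below are one arbitrary valid choice, not the only one): take f(x) = 5x² and g(x) = x². Then with c1 = 1, c2 = 5 and n0 = 1 we get

0 <= 1*x² <= 5x² <= 5*x² for all x >= 1,

so 5x² = Θ(x²).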
So, while measuring asymptotic behaviour, the constant factors 5 and 2 can be ignored, and 5x² and 2x² can be treated as asymptotically equal: they have the same growth rate, and that growth rate does not depend on the constants. We can therefore write 5x² = Θ(2x²) = Θ(x²) = Θ(1000x²); the term that determines the growth rate is x². We cannot write 5x² = Θ(1000000x), because 1000000x is always asymptotically smaller than 5x² (what matters is large values of x, where the quadratic term dominates any linear one). Neither can we write 5x² = Θ(x³), because theta must bound the function from above and below, and x³ bounds 5x² from above only.
But mathematically 5x² and 2x² are not equal; 5x² is 5/2 times 2x². Asymptotically, however, they are the same, i.e. they have the same time/space complexity.
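To make the growth-rate argument above concrete, here is a minimal Python sketch (the helper name ratio_table and the sample points are illustrative choices, not part of the definition) that tabulates the ratio f(x)/g(x) for the three comparisons discussed. A ratio that settles at a positive constant is consistent with f(x) = Θ(g(x)), while a ratio that grows without bound or shrinks toward 0 is not.

    # Illustrative sketch: compare growth rates by printing f(x)/g(x)
    # at increasingly large x. (Numeric evidence, not a proof.)
    def ratio_table(label, f, g, xs):
        print(label)
        for x in xs:
            print(f"  x={x:>7}: f(x)/g(x) = {f(x) / g(x):.6f}")

    xs = [10, 100, 1000, 10000, 100000]

    # 5x² vs x²: the ratio is the constant 5, so 5x² = Θ(x²).
    ratio_table("5x^2 vs x^2", lambda x: 5 * x**2, lambda x: x**2, xs)

    # 5x² vs 1000000x: the ratio keeps growing, so 5x² is not Θ(1000000x).
    ratio_table("5x^2 vs 1000000x", lambda x: 5 * x**2, lambda x: 1000000 * x, xs)

    # 5x² vs x³: the ratio shrinks toward 0; x³ bounds 5x² from above only.
    ratio_table("5x^2 vs x^3", lambda x: 5 * x**2, lambda x: x**3, xs)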