Question

In: Computer Science

Here are some basic rules for calculating the Big O for some T(n) or an algorithm.

1. Only the highest degree of n matters. For example, T(n) = n^3 + 5n^2 + 107 → O(n^3), since once n becomes super-massively huge, the other terms just stop mattering.

2. Constant factors don’t matter. T(n) = 500n and T(n) = 0.005n are both O(n). Again, as n becomes bigger, these constants stop mattering; what matters is the rate of growth. Example: T(n) = 50n^3 − 2n^2 + 400 → O(n^3)

3. Counting the number of nested loops usually works.

Please provide your work with the answers.

Solutions

Expert Solution

1. "Only highest degree of n matters."

Only the highest degree of n matters because, as we consider higher and higher values of n, the contribution of the lower-degree terms becomes completely dwarfed by the highest-degree term and stops being relevant. Consider the following example for a better understanding: for the given equation T(n) = n^3 + 5n^2 + 107, let's compute the value of T(n) for various values of n such as 1, 10, 100, 1000, 10000, and so on.

For n = 1 : T(1) = 1+5+107 = 113

For n = 10   : T(10) = 1000+500+107 = 1607

For n = 100 : T(100) = 1000000 + 50000 + 107 = 1050107

For n = 1000 : T(1000) = 1000000000 + 5000000 + 107 = 1005000107 = 1.005 X 10^9 approx.

For n = 10^10 : T(10^10) = 10^30 + 5 X 10^20 + 107 = 10^30 approx.

For n = 10^20 : T(10^20) = 10^60 + 5 X 10^40 + 107 = 10^60 approx.

As can be observed, the value of T(n) depends almost entirely on the highest-degree term of n. It is evident from the example above that as n increases, the lower-order terms stop mattering. Hence, it can be concluded that "Only the highest degree of n matters."
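
As a quick check of this claim (my own illustrative sketch, not part of the original solution; the class name HighestDegreeDemo is an arbitrary choice), the following Java program recomputes the table above for T(n) = n^3 + 5n^2 + 107 and prints what fraction of T(n) comes from the n^3 term. The fraction approaches 1 as n grows, which is exactly why only the highest degree matters.

public class HighestDegreeDemo {
    public static void main(String[] args) {
        long[] inputs = {1L, 10L, 100L, 1000L, 10_000L, 100_000L};
        for (long n : inputs) {
            double cube = Math.pow(n, 3);                 // highest-degree term n^3
            double t = cube + 5 * Math.pow(n, 2) + 107;   // full T(n)
            System.out.printf("n = %-7d  T(n) = %.0f  n^3 / T(n) = %.6f%n", n, t, cube / t);
        }
    }
}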

2. "Constant factors don't matter."

No, the constant factors don't matter in Big-O notation, but yes, they are absolutely relevant when computing the actual running time of an algorithm. Big-O notation only describes the long-term growth rate of a function, rather than its absolute magnitude. Multiplying by a constant only scales the function by a fixed amount, so the function still follows its inherent behaviour, i.e., a linear function still grows linearly, an exponential function still grows exponentially, and so on.

But these constants are absolutely significant in practice. A function whose runtime is 10^10 n will be way slower than a function whose runtime is just n. A function whose runtime is n^3 / 2 is much faster than one whose runtime is n^3. The fact that the first two are both O(n) and the second two are both O(n^3) doesn't change the fact that they won't run in the same amount of time, since measuring that is not what Big-O notation is designed for.

In summary, constants don't matter in Big-O notation, but they are absolutely significant from the perspective of actual computation time.
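
To make both halves of this point concrete, here is a small Java sketch (illustrative only, not from the original answer) that compares T1(n) = 500n, T2(n) = 0.005n and T3(n) = n^2. The ratio T1/T2 stays fixed at 100000 for every n, which is why both are O(n), while T3 eventually overtakes T1 despite its smaller starting values, because its growth rate is higher.

public class ConstantFactorDemo {
    public static void main(String[] args) {
        for (long n = 1; n <= 1_000_000L; n *= 100) {
            double t1 = 500.0 * n;       // O(n) with a large constant factor
            double t2 = 0.005 * n;       // O(n) with a tiny constant factor
            double t3 = (double) n * n;  // O(n^2), no constant factor
            System.out.printf("n = %-9d  T1/T2 = %.1f  T3/T1 = %.3f%n", n, t1 / t2, t3 / t1);
        }
    }
}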

3. "Counting the number of nested loops usually works."

Counting the number of nested loops can give us an idea of how complex an algorithm is, but we cannot compare two algorithms just by counting the number of loops they contain. For a better understanding, consider the two examples below.

Code 1 :

int i = n;
while (i > 0)
{
    for (int j = 0; j < n; j++)
        System.out.println("*");
    i = i / 2;   // i is halved on each pass, so the outer loop runs about log2(n) times
}

Code 2 :

int i = n;
while (i > 0)
{
    for (int j = 0; j < n; j++)
        System.out.println("*");
    i--;   // i decreases by one on each pass, so the outer loop runs n times
}

In both Code 1 and Code 2 there are two loops, but from that fact alone we cannot conclude that the two pieces of code are equally complex. The complexity also depends on how the loop variable is updated after each iteration.

Code 1 has a complexity of O(n log n). The two loops are nested, but the number of iterations the inner loop runs is independent of the outer loop, so the total number of executed statements is the product of the two iteration counts. The outer loop iterates O(log n) times, because i is halved on each pass. Within EACH of those iterations, the inner loop runs n times. Therefore, the time complexity of this code fragment is O(n log n).

Code 2 has a complexity of O(n^2). The outer loop iterates O(n) times, because i is decreased by one on each pass. Within EACH of those iterations, the inner loop runs n times, independent of the outer loop. Therefore, the time complexity of this code fragment is O(n^2).

So, even though both codes contain two loops, their time complexities are different. Hence, merely counting the number of nested loops may not always work.
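
If it helps to see the difference numerically, the sketch below (my own illustration, not part of the original answer) keeps the exact loop structure of Code 1 and Code 2 but replaces the println with a counter, so the total number of inner-loop executions can be compared for a few values of n. For n = 1000, Code 1 performs about 10,000 iterations (n times roughly log2(n)), while Code 2 performs 1,000,000 (n times n).

public class LoopCountDemo {
    // Loop structure of Code 1: the outer variable is halved, so about n * log2(n) iterations.
    static long countCode1(int n) {
        long count = 0;
        int i = n;
        while (i > 0) {
            for (int j = 0; j < n; j++) count++;
            i = i / 2;
        }
        return count;
    }

    // Loop structure of Code 2: the outer variable is decremented, so n * n iterations.
    static long countCode2(int n) {
        long count = 0;
        int i = n;
        while (i > 0) {
            for (int j = 0; j < n; j++) count++;
            i--;
        }
        return count;
    }

    public static void main(String[] args) {
        for (int n : new int[]{10, 100, 1000}) {
            System.out.printf("n = %-5d  Code 1 iterations: %-10d  Code 2 iterations: %d%n",
                              n, countCode1(n), countCode2(n));
        }
    }
}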

