In: Computer Science
Question 7: Use the definition of Ω to show that 20(n^3) + 5(n^2) ∈ Ω(n^3)
Big-O, Omega, Theta complexity of functions, Running time equations of iterative functions & recursive functions, Substitution method & Master theorem
Please answer within these topics.
i) The definition of the asymptotic notation Big Omega (Ω), a lower bound (best-case time complexity):
A function f(n) = Ω(g(n)) if there exist constants c > 0 and n0 >= 1 such that f(n) >= c*g(n) for all n >= n0.
Here f(n) = 20(n^3) + 5(n^2) and g(n) = n^3.
So, according to the definition of Ω, we need f(n) >= c*g(n):
20(n^3) + 5(n^2) >= c(n^3), which holds for c = 1 and all n >= 1 (since 20n^3 >= n^3 already, and 5n^2 >= 0).
So f(n) >= c*g(n) for all n >= 1,
hence f(n) = Ω(g(n)).
ii) The definition of the asymptotic notation Big O, an upper bound (worst-case time complexity):
A function f(n) = O(g(n)) if there exist constants c > 0 and n0 >= 1 such that f(n) <= c*g(n) for all n >= n0.
Here f(n) = 20(n^3) + 5(n^2) and g(n) = n^3.
So, according to the definition of O, we need f(n) <= c*g(n):
20(n^3) + 5(n^2) <= c(n^3), which holds for c = 23 and all n >= 2 (since 5n^2 <= 3n^3 once n >= 2).
So f(n) <= c*g(n) for all n >= 2,
hence f(n) = O(g(n)).
iii) The definition of the asymptotic notation Big Theta (θ), a tight bound (average-case time complexity):
A function f(n) = θ(g(n)) if there exist constants c1, c2 > 0 and n0 >= 1 such that c1*g(n) <= f(n) <= c2*g(n) for all n >= n0.
Here f(n) = 20(n^3) + 5(n^2) and g(n) = n^3.
By the Big O and Big Omega results above we can conclude that
c1(n^3) <= 20(n^3) + 5(n^2) <= c2(n^3), which holds for c1 = 1, c2 = 23, and all n >= 2.
So c1*g(n) <= f(n) <= c2*g(n),
hence f(n) = θ(g(n)).
In general, whenever f(n) = O(g(n)) and f(n) = Ω(g(n)), then f(n) = θ(g(n)).
iv) Iterative function and recursive function:
A program can be written in two ways:
i) Iterative program: an iterative program uses loops to solve the problem, as in the example below:
void read_matrix(int *d, int n)
{
    int i, j;
    for (i = 0; i < n; i++) {           /* will execute n times */
        for (j = 0; j < n; j++) {       /* will execute n times */
            printf("Enter value for d[%d][%d]: ", i, j);
            scanf("%d", &d[i * n + j]); /* d points to an n x n block, row-major */
        }
    }
}
The time complexity of the above iterative program is T(n) = O(n^2), since the inner statement executes n * n times.
ii) Recursive program: a recursive program solves the problem by calling the same function on a smaller input; each call uses stack memory. Example below:
void A(int n)
{
    if (n > 1)     /* condition check: constant time C */
        A(n - 1);  /* recursive call on input of size n-1 */
}
The time complexity of the above recursive program is given by the recurrence T(n) = T(n-1) + C.
Substitution Method:
Let's take the time complexity above, T(n) = T(n-1) + C, where the recursion stops at n = 1:
T(n) = T(n-1) + C ....(1)
T(n-1) = T(n-2) + C ....(2)
T(n-2) = T(n-3) + C ....(3)
Substituting (2) and then (3) into equation (1):
T(n) = T(n-3) + C + C + C = T(n-3) + 3C
After k such substitutions: T(n) = T(n-k) + kC
The recursion bottoms out at T(1), so set n - k = 1,
i.e. k = n - 1:
T(n) = T(n-(n-1)) + (n-1)C
T(n) = T(1) + (n-1)C
Since T(1) and C are constants, T(n) = O(n).
Master Theorem:
This is another way of finding the time complexity, but the recurrence must satisfy the following condition:
it should be of the form T(n) = a T(n/b) + n^k log^p n (where a >= 1, b > 1, and k is any real number).
The cases of the Master Theorem:
1) If a > b^k, then T(n) = θ(n^(log_b a)).
2) If a = b^k:
   a) if p > -1, then T(n) = θ(n^k log^(p+1) n)
   b) if p = -1, then T(n) = θ(n^k log log n)
   c) if p < -1, then T(n) = θ(n^k)
3) If a < b^k:
   a) if p >= 0, then T(n) = θ(n^k log^p n)
   b) if p < 0, then T(n) = θ(n^k)
Example:
T(n) = 3T(n/2) + n^2
Here a = 3, b = 2, k = 2, p = 0.
Since a < b^k (3 < 2^2 = 4), this matches the 3rd case of the Master Theorem above, and p = 0 >= 0.
So T(n) = θ(n^k log^p n) = θ(n^2).
Let me know if you have any doubts.