In: Computer Science
I didn't answer the previous question.
1. The program takes time proportional to the logarithm of the input size, i.e. T(n) = O(log n).
Given T(1000) = 100 ms. Since 1,000,000 = 1000^2, we have log(1,000,000) = log(1000^2) = 2 * log(1000), so the time doubles: T(1,000,000) = 2 * 100 ms = 200 ms.
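A quick sanity check of the log-scaling argument (a sketch; the names `n_ref` and `t_ref` are just my labels for the given data point):

```python
import math

# Given data point: T(1000) = 100 ms, with T(n) proportional to log(n)
n_ref, t_ref = 1000, 100.0

def t_log(n):
    # Scale the reference time by the ratio of the logarithms
    return t_ref * math.log(n) / math.log(n_ref)

print(t_log(1_000_000))  # 200.0, since log(1000^2) = 2 * log(1000)
```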
2. The program takes time proportional to the square root of the input size, i.e. T(n) = O(n^0.5).
Given T(500) = 100. Since 2000 = 4 * 500, we have (2000)^0.5 = (4 * 500)^0.5 = 2 * (500)^0.5, so T(2000) = 2 * 100 = 200.
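The same ratio trick for the square-root case (again a sketch built from the given data point, with names of my own choosing):

```python
import math

# Given data point: T(500) = 100, with T(n) proportional to sqrt(n)
n_ref, t_ref = 500, 100.0

def t_sqrt(n):
    # Quadrupling the input doubles the time, since sqrt(4) = 2
    return t_ref * math.sqrt(n / n_ref)

print(t_sqrt(2000))  # 200.0
```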
3. The program takes time proportional to the cube of the input size, i.e. T(n) = O(n^3).
Given T(s) = 10. For input 3s, (3s)^3 = 27 * s^3 (the factor 3 is cubed, giving 27, not 9), so T(3s) = 27 * 10 = 270.
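Checking the cubic case numerically confirms the factor is 27 (a minimal sketch; `scale` is the multiplier applied to the unknown input size s):

```python
# Given data point: T(s) = 10, with T(n) proportional to n^3
t_ref = 10.0

def t_cubic(scale):
    # Multiplying the input by `scale` multiplies the time by scale**3
    return t_ref * scale ** 3

print(t_cubic(3))  # 270.0, since 3**3 = 27
```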
4. The runtime of the program is proportional to n^4, where n is the input size.
Given T(1000) = 11. For input 3000, (3000)^4 = (3 * 1000)^4 = 3^4 * 1000^4 = 81 * 1000^4, so T(3000) = 81 * 11 = 891.
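The quartic case, verified the same way (a sketch from the given data point):

```python
# Given data point: T(1000) = 11, with T(n) proportional to n^4
n_ref, t_ref = 1000, 11.0

def t_quartic(n):
    # Tripling the input multiplies the time by 3**4 = 81
    return t_ref * (n / n_ref) ** 4

print(t_quartic(3000))  # 891.0
```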
5. The program takes time proportional to n^2, where n is the input size.
Given T(10) = 2. Applying the same procedure for input 20: (20)^2 = 2^2 * 10^2, so T(20) = 4 * 2 = 8.
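And the quadratic case, for completeness (same ratio approach, names are my own):

```python
# Given data point: T(10) = 2, with T(n) proportional to n^2
n_ref, t_ref = 10, 2.0

def t_quadratic(n):
    # Doubling the input multiplies the time by 2**2 = 4
    return t_ref * (n / n_ref) ** 2

print(t_quadratic(20))  # 8.0
```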