If we change the randomized QUICKSORT algorithm so that it repeatedly randomly selects a pivot and runs PARTITION until it finds a good pivot, and suppose we keep track of the pivots used so far so we never use one twice for the same array, what is the worst case cost of the algorithm? Give a recurrence for this and its asymptotic solution (you can use the master method.)
The worst-case complexity is still O(n^2), but with random pivoting the expected (average-case) running time improves to O(n log n).
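To make the variant in the question concrete, here is a minimal Python sketch: it keeps drawing an unused random pivot and re-running PARTITION until the split is "good". What counts as a good pivot is not pinned down in the question, so the acceptance threshold below (each side gets at least n/4 of the elements) and the assumption of distinct keys are illustrative choices only.

import random

def partition(a, lo, hi, pivot_index):
    # Lomuto-style PARTITION around a[pivot_index]; returns the pivot's final index
    a[pivot_index], a[hi] = a[hi], a[pivot_index]  # move the pivot to the end
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]  # place the pivot in its final position
    return i

def retry_quicksort(a, lo=0, hi=None):
    # Randomized quicksort that re-draws pivots (never reusing one on the same
    # subarray) until the split is "good": both sides hold at least n // 4 elements.
    if hi is None:
        hi = len(a) - 1
    n = hi - lo + 1
    if n <= 1:
        return
    tried = set()  # pivot values already tried on this subarray (assumes distinct keys)
    while True:
        candidates = [i for i in range(lo, hi + 1) if a[i] not in tried]
        pivot_index = random.choice(candidates)
        tried.add(a[pivot_index])
        p = partition(a, lo, hi, pivot_index)
        if min(p - lo, hi - p) >= n // 4 or len(candidates) == 1:
            break  # split is balanced enough, or no unused pivots are left to try
    retry_quicksort(a, lo, p - 1)
    retry_quicksort(a, p + 1, hi)

Calling retry_quicksort(data) sorts data in place; note that each rejected pivot still costs a full Θ(n) PARTITION pass.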
The recurrence for randomized quicksort is

T(n) = T(k − 1) + T(n − k) + Θ(n) ,    Eq. (12)
where k − 1 is the number of elements that fall on the left side of the split, i.e., k tells us how good (balanced) the split is. If k = 1 we have the worst case, and if k = n/2 we have the best case. More generally, if k = O(1) the running time is Θ(n^2), and if k is a constant fraction of n the running time is O(n log n); the two extreme recurrences are worked out below.
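For concreteness, the two extreme cases of Eq. (12) work out as follows (the Θ(n) term is the cost of PARTITION):

Worst case (k = 1): T(n) = T(0) + T(n − 1) + Θ(n) = T(n − 1) + Θ(n), which unrolls to Θ(n + (n − 1) + … + 1) = Θ(n^2).

Best case (k = n/2): T(n) = 2T(n/2) + Θ(n), which is case 2 of the master method (a = 2, b = 2, f(n) = Θ(n) = Θ(n^{log_2 a})), giving T(n) = Θ(n log n).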
In randomized quicksort, all values of k are equally likely, and thus the average running time is given by averaging Eq. (12) over all k:

T(n) = (1/n) Σ_{k=1}^{n} [ T(k − 1) + T(n − k) ] + c1n ,

where we have made the asymptotic substitution Θ(n) = c1n, for some constant c1. We can simplify this by substituting i = k − 1 in the first summation and i = n − k in the second (both then run over i = 0, …, n − 1), and then multiplying through by n:
nT(n) = 2 Σ_{i=0}^{n−1} T(i) + c1n^2 .    Eq. (13)
Now we use a standard technique for solving some recurrence relations: subtract the recurrence for a problem of size n − 1 from the recurrence for a problem of size n. Applying this to Eq. (13) yields:
nT(n) − (n − 1)T(n − 1) = 2T(n − 1) + c1(2n − 1) , i.e., nT(n) = (n + 1)T(n − 1) + c1(2n − 1) ,    Eq. (14)

T(n)/(n + 1) = T(n − 1)/n + c1(2n − 1)/(n(n + 1)) = T(n − 1)/n + O(1/n) ,    Eq. (15)
where we divided both sides of Eq. (14) by n(n + 1) to produce Eq. (15).
Now, one last change of variables: let T(n)/(n + 1) = D(n), and thus T(n − 1)/n = D(n − 1). This yields:
D(n) = D(n − 1) + O(1/n)
= O(log n) ,
where the last line follows from a bound on the harmonic series: unrolling the recursion gives D(n) = D(1) + Σ_{i=2}^{n} O(1/i) = O(H_n), and H_n = Σ_{i=1}^{n} 1/i ≤ ln n + 1 = O(log n). Thus, reversing the substitution D(n) = T(n)/(n + 1), we see that T(n) = O(n log n).
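As a quick numerical sanity check, one can iterate Eq. (14) directly and compare the result with n log n; the constant c1 = 1 and the base case T(1) = 0 below are arbitrary choices made only for illustration.

import math

c1 = 1.0
T_prev = 0.0  # base case: T(1) = 0 (arbitrary for this illustration)
checkpoints = {10, 100, 1000, 10000, 100000}
for n in range(2, 100001):
    # Eq. (14) rearranged: T(n) = ((n + 1) * T(n - 1) + c1 * (2n - 1)) / n
    T_n = ((n + 1) * T_prev + c1 * (2 * n - 1)) / n
    if n in checkpoints:
        # T(n)/(n ln n) approaches a constant while T(n)/n^2 shrinks toward 0,
        # consistent with T(n) = Theta(n log n) rather than Theta(n^2)
        print(n, round(T_n / (n * math.log(n)), 3), round(T_n / n**2, 5))
    T_prev = T_n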