Hi,
I found this in the book *A Balanced Introduction to Computer Science* (which uses the JavaScript language): $\lceil \log_2 N \rceil$ determines the number of checks it takes for binary search to find an item. But I have also seen $\lceil \log_2 N \rceil + 1$ given as the number of checks.
What is the difference between these two, and which one should I use?
Binary search is a divide-and-conquer algorithm. To find an element in a sorted array $A$ of length $n$, we first compare the search element with $A[n/2]$, and recurse either on the first half, $A[0, \ldots, n/2 - 1]$, or on the second half, $A[n/2, \ldots, n-1]$.
Thus, we find the recurrence to be $T(n) = T(\lceil n/2 \rceil) + O(1)$. The first part of the recurrence should be obvious. The $O(1)$ part comes from having to determine which half of the array to recurse on; this comparison can be done in constant time.
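Since the book works in JavaScript, here is a sketch of an iterative binary search that also counts the checks (the function name and the convention of counting one check per probed element are my own):

```javascript
// Iterative binary search over a sorted array.
// Returns the index of target (or -1 if absent), together with the
// number of "checks", i.e. comparisons against array elements.
function binarySearchCount(arr, target) {
  let lo = 0, hi = arr.length - 1;
  let checks = 0;
  while (lo <= hi) {
    const mid = Math.floor((lo + hi) / 2);
    checks++;                              // one check of arr[mid]
    if (arr[mid] === target) return { index: mid, checks: checks };
    if (arr[mid] < target) lo = mid + 1;   // continue in the second half
    else hi = mid - 1;                     // continue in the first half
  }
  return { index: -1, checks: checks };
}
```

Each loop iteration performs one check and halves the remaining search space, which is exactly the $T(n) = T(\lceil n/2 \rceil) + O(1)$ recurrence above.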
To solve this recurrence relation, we can either use the master theorem or work it out algebraically. I'll do it algebraically; via the master theorem, the solution follows immediately with $a = 1$, $b = 2$, $d = 0$.
First, note that $T(1) = 1$. Now, suppose that $n = 2^k$, so that $k = \log_2 n$. Then we have
$T(2^k) = T(2^k / 2) + 1$
$= T(2^{k-1}) + 1$
$= [T(2^{k-2}) + 1] + 1$
$= T(2^{k-2}) + 2$
$= [T(2^{k-3}) + 1] + 2$
$= T(2^{k-3}) + 3$
$\vdots$
$= T(2^{k-k}) + k$
$= T(2^0) + k$
$= T(1) + k$
Now, since $T(1) = 1$, we have $T(2^k) = k + 1$. Substituting $k = \log_2 n$, and rounding up to handle $n$ that are not powers of two, gives $T(n) = \lceil \log_2 n \rceil + 1$.
Therefore, performing binary search, you should expect at most $\lceil \log_2 n \rceil + 1$ comparisons, where $n$ is the number of elements in your search space. That is the difference between the two formulas: $\lceil \log_2 n \rceil$ counts only the halving steps, while the $+1$ accounts for the final check on the last remaining element (the $T(1) = 1$ base case).
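You can verify the bound empirically with a quick brute-force check (the helper name `worstCaseChecks` is mine): for a given $n$, search for every present and every absent value, record the largest number of checks, and confirm it never exceeds $\lceil \log_2 n \rceil + 1$.

```javascript
// Worst-case number of checks over all possible targets (present or
// absent) for a sorted array of n elements.
function worstCaseChecks(n) {
  const arr = Array.from({ length: n }, (_, i) => 2 * i); // sorted evens
  let worst = 0;
  // Odd and out-of-range targets exercise the "not found" paths.
  for (let t = -1; t <= 2 * n; t++) {
    let lo = 0, hi = n - 1, checks = 0;
    while (lo <= hi) {
      const mid = Math.floor((lo + hi) / 2);
      checks++;
      if (arr[mid] === t) break;
      if (arr[mid] < t) lo = mid + 1;
      else hi = mid - 1;
    }
    worst = Math.max(worst, checks);
  }
  return worst;
}
```

Note that $\lceil \log_2 n \rceil + 1$ is an upper bound; for $n$ that are not powers of two the worst case can be one check lower, which is another reason the two formulas you saw differ by at most one.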