Logarithmic function in time complexity

How does the worst-case or average-case running time of a program depend on the log function? And how does the base of the logarithm matter?


1 answer


The log factor comes in when you break your problem into k pieces of size n/k each, and then recurse (or simulate recursion) on some of them.

A simple example is the following loop:

def foo(n):
    while n > 0:
        print(n)      # print the current value: n, n/2, n/4, ...
        n = n // 2    # halve n (integer division)

The loop above prints n, n/2, n/4, ..., 1, and there are O(log n) such values. The complexity of the program is therefore O(log n): each print takes a constant amount of time, and the loop executes O(log n) times.
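For instance, foo(16) prints 16, 8, 4, 2, 1, that is, log_2(16) + 1 = 5 values.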

If you're looking for a "real life" example, take quicksort (assuming, for simplicity, that the pivot always splits the array into two exact halves): you split an array of size n into two subarrays of size n/2, and then recurse on both of them, invoking the algorithm on each half.

This gives the recurrence:

T(n) = 2T(n/2) + O(n)

      

By the master theorem, this is Theta(n log n).
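
As an illustration (a minimal sketch, not code from the original answer; it picks the middle element as pivot, which only approximates the exact-halves assumption):

def quicksort(a):
    if len(a) <= 1:
        return a
    pivot = a[len(a) // 2]
    left = [x for x in a if x < pivot]     # O(n) partitioning work per call
    mid = [x for x in a if x == pivot]
    right = [x for x in a if x > pivot]
    return quicksort(left) + mid + quicksort(right)   # recurse on both parts

Each level of the recursion does O(n) total partitioning work, and with balanced splits there are O(log n) levels, which is where the n log n comes from.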



Likewise, with binary search, you split the problem into two halves and recurse on only one of them:

T(n) = T(n/2) + 1

      

which is Theta(log n).
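
A minimal iterative sketch of such a binary search in Python (the function name is just for illustration, and the list a is assumed to be sorted in ascending order); each iteration discards half of the remaining range:

def binary_search(a, target):
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1     # discard the left half
        else:
            hi = mid - 1     # discard the right half
    return -1                # target not found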


The base of the logarithm does not matter for big-O complexity, because

log_k(n) = log_2(n) / log_2(k)

and log_2(k) is a constant for any constant k.
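
A quick numeric check in Python (illustrative only, not part of the original answer):

import math

# log_8(n) / log_2(n) equals 1 / log_2(8) = 1/3 for every n,
# so changing the base only rescales the count by a constant factor.
for n in (8, 1024, 10**6):
    print(math.log(n, 8) / math.log(n, 2))   # ~0.333 each time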
