How common are time complexities like O(n / log n), O(n^2 / log n)?

Algorithms with time complexities such as O(n), O(n log n), O(2^n), etc. are very common. Is there an algorithm used in practice with a time complexity such as O(n / log n) or O(2^n / P(n)), where P(n) is a polynomial in n? If so, can anyone provide an example? If not, why are these time complexities rarely encountered in practice? Thanks.



1 answer


In fact, you do observe runtimes such as O(n^2 / log n) in practice. There is a family of techniques, often referred to as the "Four Russians method," that can be used to speed up certain types of algorithms. Typically, a Four Russians speedup works by brute-force precomputing the solution to the problem on all inputs of size Θ(log n), then re-expressing the original input of size n or n^2 as a collection of O(n / log n) or O(n^2 / log n) blocks, each of which is a subproblem of size Θ(log n). The running time of these algorithms is then usually some polynomial divided by a polylogarithmic term, where the polylogarithmic speedup comes from the fact that the number of subproblems is smaller than the original problem size by a polylogarithmic factor.
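A minimal sketch of this block-lookup idea, using bit counting as the toy problem (the choice of popcount and the 8-bit block size are illustrative, not from the answer above): precompute the answer for every possible block of size b, where you should think of b as Θ(log n), then answer each block with a single table lookup instead of b individual bit tests.

```python
# Four-Russians-style speedup sketch: count the set bits in an n-bit integer.
# Precompute the answer for all 2^8 possible 8-bit blocks (think b = Theta(log n)),
# so each block of the input is handled in O(1) by a lookup rather than O(b) bit tests.

TABLE = [bin(i).count("1") for i in range(256)]  # brute force over all size-8 inputs

def popcount(x: int) -> int:
    total = 0
    while x:
        total += TABLE[x & 0xFF]  # one lookup answers 8 bits at once
        x >>= 8
    return total
```

The loop runs O(n / b) times instead of O(n), which is exactly the shape of the O(n / log n) savings described above.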

For example, the standard DP algorithm for computing the edit distance of two strings of length n runs in O(n^2) time. Using a Four Russians speedup, this can be improved to O(n^2 / log^2 n). Boolean matrix multiplication normally takes O(n^3) time, which can be improved to O(n^3 / log n) using the original Four Russians trick.
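For reference, here is the standard O(n^2) edit distance DP that the Four Russians method speeds up; this is only the baseline algorithm, not the block-decomposed version.

```python
# Classic O(m * n) dynamic program for edit distance (Levenshtein distance).
# dp[i][j] = minimum number of insertions, deletions, and substitutions
# needed to turn a[:i] into b[:j].

def edit_distance(a: str, b: str) -> int:
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i  # delete all i characters of a
    for j in range(n + 1):
        dp[0][j] = j  # insert all j characters of b
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # match / substitution
    return dp[m][n]
```

The Four Russians version partitions this table into Θ(log n) x Θ(log n) blocks and precomputes all possible block transitions, which is where the log^2 n factor in the denominator comes from.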



You can imagine similar tricks giving runtimes like O(2^n / poly(n)): just apply algorithms like the ones above to inputs that are exponentially large.

Hope this helps!







