Have I invented a new sorting algorithm, or is it the same as quicksort?

I came up with a sorting algorithm, but then I started wondering whether I had just reinvented quicksort.

However, I've heard that quicksort has an O(N^2) worst case; I think my algorithm should only be O(N log N) at worst.

Is this the same as quicksort?

The algorithm works by swapping values so that all values less than the median move to the left of the array. Then it runs recursively on each side.

The algorithm starts with i = 0, j = n-1

i and j move towards each other, and list[i] and list[j] are swapped as needed.

Here's some code for the first iteration before recursion:

_list = [1,-4,2,-5,3,-6]

def in_place(_list, i, j, median):
    while i < j:
        a, b = _list[i], _list[j]
        if a < median and b >= median:
            # both already on the correct side
            i += 1
            j -= 1
        elif a >= median and b < median:
            # both on the wrong side: swap them
            _list[i], _list[j] = b, a
            i += 1
            j -= 1
        elif a < median:
            i += 1
        else:
            j -= 1
    print("changed to ", _list)



def get_median(_list):
    # approximate median in O(N) with O(1) space (stubbed for this example)
    return -4

median = get_median(_list)
in_place(_list,0,len(_list)-1,median)

"""
changed to  [-6, -5, 2, -4, 3, 1]
"""

      


2 answers


http://en.wikipedia.org/wiki/Quicksort#Selection-based_pivoting



Conversely, once we have a worst-case O(n) selection algorithm, we can use it to find the ideal pivot (the median) at every step of quicksort, producing a variant with a worst-case O(n log n) running time. In practical implementations, however, this variant is considerably slower on average.

Another option is to choose the median of medians as the pivot element instead of the exact median for partitioning the elements. While this maintains the asymptotically optimal O(n log n) time complexity (by preventing worst-case partitions), it is also considerably faster than the variant that chooses the exact median as the pivot.
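The strategy the article describes can be sketched briefly. The following is an illustrative sketch (my own names, not production code) of quicksort using a median-of-medians pivot together with a three-way split:

```python
def median_of_medians(items):
    """Return a value guaranteed to lie between the 30th and 70th
    percentiles of items, in O(n) time."""
    if len(items) <= 5:
        return sorted(items)[len(items) // 2]
    # Median of each group of 5, then recurse on those medians.
    medians = [sorted(group)[len(group) // 2]
               for group in (items[i:i + 5] for i in range(0, len(items), 5))]
    return median_of_medians(medians)

def quicksort(items):
    if len(items) <= 1:
        return items
    pivot = median_of_medians(items)
    # Three-way split so runs of equal keys never re-enter the recursion.
    less = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    greater = [x for x in items if x > pivot]
    return quicksort(less) + equal + quicksort(greater)

print(quicksort([1, -4, 2, -5, 3, -6]))  # [-6, -5, -4, 1, 2, 3]
```

The three-way split matters here: with only a two-way partition, runs of equal keys can still defeat even a perfect median pivot.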



For starters, I'm assuming there is other code you haven't shown, because I'm sure the code you did show will not work on its own.

I'm sorry to steal your thunder, but I'm afraid the code you're showing is quicksort, and not only that: it looks like it may be suffering from some bugs.

Consider the case of sorting a list of identical elements. Your in_place method, which implements what is traditionally called the partition step in quicksort, will (correctly) not move any items, but at the end i and j describe a single partition containing the entire list, in which case you would recurse on the whole list forever. Since, as mentioned, you don't return anything from it and don't show any sorting code anywhere, I'm left to guess how it will be used.
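To see this concretely, here is the partition loop from the question, with a return statement added (my addition) so the final pointer positions can be inspected:

```python
def in_place(_list, i, j, median):
    while i < j:
        a, b = _list[i], _list[j]
        if a < median and b >= median:
            i += 1
            j -= 1
        elif a >= median and b < median:
            _list[i], _list[j] = b, a
            i += 1
            j -= 1
        elif a < median:
            i += 1
        else:
            j -= 1
    return i, j  # where the loop pointers ended up

same = [7, 7, 7, 7]
print(in_place(same, 0, len(same) - 1, 7))  # (0, 0)
print(same)                                 # [7, 7, 7, 7] -- nothing moved
```

With every element equal to the median, only the final else branch ever fires, so i stays at 0 while j walks down to meet it: the "left side" is empty and the "right side" is the whole list, and a recursion keyed on that split never shrinks.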



I'm afraid picking the true median as the pivot is not only a fairly slow strategy on average; it also doesn't rule out the O(n^2) worst case, and again a list of identical items provides such a worst case. However, a 3-way quicksort partition combined with this median selection does guarantee O(n log n) time. Either way, this is a known pivot-selection variant and not a new algorithm.
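For reference, here is a minimal sketch of the 3-way ("Dutch national flag") partition mentioned above; the function name is illustrative:

```python
def partition3(a, pivot):
    """Rearrange a in place into [< pivot | == pivot | > pivot] and
    return (lo, hi) such that a[lo:hi] holds the elements equal to pivot."""
    lo, mid, hi = 0, 0, len(a)
    while mid < hi:
        if a[mid] < pivot:
            a[lo], a[mid] = a[mid], a[lo]
            lo += 1
            mid += 1
        elif a[mid] > pivot:
            hi -= 1
            a[mid], a[hi] = a[hi], a[mid]
        else:
            mid += 1
    return lo, hi

a = [1, -4, 2, -5, 3, -6]
lo, hi = partition3(a, -4)
print(a[:lo], a[lo:hi], a[hi:])  # [-6, -5] [-4] [3, 2, 1]
```

Because every element equal to the pivot lands in the middle region in a single pass, the recursion only ever sees the strictly-smaller and strictly-larger parts, which is what defuses the all-identical-elements worst case.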

In short, this appears to be an incomplete and possibly buggy quicksort, and without three-way partitioning, using the median as the pivot doesn't guarantee O(n log n). That said, I do think you deserve congratulations for arriving at median pivot selection on your own, even if it turns out to have been thought of before.
