Can my algorithm do better?

I was tasked with writing the most efficient algorithm I could for this problem, and I have gotten it down to n * log n complexity. I am wondering whether it can be done better. The problem: children are playing a counting game. You are given a number n, the number of children, and a number m, how many children are skipped before one is executed. You need to return a list giving the execution order. I tried to do it as if using a skip list.

current = m
while table.size > 0:
    executed.add(table[current % table.size])
    table.remove(current % table.size)
    current += m

      

My questions: Is this correct? Is it n * log n, and can it be done better?





2 answers


Is it correct?

No. When you remove an item from the table, table.size decreases, and the expression current % table.size will then usually point to a different, unrelated item. For example, 44 % 11 is 0, but 44 % 10 is 4, an element in a completely different place.
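A corrected naive version would keep the index relative to the shrinking table instead of letting the counter grow past table.size; a minimal sketch in Python (children numbered 1..n, counting convention assumed), still quadratic in the worst case for the reasons discussed below:

# Corrected but still naive simulation: the index stays relative to the
# shrinking list, so the modulo always refers to the intended child.
def execution_order_naive(n, m):
    table = list(range(1, n + 1))
    executed = []
    idx = 0
    while table:
        idx = (idx + m - 1) % len(table)   # count m children from the current position
        executed.append(table.pop(idx))    # list.pop(i) shifts elements: O(n) worst case
    return executed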

Is it n * log n?

No. If table is just a random-access array, removing an element may take up to n operations. For example, with m = 1, after fixing the point above, the program always deletes the first element of the array. With a naive array implementation, each deletion has to shift about table.size elements, for a total of about n^2 / 2 operations.

It would be n log n if table were backed, for example, by a balanced binary search tree with implicit indices instead of keys, together with split and merge primitives. A treap, for example (a quick search for "implicit treap" turns up English-language sources). Such a data structure can be used as an array with O(log n) cost for access, merging and splitting. But nothing so far says that this is the case, and no such structure exists in the standard libraries of most languages.
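A different structure that achieves the same O(log n) per step is a Fenwick (binary indexed) tree that answers "which is the k-th still-standing child?"; a minimal sketch of the n log n simulation built on it (function names and counting convention assumed):

# O(n log n) simulation: the Fenwick tree's prefix sums count remaining children.
def josephus_order(n, m):
    tree = [0] * (n + 1)

    def update(i, delta):
        while i <= n:
            tree[i] += delta
            i += i & (-i)

    def kth(k):
        # Smallest position whose prefix sum equals k, i.e. the k-th remaining child.
        pos = 0
        bit = 1 << n.bit_length()
        while bit:
            nxt = pos + bit
            if nxt <= n and tree[nxt] < k:
                pos = nxt
                k -= tree[nxt]
            bit >>= 1
        return pos + 1

    for i in range(1, n + 1):
        update(i, 1)

    alive, idx, order = n, 0, []
    for _ in range(n):
        idx = (idx + m - 1) % alive      # count m children among those remaining
        victim = kth(idx + 1)            # 1-based rank of the child to execute
        order.append(victim)
        update(victim, -1)
        alive -= 1
    return order

Each of the n steps does O(log n) Fenwick operations, giving O(n log n) overall.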

Can you do it better?

Correction: partially yes, and perhaps completely.

If we solve the problem in the reverse direction, we get the following subproblem.

Suppose there is a circle of k children and the pointer currently rests on child t. We know that a moment ago there was a circle of k + 1 children, but we do not know on which child x the pointer stood. Then we counted to m, removed a child, and the pointer ended up on t. Which child did we just remove, and what is x?

It turns out that the "find x" part can be solved in O(1) (a figure may be helpful here), so finding the last child standing is doable in O(n). As noted in the comments, this is known as the Josephus problem, and its variants are widely studied, for example in Concrete Mathematics by Graham, Knuth and Patashnik.
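For the survivor alone, that O(n) backward walk is the classic Josephus recurrence; a minimal sketch (0-based positions, every m-th child executed, which is an assumed convention):

def last_standing(n, m):
    # Josephus recurrence: with one child the survivor is at position 0;
    # growing the circle from k - 1 to k children shifts the survivor's position by m.
    pos = 0
    for k in range(2, n + 1):
        pos = (pos + m) % k
    return pos  # 0-based position of the last child standing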

However, that O(1) per step only finds the number of the last child standing; it does not automatically give the whole execution order. There are, of course, ways to do that in O(log n) per step, O(n log n) in total. But whether the full order can be produced in O(1) per step, I do not know at the moment.





The complexity of your algorithm depends on the complexity of the operations executed.add(..) and table.remove(..).

If both of them are O(1), your algorithm is O(n), because the loop runs n times.

While executed.add(..) is easy to implement in O(1), table.remove(..) takes a little more thought.




You can do it in O(n):

Store the children in a linked list and link the last element to the first. Removing an item then costs O(1). Walking to the next selected child costs O(m), but for a fixed m that is a constant, i.e. O(1).

So the algorithm has complexity O(n * m) = O(n) for constant m.
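A minimal sketch of this idea in Python, using collections.deque in place of a hand-rolled circular linked list (each rotation by m - 1 costs O(m); the counting convention is assumed):

from collections import deque

def execution_order(n, m):
    # Children numbered 1..n stand in a circle.
    circle = deque(range(1, n + 1))
    order = []
    while circle:
        circle.rotate(-(m - 1))          # skip m - 1 children; the m-th is now at the front
        order.append(circle.popleft())   # execute the child at the front
    return order

For example, execution_order(5, 2) returns [2, 4, 1, 5, 3].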









