Time Algorithm Optimization

I am writing an algorithm for this problem. The algorithm is simple and I have already written the code, but I don't see any way to optimize it:

I have a bucket with 100 stones and 5 kids who use them to decorate their sand castles. Each child picks up a stone at a fixed interval. The children are independent of each other, and several children can pick up a stone at the same minute. The 5 children are:

  • Eric picks up a stone every 5 minutes.
  • Mark picks up a stone every 10 minutes.
  • Lara picks up a stone every 7 minutes.
  • Emma picks up a stone every 3 minutes.
  • Frank picks up a stone every 3 minutes.

How many minutes does it take to empty the bucket?

To be clearer: After 10 minutes, Eric has two stones (minute 5 and minute 10) and Emma has 3 stones (minutes 3, 6 and 9).

So after 10 minutes the children have taken 2 + 1 + 1 + 3 + 3 = 10 stones, and 90 stones remain in the bucket.

This is my code (Python 3):

children_rate = [3, 3, 5, 7, 10]
bucket = 100

minutes = 0

# Brute-force simulation: advance one minute at a time and subtract a
# stone for every child whose period divides the current minute.
while True:
    minutes += 1
    for child in children_rate:
        if minutes % child == 0:
            bucket -= 1
            if bucket == 0:
                print('bucket empty in', minutes, 'minutes')
                exit()

This code works (in this case the answer is 91 minutes), but it is far too slow for a bucket with 1 million stones and 500 children.

The only optimization I see is to replace the mod operation with additions, since division/multiplication is more expensive. I could use NumPy arrays and so on, but nothing really speeds up the process.

I tried to map the problem onto one of the standard problems described in my algorithms tutorial, with no luck.

+3




3 answers


One thing you can do is binary search for the answer.

Suppose the answer is X minutes. Then you know exactly how many stones each child will have taken in that time. If the total number of stones taken is less than the bucket size, X must be higher. Otherwise, search in the lower half.



In code:

children_rate = [3, 3, 5, 7, 10]
bucket = 100

# Binary search for the smallest number of minutes in which the
# children take at least `bucket` stones.
lo, hi = 0, bucket * max(children_rate)
while lo < hi:
    mid = (lo + hi) // 2
    if sum(mid // i for i in children_rate) < bucket:
        lo = mid + 1
    else:
        hi = mid

print(lo)
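
For the example above this prints 91, matching the brute-force simulation. Each probe of the binary search is one pass over children_rate, and the number of probes is logarithmic in bucket * max(children_rate), so even 1 million stones and 500 children needs only on the order of tens of thousands of integer divisions.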

      

+1




You can turn the algorithm around: for a given number of minutes, calculate how many stones all the children have taken.

def compute_stones(minutes):
    stones = 0
    for child in children_rate:
        stones += minutes // child  # integer division
    return stones



Then you can do a binary search to find the smallest number of minutes for which compute_stones(minutes) reaches 100.
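
A minimal sketch of that search (self-contained, with compute_stones condensed to one line and children_rate / bucket taken from the question):

children_rate = [3, 3, 5, 7, 10]
bucket = 100

def compute_stones(minutes):
    return sum(minutes // child for child in children_rate)

# Binary search for the smallest number of minutes at which at least
# `bucket` stones have been taken.
lo, hi = 0, bucket * max(children_rate)
while lo < hi:
    mid = (lo + hi) // 2
    if compute_stones(mid) < bucket:
        lo = mid + 1
    else:
        hi = mid

print(lo)  # 91 for the example in the question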

+2




Each child picks up a stone with a certain period, so all the children together pick up stones in a pattern that also has a period: the least common multiple of the individual periods. It can be calculated in several ways; in this case you can use the prime factorization of each period.

3  =   3
5  =     5
7  =       7
10 = 2 * 5

      

So the overall period is 210 = 2 * 3 * 5 * 7. In that time Eric picks up 42 stones, Mark 21, Lara 30, and Emma and Frank 70 each. That is 233 stones every 210 minutes. If you have 1 million stones in your bucket, the children take 999,803 stones (4,291 full periods) in 901,110 minutes, and you run your original code on the remaining 197 stones. Easy, isn't it?
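
A sketch of this period-skipping idea, assuming the same children_rate list as the question and a bucket of 1 million stones (the remainder is handled with the original minute-by-minute loop):

from math import gcd
from functools import reduce

children_rate = [3, 3, 5, 7, 10]
bucket = 1_000_000

# Least common multiple of all the periods: 210 here.
period = reduce(lambda a, b: a * b // gcd(a, b), children_rate)
# Stones taken during one full period: 233 here.
stones_per_period = sum(period // child for child in children_rate)

# Skip as many whole periods as possible in one step ...
full_periods = bucket // stones_per_period
minutes = full_periods * period
bucket -= full_periods * stones_per_period

# ... and simulate only the remaining stones minute by minute.
while bucket > 0:
    minutes += 1
    for child in children_rate:
        if minutes % child == 0:
            bucket -= 1
            if bucket == 0:
                break

print('bucket empty in', minutes, 'minutes')

One caveat: with many children whose periods share few common factors, the least common multiple can grow larger than the answer itself, in which case no full periods are skipped and this falls back to the original simulation; the binary search in the other answers does not have that limitation.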

+1








