What is the time complexity of these three loops?

Given:

  • input array A[1...n]

  • N = length of A

Algorithm:

for(int i=N; i>0; i--) { // Loop 1
    for(int j=1; j<N; j=j*2) { // Loop 2
        for(int k=0; k<j; k++) { // Loop 3
            // constant number of operations
        }
    }
}


I know that loop 1 has O(N) time complexity. For loop 2, I would say the time complexity is O(logN). What is the complexity of loop 3 (and of loop 2, if I'm wrong), and of the whole algorithm?



2 answers


O(N^2)

Because of the interdependence of j and k (the condition k < j), you cannot consider loop 2 and loop 3 separately. Let N = 2^m for simplicity of calculation. Then j takes the values 1, 2, 4, ..., 2^(m-1), and loop 3 runs j times each time it is reached. So the combined work of loop 2 and loop 3 is

  1   + 2   + 4   + ... + 2^(m-1)
= 2^0 + 2^1 + 2^2 + ... + 2^(m-1)


This is a geometric progression summing to 2^m - 1 = N - 1, which is O(N). Now factor in loop 1, which runs N times, and we get O(N^2).
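Spelled out, for N = 2^m the total operation count is

  total = N * (1 + 2 + ... + 2^(m-1))
        = N * (2^m - 1)
        = N * (N - 1)
        = N^2 - N

which is where the expected count of N^2 - N in the test below comes from.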

Edit:



Here is the Perl code I ran to test this:

print "N\tExpected\tCounted\n";

for my $N (10, 100, 1024, 8192)
{
    my $count = 0;
    for(my $i=$N; $i>0; $i--)
    {
        for(my $j=1; $j<$N; $j*=2)
        {
            for(my $k=0; $k<$j; $k++)
            {
                $count++;
            }
        }
    }
    my $expected_count = $N*$N - $N;
    print "$N\t$expected_count\t$count\n";
}


And the output is:

N       Expected        Counted
10      90              150
100     9900            12700
1024    1047552         1047552
8192    67100672        67100672


Note that we do not get exactly N^2 - N unless N = 2^m.
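For general N, the values taken by j are 1, 2, ..., 2^(ceil(log2 N) - 1), so each pass of loop 1 performs

  1 + 2 + ... + 2^(ceil(log2 N) - 1) = 2^(ceil(log2 N)) - 1

inner iterations, and the exact total is N * (2^(ceil(log2 N)) - 1): for N=10 that is 10 * 15 = 150, and for N=100 it is 100 * 127 = 12700, matching the counted column above. Since 2^(ceil(log2 N)) < 2N, the total is still O(N^2).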



For the k loop: it runs 2, 4, ..., up to N or N-1 times depending on whether N is even or odd, so the time complexity for the three loops would be O(n · logn · n).

Edit: Thanks to Degustaf. I got this wrong: the k loop depends on the j loop, so the time complexity of the for(j) loop is max(O(n), O(logn)) = O(n).



Edit again: Considering loop j and loop k as a whole, they execute 2 + 4 + ... + N times in total, with about logN terms, so the time complexity of loop j and loop k together is what Degustaf writes: O(N).

The reason I originally thought max(O(n), O(logn)) is this: by the time the k loop runs N times, the for(j) loop has already executed logN - 1 times. In the earlier iterations the running time of the k loop is much smaller, e.g. N/2, N/4, ..., 2, so the final N-iteration run of the k loop can be viewed as happening only after the j loop has executed (logN - 1) times.
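As a quick sanity check of that O(N) bound, here is a minimal C sketch (the test sizes are chosen arbitrarily) that counts the iterations of loop j and loop k alone, i.e. a single pass of the outer loop:

#include <stdio.h>

int main(void) {
    int sizes[] = {10, 100, 1024, 8192}; // arbitrary test values of N
    for (int t = 0; t < 4; t++) {
        int N = sizes[t];
        long count = 0;
        // loop j and loop k only, one pass of the outer loop
        for (int j = 1; j < N; j *= 2) {
            for (int k = 0; k < j; k++) {
                count++; // stands in for the constant-time body
            }
        }
        printf("N=%d: %ld inner iterations (less than 2*N = %d)\n",
               N, count, 2 * N);
    }
    return 0;
}

In every case the count stays below 2N, consistent with loop j and loop k together being O(N).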
