How to write a recurrence relation for a given piece of code

In my algorithms and data structures class, we were given several recurrence relations, either to solve or to determine the complexity of an algorithm.

At first I thought the sole purpose of these relations was to write down the complexity of a recursive divide-and-conquer algorithm. Then I came across a question in an MIT assignment that asked for a recurrence relation for an iterative algorithm.

How do I actually come up with a recurrence relation for a given piece of code? What are the required steps?

Is it correct that I can write down any case, for example worst, best, or average case, with this approach?

Can anyone give a simple example of how a piece of code turns into a recurrence relation?

Cheers, Andrew



2 answers


Okay, so in algorithm analysis, a recurrence relation is a function that relates the amount of work needed to solve a problem of size n to the work needed to solve smaller problems (this is closely related to its meaning in mathematics).

For example, consider the Fibonacci function below:

Fib(a) 
{
  if(a==1 || a==0)
    return 1;
  return Fib(a-1) + Fib(a-2);
}

This does three operations (compare, compare, add) and also calls itself recursively. So the recurrence relation is T(n) = 3 + T(n-1) + T(n-2). To solve it, you can use the iteration method: start expanding terms until you find the pattern. For this example, expanding T(n-1) gives T(n) = 6 + 2*T(n-2) + T(n-3). Then expanding T(n-2) gives T(n) = 12 + 3*T(n-3) + 2*T(n-4). Expanding T(n-3) once more gives T(n) = 21 + 5*T(n-4) + 3*T(n-5). Notice that the coefficient on the first T term follows the Fibonacci sequence, and the constant term is three times the sum of the Fibonacci numbers seen so far, i.e. 3*(Fib(n+2)-1) when fully expanded. More importantly, notice that the sequence grows exponentially; i.e., the complexity of the algorithm is O(2^n).
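
If you want to convince yourself of the pattern, you can evaluate the recurrence directly. Here is a minimal C++ sketch (my own addition, not part of the analysis above; it assumes the base case costs 2 comparisons, which is my choice of constant):

#include <iostream>

// Evaluate T(n) = 3 + T(n-1) + T(n-2) literally.
long long T(int n) {
    if (n <= 1) return 2;            // assumed base-case cost: the two comparisons
    return 3 + T(n - 1) + T(n - 2);  // compare, compare, add, plus both recursive calls
}

int main() {
    for (int n = 2; n <= 25; n++)
        std::cout << "T(" << n << ") = " << T(n) << "\n";
    // The ratio T(n)/T(n-1) settles near the golden ratio (~1.618):
    // the work grows exponentially, bounded above by O(2^n).
}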

Then consider this function for merge sort:



MergeSort(ary)
{
  n = length(ary);
  if (n <= 1)              // base case: a 0- or 1-element array is already sorted
    return ary;

  ary_start = MergeSort(ary[0 : n/2]);    // recursively sort the first half
  ary_end = MergeSort(ary[n/2 : n]);      // recursively sort the second half

  return MergeArrays(ary_start, ary_end); // merge the sorted halves in O(n)
}

This function calls itself twice, each time on half of the input, and then merges the two halves (doing O(n) work). That is, T(n) = T(n/2) + T(n/2) + O(n). To solve recurrences of this type, you can use the Master Theorem; by that theorem, this one solves to T(n) = O(n log n).
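
To check the Master Theorem's answer numerically, you can count the merge work directly. A small sketch (the cost model of one unit per element merged is my simplification):

#include <cstdio>
#include <cmath>

// Count work for T(n) = 2T(n/2) + n, charging 1 unit per element merged.
long long work(long long n) {
    if (n <= 1) return 0;                     // a 0- or 1-element array needs no merging
    return work(n / 2) + work(n - n / 2) + n; // two halves, then an O(n) merge
}

int main() {
    for (long long n = 2; n <= (1 << 20); n *= 2)
        printf("n=%8lld  work=%10lld  n*log2(n)=%12.0f\n",
               n, work(n), n * log2((double)n));
    // For powers of two the two columns match exactly: T(n) = O(n log n).
}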

Finally, consider this iterative function for calculating Fibonacci:

Fib2(n)
{
  two = one = 1;
  for(i from 2 to n)
  {
    temp = two + one;
    one = two;
    two = temp;
  }
  return two;
}

This function never calls itself, and it iterates O(n) times. Therefore its recurrence relation is T(n) = O(n). This is the case you asked about: a special case of a recurrence relation with no recursive term, which makes it very easy to solve.
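
If you do want an explicit recurrence for the loop, note that each iteration does O(1) work and leaves one fewer iteration to go, so T(n) = T(n-1) + O(1). You can make that visible by rewriting the loop as recursion; a sketch (fib2Helper is a name I made up):

// Each call does O(1) work and recurses on a problem one step smaller,
// so T(n) = T(n-1) + O(1), which solves to T(n) = O(n).
int fib2Helper(int i, int n, int one, int two) {
    if (i > n) return two;                        // the loop has finished
    return fib2Helper(i + 1, n, two, one + two);  // one loop iteration
}

int Fib2(int n) {
    return fib2Helper(2, n, 1, 1);  // mirrors: two = one = 1; for(i from 2 to n)
}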



To find the running time of an algorithm, we first write an expression for the algorithm in which each step is annotated with its execution time; to build that expression, you walk through each step of the algorithm. For example, suppose we define a predicate isSorted that takes an array a and its size n as input and returns true if and only if the array is sorted in ascending order.

bool isSorted(int *a, int n) {
   if (n == 1)
      return true;       // a 1-element array is always sorted
   for (int i = 0; i < n-1; i++) {
      if (a[i] > a[i+1]) // found two adjacent elements out of order
         return false;
   }
   return true;         // everything in sorted order
}

Clearly the size of the input here is just n, the size of the array. How many steps would the algorithm take, in the worst case, on an input of size n?

The first if statement counts as 1 step.

The for loop executes n-1 times in the worst case (assuming the inner test doesn't kick us out early), for a total of n-1 steps for the loop test and index increment.

Inside the loop, the inner if statement executes once per iteration, for n-1 steps in the worst case.

The final return statement executes once.

So in the worst case we do 1 + (n-1) + (n-1) + 1 steps, for a total running time of T(n) ≤ 1 + (n-1) + (n-1) + 1 = 2n, and so the timing function is T(n) = O(n).
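
You can sanity-check this count by instrumenting the function; a sketch (the steps counter and what counts as one step are my choices). Note that an already-sorted array is the worst case here, since the loop never exits early:

#include <iostream>
#include <vector>

long long steps = 0;  // counts the "steps" exactly as tallied above

bool isSortedCounted(int *a, int n) {
    steps++;                          // the first if
    if (n == 1) return true;
    for (int i = 0; i < n - 1; i++) {
        steps++;                      // loop test and index increment
        steps++;                      // the inner if
        if (a[i] > a[i + 1]) return false;
    }
    steps++;                          // the final return
    return true;
}

int main() {
    for (int n : {10, 100, 1000}) {
        std::vector<int> a(n);
        for (int i = 0; i < n; i++) a[i] = i;  // sorted input = worst case
        steps = 0;
        isSortedCounted(a.data(), n);
        std::cout << "n=" << n << "  steps=" << steps << "  2n=" << 2 * n << "\n";
    }
}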

So, in a nutshell, what we did is:



1. For a parameter 'n' giving the size of the input, assume that each simple statement executed once takes constant time; for simplicity, count it as 1 step.

2. Iterative statements such as loops (together with their bodies) take variable time depending on the input: count the number of iterations times the cost of one iteration.

3. Go through the code step by step and write the total as a function of n to calculate the time complexity.

For recursive algorithms, you do the same thing, only this time you also add the time taken by each recursive call, expressed as a function of the size of the input it is given. For example, let's rewrite isSorted as a recursive algorithm:

bool isSorted(int *a, int n) {
   if (n == 1)
      return true;
   if (a[n-2] > a[n-1])         // are the last two elements out of order?
      return false;
   else
      return isSorted(a, n-1); // is the initial part of the array sorted?
}

In this case, we again walk through the algorithm, counting: 1 step for the first if, plus 1 step for the second if, plus the time isSorted takes on an input of size n-1, which is T(n-1). This gives the recurrence relation

T(n) ≤ 1 + 1 + T(n-1) = T(n-1) + O(1)

which has the solution T(n) = O(n), just like the non-recursive version.
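
To see why, unroll the recurrence, writing c for the constant hidden in the O(1) (and using T(1) = 1 for the base case, per the count above):

T(n) ≤ T(n-1) + c
     ≤ T(n-2) + 2c
     ≤ ...
     ≤ T(1) + (n-1)c
     = 1 + (n-1)c = O(n)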

That's it in a nutshell! Practice writing recurrence relations for various algorithms, keeping in mind how many times each step is executed.
