# Class 22: Analyzing Recursive Algorithms

Back to Algorithm Analysis, Revisited. On to Arrays.

Held Tuesday, February 29, 2000

Overview

Today we continue our discussion of algorithm analysis and recursion by visiting other recursive algorithms and considering analysis techniques.

Notes

• The syllabus is quite screwed up by the recent changes. Don't worry about due dates and readings except as mentioned in each day's class.
• Reminders:
  • Project, Phase 2 is due tomorrow. Are there questions on what you need to get done?
• Assigned:
  • Read Lab J6 for tomorrow's class.

Summary

• Exponentiation, revisited
• Techniques for analyzing recursive functions
• Fibonacci numbers.

## Simple Recursive Exponentiation

• In the previous class, we decided that we could implement exponentiation using the following rule:
```
exp(x,0) is 1
exp(x,n) is x * exp(x, n-1)
```
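• Rendered as Java, the rule above might look like the following sketch. (The method name `expB` comes from the notes below; the class name `SimpleExp` is my own, for illustration.)
```java
/**
 * A direct translation of the recursive exponentiation rule.
 * (Class name is hypothetical; the course's lab code may differ.)
 */
public class SimpleExp {
  /** Compute x^n for n >= 0 using exp(x,n) = x * exp(x,n-1). */
  public static double expB(double x, int n) {
    if (n == 0) {
      return 1;                  // exp(x,0) is 1
    }
    return x * expB(x, n - 1);   // exp(x,n) is x * exp(x,n-1)
  }

  public static void main(String[] args) {
    System.out.println(expB(2, 10)); // 1024.0
  }
}
```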
• But how do we analyze the running time of this algorithm?
• One technique that is typically fruitful (although somewhat mathematical) is to define a function, fb(n), that represents the number of steps the algorithm takes on input of size n.
• We know that when n is 0, the function takes a constant number of steps.
• fb(0) = b1
• When n is greater than 0, the function takes some constant number of steps before and after the recursive call plus the time for the recursive call.
• We'll call the constant extra steps b2
• How do we determine the number of steps for the recursive call (`expB(x,n-1)`)? Hmmm ... Wait! fb is supposed to tell us that!
• fb(n) = b2 + fb(n-1)
• We've defined the running time recursively, too. That doesn't help us much if we want to do some Big-O analysis. So let's figure out what fb really is.
• Let's start by repeatedly applying our recursive rule ...
• fb(n) = b2 + fb(n-1)
• = b2 + b2 + fb(n-2)
• = b2 + b2 + b2 + fb(n-3)
• We might then generalize
• fb(n) = k*b2 + fb(n-k)
• Can we ever get rid of the recursive term? Yes! When n = k, n-k is 0, and we know the value of fb(0).
• fb(n) = n*b2 + b1
• As you might be able to tell, that's just O(n).
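• One way to check the fb(n) = n*b2 + b1 result empirically is to count the recursive invocations. In this instrumented sketch (the counter is my addition, not part of the course code), a call with exponent n makes exactly n+1 invocations, confirming linear growth:
```java
/** expB with a call counter, to illustrate the O(n) analysis. */
public class CountExpB {
  static int calls = 0; // number of invocations of expB

  public static double expB(double x, int n) {
    calls++;                     // one constant-work invocation per value of n
    if (n == 0) {
      return 1;
    }
    return x * expB(x, n - 1);
  }

  public static void main(String[] args) {
    calls = 0;
    expB(2, 10);
    System.out.println(calls); // 11, i.e. n+1: one call for each n down to 0
  }
}
```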

## Generalizing the Technique

• The technique we just used is a good one for analyzing the running time of recursive functions, so let's generalize.
• Here's how I think of it:
• Start by defining a function, f, that represents the running time of your method.
• Determine the value of f for the base cases.
• Determine the value of f for the recursive cases. This will normally be a recursive function definition.
• Try expanding the right-hand side of the recursive definition a few times.
• Eventually, you should see a pattern, so generalize.
• How do you see the patterns? Mostly practice and luck.

## Algorithm Design: Divide and Conquer

• As you may have noted, binary search is much faster than sequential search.
• How did we make binary search run faster? We took a particular approach to the data:
• We broke the input into half
• We recursed on one half.
• The idea of breaking the input in half will be useful in developing algorithms to solve a variety of problems. It is a key part of your ``programmer's toolbox''. Make sure to consider whether it is appropriate each time you visit a problem.
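• As a reminder of how the halving strategy plays out, here is a sketch of recursive binary search (the names are my assumptions; the course's version may differ). Each call discards half of the remaining range:
```java
/** A recursive divide-and-conquer binary search sketch. */
public class Searcher {
  /**
   * Find the index of target in sorted[lo..hi] (inclusive),
   * or -1 if it is absent. Requires that sorted is in increasing order.
   */
  public static int binarySearch(int[] sorted, int target, int lo, int hi) {
    if (lo > hi) {
      return -1;                    // empty range: not found
    }
    int mid = lo + (hi - lo) / 2;   // midpoint, written to avoid overflow
    if (sorted[mid] == target) {
      return mid;
    } else if (sorted[mid] < target) {
      return binarySearch(sorted, target, mid + 1, hi); // recurse on right half
    } else {
      return binarySearch(sorted, target, lo, mid - 1); // recurse on left half
    }
  }

  public static void main(String[] args) {
    int[] values = { 1, 3, 5, 7, 9, 11 };
    System.out.println(binarySearch(values, 7, 0, values.length - 1)); // 3
  }
}
```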

## Exponentiation with Divide and Conquer

• Can we use divide-and-conquer to solve the exponentiation problem (or to improve our solution)?
• What can we divide in half? (What gives the ``size'' of the problem?)
• Here's another version of exponentiation, based on a divide-and-conquer strategy.
```
  /**
   * Compute x^n by a recursive divide-and-conquer algorithm.
   */
  public static double expC(double x, int n) {
    // Handle negative exponents.  x^(-n) = 1/(x^n)
    if (n < 0) {
      return 1/expC(x,-n);
    } // if (n < 0)

    // Base case: x^0 is 1.
    if (n == 0) {
      return 1;
    } // Base case: n == 0

    // Recursive case (when n is odd)
    //   x*x*...*x = x*(x*...*x)
    //   That is: x^n = x*(x^(n-1))
    else if (n % 2 == 1) {
      return x * expC(x,n-1);
    } // Recursive case (when n is odd)

    // Recursive case (when n is even)
    //   Let n be 2k
    //   x^n = x^(2k) = x^k * x^k
    else {
      int k = n/2;
      double tmp = expC(x,k);
      return tmp*tmp;
    } // Recursive case (when n is even)
  } // expC
```
• We'll use the same technique of ``define a function whose value is the number of steps the algorithm takes''. This time, we'll call the function fc.
• This time, life is a little harder because we have two different recursive cases. (Odd and Even).
• Fortunately, the odd case transforms into the even case. That is, if n is odd, then on the next recursive call it will be even.
• In effect, by choosing a bigger value for the ``constant'' work, we can pretend that n is divided by two in each recursive call.
• If it makes it easier, think about replacing the lines that read
```
    else if (n % 2 == 1) {
      return x * expC(x,n-1);
    } // Recursive case (when n is odd)
```
with
```
    else if (n % 2 == 1) {
      double tmp = expC(x, (n-1)/2);
      return x * tmp * tmp;
    } // Recursive case (when n is odd)
```
• So
• fc(0) = c1
• fc(1) = c2
• fc(n) = c3 + fc(n/2)
• What function is this? Again, we will try some analysis by hand to figure it out.
• A few applications of the recursive rule
• fc(n)
• = c3 + fc(n/2)
• = c3 + c3 + fc(n/4)
• = c3 + c3 + c3 + fc(n/8)
• Can we generalize? Certainly.
• fc(n) = k*c3 + fc(n/2^k)
• When can we get rid of the recursive term? When n/2^k is 1.
• That is, k is ``the power to which you must raise 2 in order to get n''.
• Fortunately, there's a name for that value. We call it log2(n).
• Hence,
• fc(n) = log2(n)*c3 + c2
• That is, this algorithm is an O(log2(n)) algorithm.
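• We can corroborate the bound by counting multiplications in `expC`. In this instrumented sketch (the counter is my addition, not part of the class code), computing x^100 performs only 9 multiplications, where the simple recursive version performs 100:
```java
/** expC with a multiplication counter, to illustrate the O(log2(n)) analysis. */
public class CountExpC {
  static int mults = 0; // number of double multiplications performed

  public static double expC(double x, int n) {
    if (n < 0) {
      return 1 / expC(x, -n);    // x^(-n) = 1/(x^n)
    }
    if (n == 0) {
      return 1;                  // base case: no multiplications
    } else if (n % 2 == 1) {
      mults++;                   // one multiplication in the odd case
      return x * expC(x, n - 1);
    } else {
      double tmp = expC(x, n / 2);
      mults++;                   // one multiplication in the even case
      return tmp * tmp;
    }
  }

  public static void main(String[] args) {
    mults = 0;
    expC(2, 100);
    System.out.println(mults); // 9 multiplications instead of 100
  }
}
```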

## Fibonacci

• We can compute Fibonacci numbers
• iteratively, starting at the bottom and working our way up
• recursively, using the rule that fib(n) = fib(n-1) + fib(n-2)
• The iterative Fibonacci calculation is fairly easy to analyze. Since we step through n different Fibonacci numbers and do a constant amount of calculation for each, that algorithm is O(n).
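• A minimal iterative sketch (assuming the convention fib(0) = 0 and fib(1) = 1; the names are mine) does a constant amount of work per step of the loop:
```java
/** An iterative, O(n) Fibonacci sketch. */
public class Fib {
  /** Compute the nth Fibonacci number, with fib(0) = 0 and fib(1) = 1. */
  public static long fibIterative(int n) {
    long prev = 0; // fib(0)
    long curr = 1; // fib(1)
    if (n == 0) {
      return prev;
    }
    for (int i = 2; i <= n; i++) {
      long next = prev + curr; // fib(i) = fib(i-1) + fib(i-2)
      prev = curr;
      curr = next;
    }
    return curr;
  }

  public static void main(String[] args) {
    System.out.println(fibIterative(10)); // 55
  }
}
```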
• The recursive version is much harder. Again, we'll use a function to represent the number of steps the algorithm takes on input of size n. We'll just call this function f.
• Two base cases:
• f(0) = d1
• f(1) = d2
• One recursive case
• f(n) = d3 + f(n-1) + f(n-2)
• Hmmm ... this is surprisingly similar to the function that defines Fibonacci.
• Thus, the number of steps to compute the nth Fibonacci number is more than the value of the nth Fibonacci number.
• This function grows fairly quickly, but how quickly?
• We'll ignore the d3 to make our life easier. (No, you can't always do so, but it seems safe in this case.)
• We'll take advantage of the observation that f(n-2) <= f(n-1)
• While the observation is fairly obvious, the decision to use it is not. You should not expect to have come up with it on your own.
• So, let's try applying that observation
• f(n) = f(n-1) + f(n-2)
• f(n) <= f(n-1) + f(n-1)
• f(n) <= 2*f(n-1)
• Okay, now we can try applying that rule a few times.
• f(n) <= 2*f(n-1)
• f(n) <= 2*2*f(n-2)
• f(n) <= 2*2*2*f(n-3)
• Can we generalize? Sure
• f(n) <= 2^k * f(n-k)
• Does this lead to a natural answer? Let k = n.
• f(n) <= 2^n * d1
• f(n) is in O(2^n)
• This may be painfully slow (as you've seen)
• But this is an upper bound, and it may be much too large. Let's try a lower bound.
• Again, we'll use the relationship between f(n-1) and f(n-2), but in the opposite direction.
• f(n) = f(n-1) + f(n-2)
• f(n) >= f(n-2) + f(n-2)
• f(n) >= 2*f(n-2)
• We can now apply that rule a few times
• f(n) >= 2*f(n-2)
• f(n) >= 2*2*f(n-4)
• f(n) >= 2*2*2*f(n-6)
• Generalizing,
• f(n) >= 2^k * f(n-2k)
• When k = n/2, we're done
• f(n) >= 2^(n/2) * d1
• This is a lower bound, rather than an upper bound. Hence, we cannot use Big-O. Instead, we use Big-Omega (Ω) for lower bounds.
• f(n) is in Ω(2^(n/2)).
• This is still painfully slow.
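• The exponential blowup is easy to observe by counting calls in the naive recursive version (the counter is an illustration of mine, not from the original notes); the call count itself grows like a Fibonacci number:
```java
/** Naive recursive Fibonacci with a call counter, to show exponential growth. */
public class CountFib {
  static long calls = 0; // total invocations of fib

  public static long fib(int n) {
    calls++;
    if (n <= 1) {
      return n; // fib(0) = 0, fib(1) = 1
    }
    return fib(n - 1) + fib(n - 2); // one call per term in the sum
  }

  public static void main(String[] args) {
    calls = 0;
    fib(10);
    System.out.println(calls); // 177 calls for n = 10
    calls = 0;
    fib(20);
    System.out.println(calls); // 21891 calls for n = 20
  }
}
```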

## History

Tuesday, 18 January 2000

• Created as a blank outline.

Tuesday, 29, February 2000

• Filled in the details. Many were taken from the previous outline, and previously from outline 22 of CSC152 99F.
• Added new section generalizing the analysis technique.


Disclaimer Often, these pages were created "on the fly" with little, if any, proofreading. Any or all of the information on the pages may be incorrect. Please contact me if you notice errors.