Held Tuesday, February 29, 2000
Overview
Today we continue our discussion of algorithm analysis and recursion
by visiting other recursive algorithms and considering analysis
techniques.
Notes
- The syllabus is quite screwed up
by the recent changes. Don't worry about due dates and readings except
as mentioned in each day's class.
- Reminders:
- Project, Phase 2 is due tomorrow. Are there questions on what you
need to get done?
- Assigned:
- Read Lab J6 for tomorrow's class.
Contents
Summary
- Exponentiation, revisited
- Techniques for analyzing recursive functions
- Fibonacci numbers
- In the previous class, we decided that we could implement
exponentiation using the following rule:
expB(x,0) is 1
expB(x,n) is x * expB(x, n-1)
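The rule above might be sketched in Java as follows. The method name expB (chosen to match the analysis function f_{b} and the code later in this outline) and the negative-exponent case are my additions, not part of the rule as stated:

```java
public class ExpB {
  /**
   * Compute x^n with one multiplication per recursive call,
   * so the work grows linearly with n.
   */
  public static double expB(double x, int n) {
    // Handle negative exponents: x^(-n) = 1/(x^n).
    if (n < 0) {
      return 1 / expB(x, -n);
    }
    // Base case: x^0 is 1.
    if (n == 0) {
      return 1;
    }
    // Recursive case: x^n = x * x^(n-1).
    return x * expB(x, n - 1);
  }
}
```

For example, expB(2.0, 10) makes ten recursive calls, one per factor of x.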
- But how do we analyze the running time of this algorithm?
- One technique that is typically fruitful (although somewhat
mathematical) is to define a function, f_{b}(n), that
represents the number of steps the algorithm takes on input
of size n.
- What can we say about this function?
- We know that when n is 0, the function takes a constant number of
steps. We'll call that constant b_{1}.
- When n is greater than 0, the function takes some constant
number of steps before and after the recursive call plus
the time for the recursive call.
- We'll call the constant extra steps b_{2}
- How do we determine the number of steps for the recursive call
(expB(x,n-1))?
Hmmm ... Wait! f_{b} is supposed to tell us that!
- f_{b}(n) = b_{2} + f_{b}(n-1)
- We've defined the running time recursively, too. That doesn't
help us much if we want to do some Big-O analysis. So let's
figure out what f_{b} really is.
- Let's start by repeatedly applying our recursive rule ...
- f_{b}(n)
- = b_{2} + f_{b}(n-1)
- = b_{2} + b_{2} + f_{b}(n-2)
- = b_{2} + b_{2} + b_{2} + f_{b}(n-3)
- We might then generalize
- f_{b}(n) = k*b_{2} + f_{b}(n-k)
- Can we ever get rid of the recursive term? Yes! When n = k,
n-k is 0, and we know the value of f_{b}(0).
- f_{b}(n) = n*b_{2} + b_{1}
- As you might be able to tell, that's just O(n).
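As a sanity check on the algebra, a short program (the names fb and fbClosed are mine) can confirm that unrolling the recurrence f_{b}(0) = b_{1}, f_{b}(n) = b_{2} + f_{b}(n-1) really does give n*b_{2} + b_{1} for arbitrary constants:

```java
public class CheckFB {
  // Evaluate the recurrence directly: f(0) = b1; f(n) = b2 + f(n-1).
  public static long fb(long n, long b1, long b2) {
    if (n == 0) {
      return b1;
    }
    return b2 + fb(n - 1, b1, b2);
  }

  // Closed form derived above: f(n) = n*b2 + b1.
  public static long fbClosed(long n, long b1, long b2) {
    return n * b2 + b1;
  }
}
```

For instance, with b_{1} = 3 and b_{2} = 7, both forms give 31 when n = 4.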
- The technique we just used is a good one for analyzing the running
time of recursive functions, so let's generalize.
- Here's how I think of it:
- Start by defining a function, f, that represents the running
time of your method.
- Determine the value of f for the base cases.
- Determine the value of f for the recursive cases.
This will normally be a recursive function definition.
- Try expanding the right-hand side of the recursive definition a
few times.
- Eventually, you should see a pattern, so generalize.
- Check your pattern.
- How do you see the patterns? Mostly practice and luck.
- As you may have noted, binary search is much faster than sequential
search.
- How did we make binary search run faster? We took a particular
approach to the data:
- We broke the input in half.
- We recursed on one half.
- The idea of breaking the input in half will be useful in developing
algorithms to solve a variety of problems. It is a key part of
your ``programmer's toolbox''. Make sure to consider whether it
is appropriate each time you visit a problem.
- Can we use divide-and-conquer to solve the exponentiation problem
(or to improve our solution)?
- What can we divide in half? (What gives the ``size'' of the
problem?)
- Here's another version of exponentiation, based on a divide-and-conquer
strategy.
/**
 * Compute x^n by a recursive divide-and-conquer algorithm.
 */
public static double expC(double x, int n) {
  // Handle negative exponents. x^(-n) = 1/(x^n)
  // (expB is the linear recursive version from the previous class.)
  if (n < 0) {
    return 1/expB(x,-n);
  } // if (n < 0)
  // Base case: x^0 is 1.
  if (n == 0) {
    return 1;
  } // Base case: n == 0
  // Recursive case (when n is odd)
  //   x*x*...*x = x*(x*...*x)
  //   That is: x^n = x*(x^(n-1))
  else if (n % 2 == 1) {
    return x * expC(x,n-1);
  } // Recursive case (when n is odd)
  // Recursive case (when n is even)
  //   Let n be 2k
  //   x^n = x^(2k) = x^k * x^k
  else {
    int k = n/2;
    double tmp = expC(x,k);
    return tmp*tmp;
  } // Recursive case (when n is even)
} // expC
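To see the logarithmic behavior concretely, here is a sketch of the same algorithm restricted to nonnegative exponents, with a multiplication counter added (the counter and the class name are my additions):

```java
public class ExpCCount {
  // Number of multiplications performed since the counter was reset.
  public static int mults = 0;

  /** Compute x^n (n >= 0) by divide-and-conquer, counting multiplications. */
  public static double expC(double x, int n) {
    if (n == 0) {
      return 1;            // Base case: x^0 is 1.
    } else if (n % 2 == 1) {
      mults++;             // One multiplication in the odd case.
      return x * expC(x, n - 1);
    } else {
      int k = n / 2;
      double tmp = expC(x, k);
      mults++;             // One multiplication in the even case.
      return tmp * tmp;    // x^(2k) = x^k * x^k.
    }
  }
}
```

For n = 1024, this performs only 11 multiplications (ten halvings plus one odd step at n = 1), versus 1024 for the linear version.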
- We'll use the same technique of ``define a function whose value is
the number of steps the algorithm takes''. This time, we'll
call the function f_{c}.
- This time, life is a little harder because we have two different
recursive cases. (Odd and Even).
- Fortunately, the odd case transforms into the even case. That
is, if n is odd, then on the next recursive call it will be even.
- In effect, by choosing a bigger value for the ``constant'' work,
we can pretend that n is divided by two in each recursive call.
- If it makes it easier, think about replacing the lines that read
  else if (n % 2 == 1) {
    return x * expC(x,n-1);
  } // Recursive case (when n is odd)
with
  else if (n % 2 == 1) {
    double tmp = expC(x, (n-1)/2);
    return x * tmp * tmp;
  } // Recursive case (when n is odd)
- So
- f_{c}(0) = c_{1}
- f_{c}(1) = c_{2}
- f_{c}(n) = c_{3} + f_{c}(n/2)
- What function is this? Again, we will try some analysis by hand
to figure it out.
- A few applications of the recursive rule
- f_{c}(n)
- = c_{3} + f_{c}(n/2)
- = c_{3} + c_{3} + f_{c}(n/4)
- = c_{3} + c_{3} + c_{3} + f_{c}(n/8)
- Can we generalize? Certainly.
- f_{c}(n) = k*c_{3} + f_{c}(n/2^{k})
- When can we get rid of the recursive term?
When n/2^{k} is 1.
- That is, k is ``the power to which you must raise 2 in order
to get n''.
- Fortunately, there's a name for that value. We call it
log_{2}(n).
- Hence,
- f_{c}(n) = log_{2}(n)*c_{3} + c_{2}
- That is, this algorithm is an O(log_{2}(n)) algorithm.
- We can compute Fibonacci numbers
- iteratively, starting at the bottom and working our way up
- recursively, using the rule that fib(n) = fib(n-1) + fib(n-2)
- The iterative Fibonacci calculation is fairly easy to analyze.
Since we step through n different Fibonacci numbers and do a
constant amount of calculation for each, that algorithm is
O(n).
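An iterative version of the sort described above might look like this, assuming the usual convention fib(0) = 0, fib(1) = 1 (the method name fibIter is mine):

```java
public class FibIter {
  /** Compute the nth Fibonacci number bottom-up, in O(n) time. */
  public static long fibIter(int n) {
    long prev = 0;   // fib(0)
    long curr = 1;   // fib(1)
    if (n == 0) {
      return prev;
    }
    // Each pass through the loop advances one position in the sequence,
    // doing a constant amount of work, so n positions take O(n) steps.
    for (int i = 2; i <= n; i++) {
      long next = prev + curr;
      prev = curr;
      curr = next;
    }
    return curr;
  }
}
```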
- The recursive version is much harder. Again, we'll use a
function to represent the number of steps the algorithm takes
on input of size n. We'll just call this function f.
- Two base cases:
- f(0) = d_{1}
- f(1) = d_{2}
- One recursive case
- f(n) = d_{3} + f(n-1) + f(n-2)
- Hmmm ... this is surprisingly similar to the function that
defines Fibonacci.
- Thus, the number of steps to compute the nth Fibonacci number
is more than the value of the nth Fibonacci number.
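A quick experiment (the call counter is my own instrumentation) makes this concrete: counting the calls made by the naive recursive version shows the count growing faster than fib(n) itself.

```java
public class FibCount {
  // Total number of calls made since the counter was reset.
  public static long calls = 0;

  /** Naive recursive Fibonacci, counting every call. */
  public static long fib(int n) {
    calls++;
    if (n == 0) {
      return 0;
    } else if (n == 1) {
      return 1;
    }
    // Each non-base call spawns two more calls.
    return fib(n - 1) + fib(n - 2);
  }
}
```

For n = 10, fib(10) is 55, but the function is called 177 times to compute it.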
- This function grows fairly quickly, but how quickly?
- We'll ignore the d_{3} to make our life easier.
(No, you can't always do so, but it seems safe in this case.)
- We'll take advantage of the observation that f(n-2) <= f(n-1)
- While the observation is fairly obvious, the decision to use it
is not. You should not expect to have come up with it on your own.
- So, let's try applying that observation
- f(n) = f(n-1) + f(n-2)
- f(n) <= f(n-1) + f(n-1)
- f(n) <= 2*f(n-1)
- Okay, now we can try applying that rule a few times.
- f(n) <= 2*f(n-1)
- f(n) <= 2*2*f(n-2)
- f(n) <= 2*2*2*f(n-3)
- Can we generalize? Sure.
- f(n) <= 2^{k} * f(n-k)
- Does this lead to a natural answer? Let k = n. Then
f(n) <= 2^{n} * f(0) = 2^{n} * d_{1}.
- f(n) is in O(2^{n})
- This may be painfully slow (as you've seen)
- But this is an upper bound, and it may be much too large. Let's
try a lower bound.
- Again, we'll use the relationship between f(n-1) and f(n-2), but
in the opposite direction.
- f(n) = f(n-1) + f(n-2)
- f(n) >= f(n-2) + f(n-2)
- f(n) >= 2*f(n-2)
- We can now apply that rule a few times
- f(n) >= 2*f(n-2)
- f(n) >= 2*2*f(n-4)
- f(n) >= 2*2*2*f(n-6)
- Generalizing,
- f(n) >= 2^{k} * f(n-2k)
- When k = n/2, we're done: f(n) >= 2^{n/2} * f(0) = 2^{n/2} * d_{1}
- This is a lower bound, rather than an upper bound. Hence,
we cannot use Big-O. Instead, we use Big-Omega for lower bounds:
f(n) is in Omega(2^{n/2}).
- This is still painfully slow.
Tuesday, 18 January 2000
- Created as a blank outline.
Tuesday, 29 February 2000
- Filled in the details. Many were taken from the previous outline,
and previously from
outline 22 of
CSC152 99F.
- Added new section generalizing the analysis technique.