Section 25.11 Recursive Function Calls
Our accounting of the work done by functions, in which each call is charged for the work done by the functions it calls, hits a snag when we analyze recursive functions. Consider the following simple recursive function that computes the factorial of a number:
int factorial(int n) {
    if (n <= 1) {
        return 1;
    } else {
        return n * factorial(n - 1);
    }
}
Everything in factorial looks like \(O(1)\) work except for the recursive call to factorial itself. So how much work does that call do? It does whatever work factorial does when called with n - 1. But that work includes another call to factorial, this time with n - 2, and that call includes yet another call to factorial, this time with n - 3. So how much work does factorial do?
This chain will not go on forever. Eventually, we will reach the base case, where n is 1 or less. At that point, the function simply returns 1 and makes no further recursive calls, so executing the if branch does \(O(1)\) work. In the else branch, we recursively call the function on \(n - 1\text{,}\) so the time for that branch will be whatever time it takes to compute factorial(n - 1), plus the \(O(1)\) work to multiply by n and return the result.
int factorial(int n) {
    if (n <= 1) {
        // O(1) work
        return 1;
    } else {
        // time for factorial(n - 1)
        // O(1) work to multiply and return
        return n * factorial(n - 1);
    }
}
To analyze the overall work of factorial, we can set up a recurrence relation. Let \(T(n)\) be the time it takes to compute factorial(n). Then we have:
\begin{equation*}
T(n) = \begin{cases}
O(1) & \text{if } n \leq 1 \\
T(n - 1) + O(1) & \text{if } n > 1
\end{cases}
\end{equation*}
Full coverage of solving recurrence relations is beyond the scope of this book, but we will give a brief overview of how to solve this one. Key to this process is recognizing that because \(T(n) = T(n - 1) + O(1)\text{,}\) anywhere we see \(T(a)\) for some value \(a\text{,}\) we can replace it with \(T(a - 1) + O(1)\text{.}\)
First, we will take \(T(n - 1)\) and expand it by replacing it with \(T((n - 1) - 1) + O(1)\text{:}\)
\begin{equation*}
T(n) = T(n - 1) + O(1)
\end{equation*}
becomes
\begin{gather*}
(T((n - 1) - 1) + O(1)) + O(1)\\
T(n - 2) + 2 \cdot O(1)
\end{gather*}
Next, we can take \(T(n - 2)\) and expand it by replacing it with \(T(n - 3) + O(1)\text{:}\)
\begin{gather*}
(T(n - 3) + O(1)) + 2 \cdot O(1)\\
T(n - 3) + 3 \cdot O(1)
\end{gather*}
Which could become:
\begin{equation*}
T(n - 4) + 4 \cdot O(1)
\end{equation*}
Then:
\begin{gather*}
T(n - 5) + 5 \cdot O(1)\\
\vdots
\end{gather*}
Continuing this process, we can see a pattern emerging. Each time we expand \(T(a)\text{,}\) we reduce the argument by 1 and add another \(O(1)\) term. At some point, we will reach \(T(1)\text{,}\) which is our base case. By then, we will have added \(n - 1\) copies of the \(O(1)\) term. So we can express \(T(n)\) as:
\begin{gather*}
T(1) + (n - 1) \cdot O(1)
\end{gather*}
From analyzing the base case, we know that when n is 1, the time to execute factorial(n) is \(O(1)\text{:}\)
int factorial(int n) {
    if (n <= 1) {
        // O(1) work
        return 1;
    }
    ...
So we can substitute \(O(1)\) for \(T(1)\) in our expression for \(T(n)\text{:}\)
\begin{gather*}
O(1) + (n - 1) \cdot O(1)
\end{gather*}
Because \((n - 1) \cdot O(1)\) is \(O(n)\text{,}\) this simplifies to:
\begin{equation*}
O(1) + O(n)
\end{equation*}
And further simplifies to:
\begin{equation*}
O(n)
\end{equation*}
Thus, the overall work of the factorial function is \(O(n)\text{.}\) That makes sense, since the function makes \(n - 1\) recursive calls before reaching the base case, and each call does \(O(1)\) work. (We can't always rely on intuition like that, but in this case it works out.)
For a moment, consider a non-recursive implementation of factorial:
int factorial(int n) {
    int total = 1;
    for (int i = 1; i <= n; i++) {
        total *= i;
    }
    return total;
}
Quick analysis tells us that this non-recursive implementation also has a time complexity of \(O(n)\text{,}\) since it uses a loop that iterates \(n\) times, performing \(O(1)\) work in each iteration. Because recursion and iteration are just two different ways to repeat work, and both of these algorithms are doing the same work (multiplying \(n\) numbers), it seems reasonable that they have the same time complexity.
Checkpoint 25.11.1.
If we call factorial(n), what is the expected number of recursive calls?
- \(O(1)\)
- \(O(\log n)\)
- \(O(n)\)
- \(O(n^2)\)
Checkpoint 25.11.2.
If we call the function listed below, what is the expected number of recursive calls?
int mysteryFunction(int n) {
    if (n <= 1) {
        return 1;
    } else {
        return mysteryFunction(n / 2) + 1;
    }
}
- \(O(1)\)
- \(O(\log n)\)
- \(O(n)\)
- \(O(n^2)\)