Insight 25.9.1.
A function call "costs" work equal to the work done by the function being called.
Consider this program:
#include <iostream>
using namespace std;

void printRow(int size) {
    for (int i = 0; i < size; i++) {
        cout << "*";
    }
    cout << endl;
}

void printGrid(int size) {
    for (int i = 0; i < size; i++) {
        for (int j = 0; j < size; j++) {
            cout << "*";
        }
        cout << endl;
    }
}

int main() {
    int size;
    cin >> size;
    printRow(size);
    printGrid(size);
    printRow(size * 2);
}
The value of size is passed into both functions. Each function uses that value to control one or more loops. So when we say \(n\) for this analysis, we will be talking about size. If we changed the value of size, that would change the amount of work done by both functions and thus the overall work of the program. (Since size is read from user input, its value truly can vary from run to run.)
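For instance, counting just the stars printed (and ignoring the endl calls, a simplification made here only for illustration): with size set to 10 the program prints \(10 + 10^2 + 2 \cdot 10 = 130\) stars, while with size set to 20 it prints \(20 + 20^2 + 2 \cdot 20 = 460\) stars. Doubling size roughly quadruples the output because the squared term dominates.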
printRow has a single loop that runs \(O(n)\) times and does constant work on each iteration. It also does a constant-time operation (the endl) outside of the loop. So the overall work of printRow is \(O(n)\text{.}\)
void printRow(int size) {
    // O(n) work where n = size
}
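One way to see this is with a rough operation count. Writing \(c_1\) for the constant work of one loop iteration and \(c_2\) for the constant work outside the loop (symbols introduced here just for illustration), the total work is \(c_1 \cdot n + c_2\text{,}\) and dropping the constants gives \(O(n)\text{.}\)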
printGrid has a nested loop. The outer loop runs \(O(n)\) times, and for each iteration of that loop the inner loop runs \(O(n)\) times, doing constant work each time. So the overall work of printGrid is \(O(n^2)\text{.}\)
void printGrid(int size) {
    // O(n^2) work where n = size
}
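Using the same illustrative constants as above, each of the \(n\) outer iterations does \(c_1 \cdot n + c_2\) work (the inner loop plus the endl), so the total is \(n \cdot (c_1 n + c_2) = c_1 n^2 + c_2 n\text{,}\) which simplifies to \(O(n^2)\text{.}\)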
Finally, consider main. A call to printRow does \(O(n)\) work where \(n\) is the value passed to the function. A call to printGrid does \(O(n^2)\) work where \(n\) is the value passed to the function. The first two function calls in main both pass in size, which is \(n\text{.}\) So they are doing \(O(n)\) and \(O(n^2)\) work respectively:
#include <iostream>
using namespace std;

void printRow(int size) {
    // O(n) work where n = size
}

void printGrid(int size) {
    // O(n^2) work where n = size
}

int main() {
    int size;
    cin >> size;
    // O(n) work for printRow(size)
    // O(n^2) work for printGrid(size)
    printRow(size * 2);
}
The last call to printRow passes in size * 2. So if size is \(n\text{,}\) then the function call does \(2 \cdot O(n)\) work. But that is still \(O(n)\) work. So that last call to printRow is also doing \(O(n)\) work:
#include <iostream>
using namespace std;

void printRow(int size) {
    // O(n) work where n = size
}

void printGrid(int size) {
    // O(n^2) work where n = size
}

int main() {
    // O(1) to get size
    // O(n) work for printRow(size)
    // O(n^2) work for printGrid(size)
    // O(n) work for printRow(size * 2)
}
main is dominated by the \(O(n^2)\) term, so it is \(O(n^2)\text{.}\) We would get the same result if we listed all the terms and then simplified: \(O(1) + O(n) + O(n^2) + O(n) = O(n^2)\text{.}\)

Now consider a slightly different version of the program:

int main() {
    int size;
    cin >> size;
    printRow(size);
    printGrid(10);
    printRow(size + 1);
}
This time the call to printGrid is passing in a constant value of 10. That function does \(O(n^2)\) work where \(n\) is the value passed in. But since we are passing in a constant value, that means the function is doing \(O(10^2)\) work, which is a constant and thus \(O(1)\text{.}\) No matter what value size has, the second function call is always doing the same amount of work. The last call to printRow passes in size + 1, which means \(O(n + 1)\) work; that is still just \(O(n)\text{.}\)
int main() {
    // O(1) to get size
    // O(n) work for printRow(size)
    // O(1) work for printGrid(10)
    // O(n) work for printRow(size + 1)
}
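Listing all the terms this time gives \(O(1) + O(n) + O(1) + O(n)\text{.}\)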
main is dominated by the \(O(n)\) terms, so it is \(O(n)\text{.}\)
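The remaining snippets call a function foo. As the explanations below state, assume foo does \(O(n)\) work where \(n\) is the value passed to it. As a minimal sketch of what such a function could look like (this particular body is just an illustrative assumption; any \(O(n)\) body would do):

void foo(int n) {
    // Does O(n) work, where n is the value passed in
    for (int i = 0; i < n; i++) {
        cout << "*";   // constant work per iteration
    }
    cout << endl;
}

With that assumption, consider this loop: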
for (int i = 0; i < n; i++) {
    foo(n);
}
Each iteration of the loop calls foo(n), which does \(O(n)\) work. The loop runs \(n\) times, so the overall work is \(n \cdot O(n) = O(n^2)\text{.}\)
What if the call inside the loop passes a constant instead?

for (int i = 0; i < n; i++) {
    foo(5);
}
Each iteration calls foo(5), which does \(O(1)\) work since it is called with a constant. The loop runs \(n\) times, so the overall work is \(n \cdot O(1) = O(n)\text{.}\)
Here is a straight-line sequence of statements with no loop:

foo(n/2);
int x = 10;
foo(n);
The call foo(n/2) does \(O(n/2)\) work, which is just \(O(n)\text{.}\) The second call, foo(n), does \(O(n)\) work. (The assignment int x = 10; is constant work and does not change the result.) So the overall work is \(O(n) + O(n) = 2 \cdot O(n) = O(n)\text{.}\)
Finally, here is a combination of a standalone call and a loop:

foo(n);
for (int i = 0; i < n; i++) {
    foo(n/2);
    foo(n/2);
}
The first call, foo(n), does \(O(n)\) work. The loop runs \(n\) times and each iteration makes two calls to foo(n/2), each of which does \(O(n/2)\) work. So the loop does \(n \cdot 2 \cdot O(n/2) = n \cdot O(n) = O(n^2)\) work. The overall work is thus \(O(n) + O(n^2) = O(n^2)\text{.}\)