
Section 25.6 Identifying Big-O - Lines and Loops

Given an algorithm, how do we identify its Big-O category? We have to come up with a function that estimates the amount of work the algorithm does as a function of the size of its input. However, because we are only interested in the dominant term of that function and don’t care about constant factors, this is easier than it might sound.

Subsection 25.6.1 Lines of Code

One tempting way to estimate the work an algorithm does is to count the number of lines of code it contains. However, this method is not very reliable because the number of lines of code does not necessarily correlate with the amount of work done.
Consider double d = x * x + y * y;. Surely that involves more work than int a = 5;. The first line of code involves multiple operations: it does two multiplies, an addition, and then stores the result. That is clearly more work than the second line.
And if we did some research on the processor being used, we might find that multiplication takes more machine cycles to execute than addition. So maybe each multiplication should count as 3 units of work as it takes that many machine cycles to perform. The addition might count as 1 unit of work and the assignment another unit. So the first line of code would be 3 + 3 + 1 + 1 = 8 units of work, while the second line would be just 1 unit of work.
However, none of those details matter. Both lines of code take a constant amount of work. In Big-O notation, constant factors are ignored. We don’t care if that is 1 unit of work or 100 units of work.
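For example, we could annotate each line with a rough count of its operations (the counts are only a sketch, and x and y are assumed to be defined somewhere above):
double d = x * x + y * y;  // two multiplies, one addition, one assignment: a few units of work
int a = 5;                 // one assignment: one unit of work
Either way, each line does a fixed amount of work that does not depend on the size of any input.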

Insight 25.6.1.

The work on any line of code that does not involve a loop or function call is constant, or \(O(1)\text{.}\)
Assignment, doing math, comparing values, accessing an element in an array, and similar operations all take constant time. Any combination of these operations takes constant time.
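For instance, a single line that combines several of these operations, such as the sketch below (values, limit, and count are assumed to be defined), is still \(O(1)\text{:}\)
// Two array accesses, an addition, a comparison, and an assignment:
// each is constant time, and so is the combination.
if (values[0] + values[1] > limit) { count = count + 1; }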
A similar thing applies to a sequence of lines of code. Consider these two fragments. First fragment:
int x = 5;
Second fragment:
int y = 10;
int z = 15;
int a = 20;
int x = y + z - a;
The first fragment takes \(O(1)\) time.
The second fragment takes \(O(1) + O(1) + O(1) + O(1)\) time. \(O(1) + O(1) + O(1) + O(1)\) can be written as \(4 \cdot O(1)\text{,}\) which simplifies to \(O(1)\) when we drop the constant factor.
In Big-O terms, both fragments take \(O(1)\) time. The exact amount of time will clearly be different between the two fragments, but both involve some constant amount of work, and that is all that is important.

Insight 25.6.2.

The work in a sequence of lines of code that do not involve a loop or function call is constant, or \(O(1)\text{.}\)

Subsection 25.6.2 Loops

Loops are where things get more interesting. The amount of work done in a loop depends on how many times the loop runs and how much work is done in each iteration.
Consider this loop:
for (int i = 0; i < n; i++) {
  sum = sum + i;
}
The loop runs \(n\) times. Each time through the loop, it does a constant amount of work: adding i to sum and updating i. So the total amount of work done by the loop is \(n \cdot O(1)\text{.}\) Here, \(n\) is not a constant. If n is 100, the loop repeats 100 times. If n is 1000 it will repeat 1000 times. So we can’t simply drop n like we would a constant factor. Instead, we must keep it in our Big-O expression.
\(n \cdot O(1) = O(n)\text{.}\) Both sides of that equality represent “n times some constant”. So we would say that the entire loop is \(O(n)\text{.}\)
How about this loop:
for (int i = 0; i < n; i++) {
  sum = sum + i;
  if( i > max ) {
    max = i;
  }
}
The entire group of lines inside the loop is still \(O(1)\text{.}\) There is a constant amount of work inside the loop. When \(n\) changes, it will change the number of times the loop repeats, but it will not change the amount of work done in each iteration of the loop!
So the entire loop is still \(n \cdot O(1) = O(n)\text{.}\)
What about while loops?
int i = 0;
while (i < n) {
  sum = sum + i;
  if( i > max ) {
    max = i;
  }
  i++;
}
This loop is similar to the for loop above. The loop runs \(n\) times. Each time through the loop, it does a constant amount of work (remember that a sequence of lines that does not involve loops or function calls still takes constant time). So the total amount of work done by the loop is again \(n \cdot O(1) = O(n)\text{.}\) It also does one constant unit of work to set up i. But \(O(1) + O(n) = O(n)\text{,}\) so the whole fragment is \(O(n)\text{.}\)
This is also why we don’t have to worry about the various steps in a for loop. If we take the for loop above, the initialization, condition check, and increment steps each take constant time. But doing extra constant work in the loop does not change the overall Big-O complexity.
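To make those hidden steps visible, here is the first for loop from above rewritten as a sketch with each piece pulled apart (n and sum are assumed to be defined, as in the earlier examples):
int i = 0;           // initialization: O(1), done once
while (i < n) {      // condition check: O(1), done each iteration
  sum = sum + i;     // loop body: O(1) each iteration
  i++;               // increment: O(1) each iteration
}
Every individual piece is constant work; the only thing that grows with n is the number of repetitions, so the loop is still \(O(n)\text{.}\)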

Subsection 25.6.3 Loop Counters

Loops are one of the primary ways we get non-constant work in our algorithms. But not every loop represents \(O(n)\) work. The amount of work depends on how many times the loop runs.
Consider this loop:
for (int i = 0; i < 10; i++) {
  sum = sum + i;
}
This loop runs 10 times. Each time through the loop, it does a constant amount of work. So the total amount of work done by the loop is \(10 \cdot O(1)\text{.}\) Here, 10 is a constant, so we can drop it. Thus, this entire loop is \(O(1)\text{.}\)
Because the loop runs a constant number of times, it still represents constant work!

Warning 25.6.3.

This means that in Big-O analysis, a loop that repeats 1,000,000,000 times is considered \(O(1)\) because it is a constant number of iterations.
However, that loop will clearly take longer to run than int x = 1; which also counts as \(O(1)\text{.}\) If we let constant factors get that large, they do become significant in real-world performance.
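As a quick sketch of that point, the following loop runs a fixed 1,000,000,000 times regardless of any input, so it counts as \(O(1)\) even though it takes noticeable time to finish:
int count = 0;
for (int i = 0; i < 1000000000; i++) {
  count = count + 1;   // constant work, repeated a (very large) constant number of times
}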
Now consider this loop:
for (int i = 0; i < n / 2; i++) {
  sum = sum + i;
}
How many times does it repeat? \(n / 2\) times. Which means the total amount of work done by the loop is \((n / 2) \cdot O(1)\text{.}\) Dividing by 2 is a constant factor (we could instead multiply by 0.5), so we can drop it. Thus, this entire loop is \(O(n)\text{.}\)
Even though the loop runs only half as many times as a loop that runs \(n\) times, it is still \(O(n)\) because the amount of work increases in a linear fashion with the size of n. If n doubles, this loop will repeat twice as many times.
By the same logic, a loop that counted from 0 to 3*n would also be \(O(n)\text{.}\) Repeating 3 times as many iterations is still a constant factor, so it does not change the Big-O classification.
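For example, a sketch of that pattern might look like this (with n and sum assumed to be defined as before):
for (int i = 0; i < 3 * n; i++) {
  sum = sum + i;   // constant work, repeated 3n times
}
That is \(3n \cdot O(1)\text{,}\) and dropping the constant factor 3 leaves \(O(n)\text{.}\)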

Insight 25.6.4.

If a loop counts up to some constant multiple of \(n\text{,}\) the number of iterations is still proportional to \(n\text{,}\) so the loop is \(O(n)\text{.}\)
Next, let us consider this loop:
for (int i = 0; i <= n; i += 2) {
  sum = sum + i;
}
What values does i count through?
0, 2, 4, 6, ..., n
The sequence 0, 1, 2, ..., n is \(n + 1\) items long, which is \(O(n)\) (the extra constant does not matter since \(n\) is the dominant term). The sequence above is roughly half as long since it skips every other number: it is approximately \(n / 2\) items long. But dividing by 2 is just a constant factor, so the loop is still \(O(n)\text{!}\)
If we counted by 10’s like 0, 10, 20, ... , the length of the sequence would be approximately \(n / 10\text{.}\) But 1/10 is a constant factor, so the loop is still \(O(n)\text{.}\)
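A sketch of that counting-by-10's version, with n and sum assumed to be defined:
for (int i = 0; i <= n; i += 10) {
  sum = sum + i;   // constant work, repeated roughly n / 10 times
}
Roughly \(n / 10\) iterations of constant work is still proportional to \(n\text{,}\) so this loop is \(O(n)\text{.}\)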

Insight 25.6.5.

If the loop counter changes by a constant amount each time (like the 2 or the 10 in the examples above), the number of iterations is still proportional to \(n\text{,}\) so the loop is \(O(n)\text{.}\)
Next, let’s consider this loop where the loop counter is multiplied by 2 each time:
for (int i = 1; i <= n; i *= 2) {
  sum = sum + i;
}
Here, the values we count through are:
1, 2, 4, 8, ..., n
Each time through the loop, we multiply i by 2. So how many times can we double 1 before we exceed n? We need to solve the equation \(2^k = n\) to find out. Taking the logarithm base 2 of both sides gives us \(k = \log_2(n)\text{.}\) So this loop runs about \(\log_2(n)\) times. Each time through the loop, it does a constant amount of work. So the total amount of work done by the loop is \(\log_2(n) \cdot O(1) = O(\log n)\text{.}\)
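For example, if \(n\) is 64, the loop counts through 1, 2, 4, 8, 16, 32, 64, which is 7 iterations, and \(\log_2(64) = 6\text{.}\) (The extra iteration is just a constant we can ignore.) Doubling \(n\) to 128 adds only one more iteration, which is exactly the slow growth that \(O(\log n)\) describes.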
Another variation on this pattern would be the loop:
for (int i = n; i > 0; i /= 2) {
  sum = sum + i;
}
Here, we start at n and divide by 2 each time. How many times can we divide n by 2 before we get down to 1? This loop simply counts through the same values of i in reverse (n, n/2, n/4, ..., 1), so it is also \(O(\log n)\text{.}\)

Insight 25.6.6.

If the loop control variable is multiplied or divided by a constant factor each time, the number of iterations is proportional to \(\log n\text{,}\) so the loop is \(O(\log n)\text{.}\)

Checkpoint 25.6.1.

What is the Big-O classification of the following code fragment?
int hoursWorked = 10;
double hourlyRate = 15.50;
double totalPay = hoursWorked * hourlyRate;
  • O(1)
  • O(n)
  • There are no loops or function calls in this code fragment. All the operations take constant time.
  • O(3)
  • O(3) is not a valid Big-O classification. Any amount of constant work is considered O(1).
  • O(4)
  • O(4) is not a valid Big-O classification. Any amount of constant work is considered O(1).

Checkpoint 25.6.3.

What is the Big-O classification of the following code fragment?
for (int i = 1; i <= n; i *= 3) {
  sum = sum + i;
}
  • O(1)
  • The loop does not run a constant number of times. It depends on n.
  • O(log n)
  • O(n)
  • Note how the counter is incremented: it is multiplied by 3 each time, so i will follow the pattern 1, 3, 9, 27, ...

Checkpoint 25.6.4.

What is the Big-O classification of the following code fragment?
for (int i = 1; i <= 2*n; i++) {
  sum = sum + i;
}
  • O(1)
  • The loop does not run a constant number of times. It depends on n.
  • O(log n)
  • It is counting by 1’s from 1 to 2*n.
  • O(n)