Section 25.14 Vocabulary and Common Recurrence Relations

Subsection 25.14.1 Common Recurrence Relations

Here again, for your reference, is the table of common recurrence relations from Table 25.12.1; short code sketches illustrating each pattern follow the table:
Pattern for General Case | Big-O | Description
\(T(n) = T(n / 2) + O(1)\) | \(O(\log n)\) | Function does constant work and makes a single recursive call on half the input.
\(T(n) = T(n - 1) + O(1)\) | \(O(n)\) | Function does constant work and makes a single recursive call on input reduced by one.
\(T(n) = 2 \cdot T(n / 2) + O(n)\) | \(O(n \log n)\) | Function does linear work and makes two recursive calls on half the input.
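To make the patterns concrete, here is a minimal sketch of each one in Python (the function names and inputs are illustrative, not taken from the text): binary search for the logarithmic pattern, a simple countdown for the linear pattern, and merge sort for the log-linear pattern.

```python
def binary_search(items, target, lo, hi):
    """T(n) = T(n/2) + O(1): constant work, one recursive call on half the input."""
    if lo > hi:
        return -1
    mid = (lo + hi) // 2
    if items[mid] == target:
        return mid
    elif items[mid] < target:
        return binary_search(items, target, mid + 1, hi)  # search right half
    else:
        return binary_search(items, target, lo, mid - 1)  # search left half


def count_down(n):
    """T(n) = T(n-1) + O(1): constant work, one recursive call on input reduced by one."""
    if n == 0:
        return 0
    return 1 + count_down(n - 1)


def merge_sort(items):
    """T(n) = 2*T(n/2) + O(n): two recursive calls on half the input, plus a linear merge."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])    # T(n/2)
    right = merge_sort(items[mid:])   # T(n/2)
    # O(n) merge of the two sorted halves
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

In each sketch, counting the work done per call and the number of recursive calls reproduces the recurrence in the corresponding row of the table.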

Subsection 25.14.2 Vocabulary

Big-O notation
A mathematical notation used to describe the efficiency of algorithms in terms of how their resource usage grows as the size of the input grows.
linear complexity
An algorithm whose growth rate is \(O(n)\text{,}\) meaning that the time taken grows as a linear function of the size of the input.
logarithmic complexity
An algorithm whose growth rate is \(O(\log n)\text{,}\) meaning that the time taken grows as a logarithmic function of the size of the input.
log-linear complexity
An algorithm whose growth rate is \(O(n \log n)\text{,}\) meaning that the time taken grows as a log-linear function of the size of the input.
quadratic complexity
An algorithm whose growth rate is \(O(n^2)\text{,}\) meaning that the time taken grows as a quadratic function of the size of the input.
time complexity
A measure of the amount of time an algorithm takes to complete as a function of the size of its input. This complexity is analyzed in terms of the number of basic operations performed by the algorithm.
wall time
The actual time taken from the start to the end of a program’s execution, as would be measured by a clock on the wall. Also called wall-clock time.
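As a sketch of how wall time can be measured (assuming Python; the work function below is a made-up placeholder, not from the text), the elapsed value is simply the difference between two clock readings taken before and after the call:

```python
import time


def work(n):
    # Placeholder O(n) loop whose wall time we want to measure.
    total = 0
    for i in range(n):
        total += i
    return total


start = time.perf_counter()             # clock reading before the call
work(1_000_000)
elapsed = time.perf_counter() - start   # wall time elapsed, in seconds
print(f"wall time: {elapsed:.4f} seconds")
```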