Algorithm Efficiency and Recursion
Analyze time and space complexity and implement efficient recursive solutions.
Asymptotic Notation: Big-O
Big-O Notation: How Fast Does Your Code Grow?
You already know how to manipulate arrays and strings, and you've practiced making algorithms correct and validating inputs. Now the important question: how does your algorithm behave when the input gets unreasonably huge? Welcome to Big-O — the math that tells you whether your code will scale or spectacularly fail at scale.
What is Big-O (quick, not boring)
Big-O notation is a way to describe the asymptotic behavior of an algorithm: how its running time (or memory) grows as the input size n grows toward infinity. It's the language of scaling.
- Big-O focuses on the dominant term and ignores constants. If your function is 3n + 12, Big-O cares that it's O(n), not the 3 or the 12.
- We usually talk about time complexity (how long) and space complexity (how much memory).
Why this matters: small inefficiencies are fine for tiny inputs, but when n goes from 100 to 100,000, your algorithm's growth rate determines if your program lives or dies.
Real-world intuition (not a proof, but a gut punch)
Imagine two pizza delivery services:
- Delivery A: takes time proportional to the number of houses on the block (O(n)).
- Delivery B: needs to check every pair of houses to decide the best route (O(n^2)).
If there are 10 houses both are fast. If there are 10,000 houses, Delivery A is fine; Delivery B is still looking for a scooter.
"This is the moment where the concept finally clicks: constants don't matter, growth does."
Common Big-O classes (from fastest to slowest)
- O(1) — constant time. Example: access arr[i].
- O(log n) — logarithmic. Example: binary search in a sorted array.
- O(n) — linear. Example: linear search.
- O(n log n) — linearithmic. Example: good sorting algorithms (merge sort, quicksort average).
- O(n^2) — quadratic. Example: naive bubble sort, nested loops.
- O(2^n) — exponential. Example: naive recursive Fibonacci.
- O(n!) — factorial. Example: brute-force permutations for traveling salesman.
Rules of thumb (apply like sunscreen)
- Drop constants: 5n + 300 → O(n).
- Keep the biggest term: n^2 + n → O(n^2).
- Consecutive statements add: two O(n) loops one after another → O(n) + O(n) = O(n).
- Nested loops multiply: an O(n) loop inside an O(n) loop → O(n^2).
- If/else pick the worst branch for Big-O (for worst-case analysis).
Note: Big-O on its own usually means worst case. If you care about average-case or best-case behavior, say so explicitly — e.g. "average-case O(n log n)" or "best-case O(1)".
Examples you already know (arrays, sentinels, validation — remember them!)
Linear search — O(n)
You scan the array until you find the value. If you used a sentinel (remember sentinels and guards?), you might avoid boundary checks — a tiny constant improvement.
// pseudo-C linear search
for (int i = 0; i < n; i++) {
if (arr[i] == key) return i; // best-case O(1), worst-case O(n)
}
return -1;
Big-O (worst case) = O(n). Input validation or sentinels change constants, not the O(n) nature.
Binary search — O(log n)
If the array is sorted (you validated inputs, right?), binary search halves the search space each time.
// pseudo-C binary search loop
int lo = 0, hi = n - 1;
while (lo <= hi) {
int mid = lo + (hi - lo) / 2; // avoids integer overflow of lo + hi
if (arr[mid] == key) return mid;
else if (arr[mid] < key) lo = mid + 1;
else hi = mid - 1;
}
Big-O = O(log n). But note: sorting the array first costs O(n log n) unless it's already sorted.
Recursion and Big-O (because CS50 loves recursion)
Recursion is syntactic sugar for repeated work. Big-O can be read from recurrences.
Factorial (simple recursion) — O(n)
int factorial(int n) {
if (n <= 1) return 1;
return n * factorial(n - 1);
}
T(n) = T(n-1) + O(1) → O(n).
Naive Fibonacci — O(2^n) (exponential)
int fib(int n) {
if (n <= 1) return n;
return fib(n-1) + fib(n-2);
}
This recomputes the same values many times. T(n) = T(n-1) + T(n-2) + O(1), which grows exponentially — bounded above by O(2^n). Horrible for n > 40.
Memoize it and boom: O(n).
// with memoization (top-down DP) or bottom-up DP: O(n)
Recurrence example: Merge sort — O(n log n)
Merge sort divides the array in half and merges sorted halves:
T(n) = 2T(n/2) + O(n) → O(n log n)
(If you like formulas, this is where the Master Theorem saves you from tears.)
Quick checklist to analyze complexity
- Identify input size n.
- Count how many times the dominant operation executes as a function of n.
- Apply rules of thumb: drop constants, keep highest-order term.
- For recursion, write a recurrence T(n) and solve (or use Master Theorem).
Micro explanation: if your algorithm does a single pass over an array and then sorts the result, the cost is O(n) + O(n log n) → O(n log n).
When Big-O lies (or is incomplete)
- Big-O ignores constants — but in real small-n cases, constants matter. If your O(n) code has a huge constant and the O(n log n) has a tiny one, the O(n log n) might be faster for realistic n.
- Big-O doesn't talk about responsiveness or latency. A function that takes O(1) but takes 500ms might still be unacceptable for UI.
- Big-O doesn't replace empirical testing. Benchmarks and profiling are essential.
Key takeaways — the things to tattoo on your brain
- Big-O describes growth, not exact time. It's about scaling.
- Drop constants, keep the dominant term. O(3n + 10) → O(n).
- Use Big-O to compare algorithms at scale. O(n log n) sorts are usually better than O(n^2) sorts for large n.
- Recursion requires recurrence solving. Naive recursion can be exponential; memoize or convert to DP to make it linear.
- Practical note: algorithmic correctness, input validation, and sentinels (from earlier topics) still matter — they ensure the algorithm runs on valid inputs; asymptotic analysis assumes valid inputs.
If your brain remembers one thing: when n skyrockets, pretend your code is a movie villain — the complexity decides whether it conquers the world or trips on its own cape. Want problems to practice with (linear vs binary search, merge sort recurrence, naive vs memoized recursion)? Tell me which and I'll build a CS50-style problem set with hints and memes. 🎭