
CS50 - Introduction to Computer Science

Algorithm Efficiency and Recursion


Analyze time and space complexity and implement efficient recursive solutions.

Asymptotic Notation: Big-O

Big-O Notation Explained: Asymptotic Complexity for CS50

Big-O Notation: How Fast Does Your Code Grow?

You already know how to manipulate arrays and strings, and you've practiced making algorithms correct and validating inputs. Now the important question: how does your algorithm behave when the input gets unreasonably huge? Welcome to Big-O — the math that tells you whether your code will scale or spectacularly fail at scale.


What is Big-O (quick, not boring)

Big-O notation is a way to describe the asymptotic behavior of an algorithm: how its running time (or memory) grows as the input size n grows toward infinity. It's the language of scaling.

  • Big-O focuses on the dominant term and ignores constants. If your function is 3n + 12, Big-O cares that it's O(n), not the 3 or the 12.
  • We usually talk about time complexity (how long) and space complexity (how much memory).

Why this matters: small inefficiencies are fine for tiny inputs, but when n goes from 100 to 100,000, your algorithm's growth rate determines if your program lives or dies.


Real-world intuition (not a proof, but a gut punch)

Imagine two pizza delivery services:

  • Delivery A: takes time proportional to the number of houses on the block (O(n)).
  • Delivery B: needs to check every pair of houses to decide the best route (O(n^2)).

If there are 10 houses both are fast. If there are 10,000 houses, Delivery A is fine; Delivery B is still looking for a scooter.

"This is the moment where the concept finally clicks: constants don't matter, growth does."


Common Big-O classes (from fastest to slowest)

  • O(1) — constant time. Example: access arr[i].
  • O(log n) — logarithmic. Example: binary search in a sorted array.
  • O(n) — linear. Example: linear search.
  • O(n log n) — linearithmic. Example: good sorting algorithms (merge sort, quicksort average).
  • O(n^2) — quadratic. Example: naive bubble sort, nested loops.
  • O(2^n) — exponential. Example: naive recursive Fibonacci.
  • O(n!) — factorial. Example: brute-force permutations for traveling salesman.

Rules of thumb (apply like sunscreen)

  1. Drop constants: 5n + 300 → O(n).
  2. Keep the biggest term: n^2 + n → O(n^2).
  3. Consecutive statements add: two O(n) loops one after another → O(n) + O(n) = O(n).
  4. Nested loops multiply: an O(n) loop inside an O(n) loop → O(n^2).
  5. If/else pick the worst branch for Big-O (for worst-case analysis).

Note: Big-O conventionally describes the worst case. If you mean average-case or best-case, say so explicitly (e.g., "average-case O(n log n)", "best-case O(1)").


Examples you already know (arrays, sentinels, validation — remember them!)

Linear search — O(n)

You scan the array until you find the value. If you used a sentinel (remember sentinels and guards?), you might avoid boundary checks — a tiny constant improvement.

// linear search: returns the index of key, or -1 if not found
int linear_search(const int arr[], int n, int key) {
    for (int i = 0; i < n; i++) {
        if (arr[i] == key) return i; // best case O(1), worst case O(n)
    }
    return -1;
}

Big-O (worst case) = O(n). Input validation or sentinels change constants, not the O(n) nature.

Binary search — O(log n)

If the array is sorted (you validated inputs, right?), binary search halves the search space each time.

// binary search on a sorted array: returns the index of key, or -1
int binary_search(const int arr[], int n, int key) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2; // avoids overflow of lo + hi
        if (arr[mid] == key) return mid;
        else if (arr[mid] < key) lo = mid + 1;
        else hi = mid - 1;
    }
    return -1;
}

Big-O = O(log n). But note: sorting the array first costs O(n log n) unless it's already sorted.


Recursion and Big-O (because CS50 loves recursion)

Recursion is repeated work in disguise: its cost is captured by a recurrence relation, and Big-O can be read off by solving it.

Factorial (simple recursion) — O(n)

int factorial(int n) {
    if (n <= 1) return 1;
    return n * factorial(n - 1);
}

T(n) = T(n-1) + O(1) → O(n).

Naive Fibonacci — O(2^n) (exponential)

int fib(int n) {
    if (n <= 1) return n;
    return fib(n-1) + fib(n-2);
}

This recomputes the same values over and over. T(n) = T(n-1) + T(n-2) + O(1), which grows exponentially (roughly 1.6^n, bounded above by 2^n). Horrible for n > 40.

Memoize it and boom: O(n).

// with memoization (top-down DP) or bottom-up DP: O(n)

Recurrence example: Merge sort — O(n log n)

Merge sort divides the array in half and merges sorted halves:

T(n) = 2T(n/2) + O(n) → O(n log n)

(If you like formulas, this is where the Master Theorem saves you from tears.)


Quick checklist to analyze complexity

  1. Identify input size n.
  2. Count how many times the dominant operation executes as a function of n.
  3. Apply rules of thumb: drop constants, keep highest-order term.
  4. For recursion, write a recurrence T(n) and solve (or use Master Theorem).

Micro explanation: if your algorithm does a single pass over an array and then sorts the result, the cost is O(n) + O(n log n) → O(n log n).


When Big-O lies (or is incomplete)

  • Big-O ignores constants — but in real small-n cases, constants matter. If your O(n) code has a huge constant and the O(n log n) has a tiny one, the O(n log n) might be faster for realistic n.
  • Big-O doesn't talk about responsiveness or latency. A function that takes O(1) but takes 500ms might still be unacceptable for UI.
  • Big-O doesn't replace empirical testing. Benchmarks and profiling are essential.

Key takeaways — the things to tattoo on your brain

  • Big-O describes growth, not exact time. It's about scaling.
  • Drop constants, keep the dominant term. O(3n + 10) → O(n).
  • Use Big-O to compare algorithms at scale. O(n log n) sorts are usually better than O(n^2) sorts for large n.
  • Recursion requires recurrence solving. Naive recursion can be exponential; memoize or convert to DP to make it linear.
  • Practical note: algorithmic correctness, input validation, and sentinels (from earlier topics) still matter — they ensure the algorithm runs on valid inputs; asymptotic analysis assumes valid inputs.

If your brain remembers one thing: when n skyrockets, pretend your code is a movie villain — the complexity decides whether it conquers the world or trips on its own cape. 🎭
