
CS50 - Introduction to Computer Science

Algorithm Efficiency and Recursion


Analyze time and space complexity and implement efficient recursive solutions.


Logarithms in Algorithms: Why Logarithms Matter in CS

Logarithms in Algorithms — The “Half-Life” of Data (but Friendlier)

"If halving your workload were a magic spell, logarithms would be the incantation." — Your slightly dramatic CS50 TA

You already know Big-O and the difference between best, average, and worst case from earlier sections, and you've been poking arrays and strings to see how searching and sorting behave. Now let's zoom in on a superstar that shows up everywhere in efficient algorithms: logarithms. A logarithm measures how many times you can halve something before it's tiny enough to stop caring. And yes — it makes some algorithms feel like cheat codes.


What does "logarithm" mean in plain CS terms?

  • Mathematically: log_b(n) = the number of times you multiply b by itself to get n; in other words, b^k = n means k = log_b(n).
  • Algorithmically (the intuition): How many times can you split a problem into constant-sized pieces by repeatedly dividing by a factor (usually 2) before you're done?

Think: you have n items. Each step you cut the problem size in half. How many steps until you hit 1? Answer: about log_2(n). That’s the heart of logarithmic time.

Micro explanation

If n = 1,000,000, halving repeatedly gives: 500k, 250k, 125k, … after ~20 halvings you're at 1. So log_2(1,000,000) ≈ 20.
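
That claim is easy to check in code. Here's a tiny sketch (the helper name `halvings` is just for illustration) that counts the halvings and compares against log_2:

```python
import math

def halvings(n):
    """Count how many times n can be halved (integer division) before reaching 1."""
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

print(halvings(1_000_000))   # 19 halvings to reach 1
print(math.log2(1_000_000))  # ≈ 19.93 — the same "about 20"
```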


Where log shows up (real-life and in CS)

  • Binary Search on a sorted array — each comparison cuts the search interval in half. Complexity: O(log n).
  • Balanced binary search trees (AVL/Red-Black) — operations like lookup/insert/delete: O(log n) (height ~ log n).
  • Heaps — insert/extract on a binary heap: O(log n) (bubble up/down along heap height).
  • Divide-and-conquer recurrences like T(n) = T(n/2) + O(1) solve to O(log n).

This is a logical progression from what you learned about arrays, searching, and Big-O: once the search or operation halves the problem each step, expect logarithms.
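
To see one of those O(log n) structures live, here's a minimal sketch using Python's standard-library heapq (a binary heap): each push and pop sifts along the heap's height, which is about log_2(n).

```python
import heapq

# Each heappush/heappop sifts along the heap's height (~log2(n) levels).
h = []
for x in [5, 1, 9, 3, 7]:
    heapq.heappush(h, x)  # O(log n) sift-up per insert

drained = [heapq.heappop(h) for _ in range(len(h))]  # O(log n) per pop
print(drained)  # [1, 3, 5, 7, 9]
```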


Binary Search — The Canonical Example

Imagine finding a word in a sorted dictionary. Do you read every page? Of course not. You open roughly in the middle and decide which half to keep. That's binary search.

Iterative binary search (runnable Python):

def binary_search(A, target):
    low, high = 0, len(A) - 1
    while low <= high:
        mid = (low + high) // 2
        if A[mid] == target:
            return mid  # best case O(1) if it's in the middle
        elif A[mid] < target:
            low = mid + 1   # target can only be in the right half
        else:
            high = mid - 1  # target can only be in the left half
    return -1  # not found

Analysis: each loop iteration halves the search interval. If n is the starting interval length, after k iterations the interval length ≤ n / 2^k. Stop when n / 2^k ≤ 1 ⇒ k ≥ log_2(n). So worst-case time is O(log n). (Best case is O(1) if the mid happens to be the target — remember best/avg/worst case.)
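
You can make the halving visible by counting iterations. This sketch (the variant name `binary_search_steps` is made up for illustration) returns the iteration count alongside the index:

```python
def binary_search_steps(A, target):
    """Binary search on sorted A that also reports how many iterations it used."""
    low, high = 0, len(A) - 1
    steps = 0
    while low <= high:
        steps += 1
        mid = (low + high) // 2
        if A[mid] == target:
            return mid, steps
        elif A[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1, steps

A = list(range(1_000_000))
idx, steps = binary_search_steps(A, 999_999)
print(idx, steps)  # found; steps is at most ~20 for a million elements
```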


Recurrence viewpoint (a little math, very useful)

If an algorithm does a constant amount of work and then recursively runs on half the input, you get a recurrence like:

T(n) = T(n/2) + c

Unroll it:

T(n) = c + T(n/2)
= c + c + T(n/4)
= ...
= k*c + T(n / 2^k)

Stop when n / 2^k = 1 ⇒ 2^k = n ⇒ k = log_2(n).
So T(n) = c * log_2(n) + T(1) = O(log n).

Contrast that with T(n) = T(n/2) + n — that's linear O(n) because each level does O(n) work across the recursion tree.

Quick note: use the Master Theorem (from asymptotic notation) when you see recurrences like T(n)=aT(n/b)+f(n). For the a=1, b=2 case and f(n)=O(1), you get logarithmic behavior.
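
The unrolling above can be checked numerically. This little sketch evaluates T(n) = T(n // 2) + 1 directly (taking T(1) = 0 and c = 1) and compares it with log_2(n):

```python
import math

def T(n):
    """Evaluate the recurrence T(n) = T(n // 2) + 1, with T(1) = 0."""
    if n <= 1:
        return 0
    return T(n // 2) + 1

for n in (16, 1024, 1_000_000):
    print(n, T(n), round(math.log2(n), 2))  # T(n) tracks log2(n)
```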


Why base doesn't matter in Big-O

You might see log_2(n), log_10(n), or log_e(n) (natural log). They differ by a constant factor via the change-of-base formula:

log_b(n) = log_k(n) / log_k(b)

Since Big-O ignores constant factors, log base is irrelevant for asymptotic classification: O(log_2 n) = O(log n) = O(log_10 n).
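
A quick numerical check of the change-of-base formula (pure illustration):

```python
import math

n = 1_000_000
direct = math.log2(n)                        # log_2(n) directly
via_base_10 = math.log10(n) / math.log10(2)  # log_2(n) via change of base
print(direct, via_base_10)  # both ≈ 19.93
# The conversion factor 1 / log_10(2) is a constant, which is why Big-O ignores it.
```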


When logarithmic time sounds magical — and when it’s not

  • Magical: When the algorithm halves the problem and the work per step is constant. Binary search is the superhero example.
  • Not-so-magical: If halving is combined with linear work per level (e.g., T(n)=T(n/2)+n), the complexity becomes O(n). Likewise, if the data isn't in a structure that supports logarithmic operations, you're stuck: lookup in an unsorted array remains O(n).

Remember the tradeoffs from arrays and strings: to enjoy O(log n) searches you usually need the data sorted or stored in a tree-like structure that preserves order.


Helpful comparisons (real quick)

  • O(1): instant — check the top item, constant time
  • O(log n): zooms down by halving — binary search, tree height
  • O(n): scan everything — unsorted search
  • O(n log n): fast sorts like merge sort and heapsort
  • O(n^2): bubble sort and selection sort (slow for large n)
  • O(2^n): exponential — avoid unless n tiny

Logarithmic sits between constant and linear — often the sweet spot.
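
The gap between these classes is easy to underestimate. A small illustrative printout (operation counts per class, not a benchmark) makes it concrete:

```python
import math

# How each growth class scales as n grows — compare the columns, not wall-clock time.
for n in (10, 1_000, 1_000_000):
    print(f"n={n:>9,}: log2(n) ≈ {math.log2(n):5.1f}, "
          f"n*log2(n) ≈ {n * math.log2(n):14,.0f}, n^2 = {n**2:,}")
```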


Intuition tricks (so you never forget)

  • Folding paper: each fold doubles the stack's thickness, so the number of folds needed to make it H times the original thickness is about log_2(H).
  • Phone book search: open to the middle, decide left/right → binary search.
  • Doubling trick: If an algorithm doubles the work each step going forward, then moving backwards (asking how many steps to reach n) gives a log.

"If you can halve the problem deterministically each step, you just bought yourself log-time — congratulations, you're an efficiency wizard." — Also your TA


Key takeaways

  • Logarithmic time arises when the problem size is reduced multiplicatively (usually halved) each step.
  • Binary search and tree operations are classic O(log n) examples — but you need the right data structure (sorted array, balanced tree, heap).
  • Base of the log doesn't matter for Big-O; change-of-base is a constant factor.
  • Recurrences like T(n)=T(n/2)+O(1) solve to O(log n). Use the Master Theorem for more complex recurrences.

Final memorable image: imagine shrinking a loaf of bread by slicing it in half over and over. The number of slices until you get to crumbs grows logarithmically with the initial size. Algorithms that slice their problems like that are fast, elegant, and deserve a standing ovation.


