Algorithm Timing Analysis
Introduction to Algorithm Analysis (for USask students who survived the prairies)
Why algorithm analysis matters — like layering for Saskatoon weather
If you've lived in Saskatoon, you know weather is a cruel, hilarious teacher: spring teases you, winter punches you, and mosquitoes in summer collect ransom. Algorithm analysis is the same kind of brutal honesty. It doesn't ask how pretty your code looks or whether you commented your hopes and dreams into the header — it asks how your program behaves when the input size grows, and then coldly punishes you if you didn't plan for it.
- Algorithm analysis tells you how much time and memory an algorithm will need as the size of the problem (n) grows.
- Think of n as the number of students crammed into Place Riel during exam week. A linear increase in chaos is one thing; exponential chaos is when you wish you'd stayed in warm Florida. (Except we can't, because Canada.)
By mastering this, you'll avoid writing code that works great on your laptop but collapses like a bike without winter tires when you run it on real data — or when you run a 100,000-item input instead of the 10 you used for testing in the PAC between classes.
Core concepts (with prairie metaphors)
1. Asymptotic notation: Big-O, Big-Theta, Big-Omega
- Big-O (O) — an upper bound on growth, most often quoted for the worst case. Like preparing for a -40°C blizzard: you pack for the worst. If an algorithm is O(n), doubling n roughly doubles time.
- Big-Theta (Θ) — a tight bound: the true growth rate, used when the upper and lower bounds agree. Like the average winter: predictably miserable.
- Big-Omega (Ω) — a lower bound on growth, often quoted for the best case. Like the one spring day when the river thawed early and classes were outdoors.
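To make the "doubling n roughly doubles time" claim concrete, here's a minimal Python sketch (the helper name linear_scan_ops is ours, just for illustration) that counts the basic operations in a single pass over n items:

```python
def linear_scan_ops(n):
    # Count the basic operations one pass over n items performs: exactly n.
    ops = 0
    for _ in range(n):
        ops += 1  # one "step" per item, e.g. a comparison or an addition
    return ops

# Doubling the input doubles the work — the signature of O(n).
print(linear_scan_ops(1000), linear_scan_ops(2000))  # 1000 2000
```

Counting operations instead of wall-clock time keeps the result deterministic: the machine's speed changes the constant, never the shape of the growth.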
2. Time vs Space complexity
- Time complexity = how long an algorithm takes (CPU steps).
- Space complexity = how much extra memory it uses (stack, heap). Think of space like how many layers of clothing you need: recursion depth is like wearing multiple scarves — useful, but heavy.
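The scarf metaphor can be sketched in Python: the two functions below (names ours) compute the same sum, but the recursive one keeps one stack frame per element — O(n) extra space — while the loop keeps a single running total — O(1):

```python
def recursive_sum(data, i=0):
    # Each call adds a stack frame: O(n) extra space — many scarves.
    if i == len(data):
        return 0
    return data[i] + recursive_sum(data, i + 1)

def iterative_sum(data):
    # One running total: O(1) extra space beyond the input itself.
    total = 0
    for x in data:
        total += x
    return total
```

The space cost is not hypothetical: in CPython, recursive_sum will hit the default recursion limit (around 1000 frames) on a list the iterative version handles without blinking.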
3. Worst/Average/Best case
- Worst-case: like being stuck behind a tractor on Highway 5 in a blizzard.
- Average-case: the usual slog.
- Best-case: the rare, glorious 20°C day in November.
Common complexity classes (and campus analogies)
O(1) — Constant time
- Example: Checking if your USask student card is in your pocket. It doesn't depend on how many students are in the Bowl.
O(log n) — Logarithmic
- Example: Binary search — like halving the crowd each time you shout "Is Jamie here?" until you find them at Folkfest. Fast and efficient.
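Here's the crowd-halving idea as a standard binary search sketch in Python — each pass through the loop discards half of the remaining range, so a sorted list of a million items takes at most about 20 comparisons:

```python
def binary_search(sorted_items, target):
    # Halve the search range each step: O(log n) comparisons.
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid          # found Jamie
        elif sorted_items[mid] < target:
            lo = mid + 1        # Jamie is in the right half of the crowd
        else:
            hi = mid - 1        # Jamie is in the left half
    return -1                   # Jamie skipped Folkfest
```

Note the precondition: the input must already be sorted, or the halving logic throws away the wrong half.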
O(n) — Linear
- Example: Scanning every seat in a lecture hall to find the warmest spot — you check each one.
O(n log n)
- Example: Efficient sorting algorithms (merge sort, heapsort). Like organizing stacks of Midterm papers into sorted piles by repeatedly merging smaller sorted piles.
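A minimal merge sort sketch in Python shows where the n log n comes from: the merge step is linear in the pile sizes, and the piles are halved log n times:

```python
def merge(left, right):
    # Combine two sorted piles into one: O(len(left) + len(right)).
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])   # one pile ran out; dump the rest of the other
    merged.extend(right[j:])
    return merged

def merge_sort(items):
    # Split in half, sort each half, merge: O(n log n) overall.
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    return merge(merge_sort(items[:mid]), merge_sort(items[mid:]))
```

This is the same recurrence analyzed later in the examples section: T(n) = 2T(n/2) + n.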
O(n^2) — Quadratic
- Example: Nested loops — comparing every student to every other student in an intramural dodgeball round-robin. Fine for small groups, horrifying for big ones.
O(2^n) — Exponential
- Example: Trying all course schedules by brute force. The number of combinations explodes faster than rumors about Tuesday pubs.
O(n!) — Factorial
- Example: Testing every seating arrangement for a formal at Marquis Wheat Hall (imagine trying to please everyone). This is basically forbidden in polite company and exams.
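The exponential and factorial blowups are easy to witness directly with the standard library — the helper names below are ours, but the counts are exact: n courses give 2^n possible schedules, and n guests give n! seatings:

```python
from itertools import combinations, permutations

def all_schedules(courses):
    # Every subset of n courses: 2^n candidates to brute-force.
    subsets = []
    for k in range(len(courses) + 1):
        subsets.extend(combinations(courses, k))
    return subsets

def all_seatings(guests):
    # Every ordering of n guests: n! arrangements.
    return list(permutations(guests))

# Even for n = 5 the gap is visible: 2^5 = 32 subsets, 5! = 120 orderings.
print(len(all_schedules(range(5))), len(all_seatings(range(5))))  # 32 120
```

Try n = 20 for schedules (about a million) and n = 20 for seatings (about 2.4 quintillion — don't actually run that one) to feel the difference between exponential and factorial.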
How to analyze code — a recipe with prairie grit
- Identify the basic operations (comparisons, swaps, arithmetic) and choose one to count.
- Find the input parameter n.
- Look at loops:
- Single loop from 1..n → O(n)
- Nested loops each from 1..n → O(n^2)
- Look at recursive calls. Use recurrence relations and the Master Theorem where applicable.
- Drop low-order terms and constants. (We don't care if your routine takes 10 seconds or 5 — we care how it scales when midterms turn into finals and your dataset is huge.)
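The recipe's loop rules can be verified by literally counting operations — a Python sketch (helper names ours) for the two patterns above:

```python
def single_loop_ops(n):
    # One loop from 1..n → exactly n basic operations → O(n).
    ops = 0
    for i in range(n):
        ops += 1
    return ops

def nested_loop_ops(n):
    # Two nested loops, each 1..n → exactly n*n operations → O(n^2).
    ops = 0
    for i in range(n):
        for j in range(n):
            ops += 1
    return ops

print(single_loop_ops(50), nested_loop_ops(50))  # 50 2500
```

Instrumenting a counter like this is also a decent sanity check on a real assignment: if doubling n quadruples your counter, you've got a quadratic loop whether you meant to or not.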
Quick Master Theorem sketch (for divide-and-conquer recurrences)
For T(n) = a T(n/b) + f(n):
- If f(n) = O(n^{log_b a - ε}) → T(n) = Θ(n^{log_b a})
- If f(n) = Θ(n^{log_b a}) → T(n) = Θ(n^{log_b a} log n)
- If f(n) = Ω(n^{log_b a + ε}) and regularity condition holds → T(n) = Θ(f(n))
(Yes, the theorem looks like a math incantation. Treat it like a reliable winter coat: learn which situation you're in and it'll keep you warm.)
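As a study aid, the three cases can be mechanized for the common situation where f(n) is a plain polynomial n^c — this little classifier (ours, a sketch, not a full Master Theorem implementation) just compares c against log_b a; note that for f(n) = n^c with c > log_b a, case 3's regularity condition holds automatically:

```python
import math

def master_case(a, b, c):
    # Classify T(n) = a*T(n/b) + n^c by comparing c with log_b(a).
    crit = math.log(a) / math.log(b)  # the critical exponent log_b(a)
    if math.isclose(c, crit):
        return 2  # f matches the critical exponent → Theta(n^crit * log n)
    elif c < crit:
        return 1  # recursion dominates → Theta(n^crit)
    else:
        return 3  # f dominates → Theta(n^c)

# Merge sort: T(n) = 2T(n/2) + n → case 2.
print(master_case(2, 2, 1))  # 2
```

It won't handle f(n) with log factors (those fall between the cases), but for exam-style polynomial recurrences it tells you which coat to reach for.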
Examples with short walkthroughs
Loop that runs n times and inside does a binary search (log n): O(n log n)
- Walkthrough: outer loop linear, inner operation logarithmic, multiply.
Two nested loops where inner depends on i: for (i=1..n) for (j=1..i) do O(1)
- Complexity: the inner loop runs 1 + 2 + … + n = n(n+1)/2 times → drop the constant factor and the low-order term → O(n^2)
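Counting the inner-loop iterations directly confirms the triangular sum — a quick Python check (helper name ours):

```python
def triangular_loop_ops(n):
    # for i in 1..n: for j in 1..i — total inner steps = 1+2+...+n = n(n+1)/2.
    ops = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            ops += 1
    return ops

# n(n+1)/2 grows like n^2/2, so it's still O(n^2).
print(triangular_loop_ops(100))  # 5050
```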
Recurrence: T(n) = 2T(n/2) + n
- By Master Theorem, a=2, b=2 → n^{log_b a} = n^{1} = n, and f(n)=n → case 2 → Θ(n log n)
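You can also unroll this recurrence numerically and compare it against the closed form — for powers of two with T(1) = 1, the exact answer is n·log2(n) + n, which is Θ(n log n) as the Master Theorem promises:

```python
import math

def T(n):
    # Unroll T(n) = 2*T(n/2) + n with T(1) = 1, for n a power of two.
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

# Compare against the closed form n*log2(n) + n — they match exactly.
for n in (8, 64, 1024):
    print(n, T(n), n * int(math.log2(n)) + n)
```

Sanity-checking a recurrence this way is cheap insurance before trusting an asymptotic argument on an exam.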
Exam-friendly tips (for surviving both exams and -30°C wind chill)
- Always state which n you choose and what operation you count.
- When in doubt, assume worst-case unless problem asks for average or best.
- Drop constants: O(3n) is O(n). The prairies don't care about constants either.
- Practice with small programs but test with large n — an algorithm that runs in 0.1s for n=100 might turn into a saga for n=100,000.
- Learn a few common recurrences and how to apply the Master Theorem; it’s the difference between cozy and frostbite.
Short practice problems (with hints)
- Given a two-pointer loop that advances left or right pointer each step until they meet, what's the time complexity? (Hint: each element visited at most once)
- Show that quicksort worst-case is O(n^2) but average-case is O(n log n). (Hint: pivot choice matters like choosing which path across the river.)
- Analyze this recurrence: T(n) = 3T(n/3) + n. (Hint: a=3, b=3)
Hints/solutions:
- 1 → O(n).
- 2 → Worst-case when pivot is always max/min → degenerate partitioning → O(n^2). Average-case with good splits → O(n log n).
- 3 → n^{log_3 3} = n^1 = n and f(n)=n → case 2 of Master Theorem → Θ(n log n).
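For the first practice problem, here's a classic two-pointer instance — finding a pair that sums to a target in a sorted list. The function name is ours; the key observation is that each step retires one element permanently, so the pointers meet after at most n steps:

```python
def two_pointer_pair_sum(sorted_items, target):
    # Advance left or right each step; each element is visited at most
    # once before the pointers meet → O(n) time, O(1) extra space.
    left, right = 0, len(sorted_items) - 1
    while left < right:
        s = sorted_items[left] + sorted_items[right]
        if s == target:
            return (left, right)
        elif s < target:
            left += 1    # sum too small: need a bigger left element
        else:
            right -= 1   # sum too big: need a smaller right element
    return None          # no pair sums to target

print(two_pointer_pair_sum([1, 2, 4, 7, 11], 9))  # (1, 3) — 2 + 7
```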
Final notes — survive algorithms like you survive Saskatoon winter
Algorithms are like living on the prairies: preparation matters, small mistakes get amplified, and the cold (or exponential growth) will find you if you don't plan. Learn the notations, practice the patterns (loops, recurrences, reductions), and use analogies that make sense — even if that analogy is how your code behaves like a snowdrift filling the steps of the Diefenbaker building (metaphorically speaking).
If you want, I can give you a tailored practice sheet with USask-themed examples (Place Riel line-ups, midterm-mark sorting, and the official "escape the library before it snows" algorithm). Or we can write a quick cheat-sheet poster you can tape to your laptop — somewhat illegal, but emotionally supportive.
Stay warm, stay curious, and may your algorithms be as efficient as a compact winter parka.