Big O Notation & Complexity Analysis: A Complete Guide

Introduction
Imagine you're in a library with 1 million books. How long does it take to find a specific book?
- Unsorted library: You might need to check every book → slow
- Sorted by author: You can use binary search → fast
- Computerized index: Direct lookup → effectively instant
This is what Big O notation is all about: measuring how algorithms scale as input size grows. It's the language developers use to discuss algorithm efficiency, and it's essential for writing fast, scalable code and acing technical interviews.
What You'll Learn
✅ Understand Big O notation and why it matters
✅ Master common time complexities: O(1), O(log n), O(n), O(n log n), O(n²), O(2ⁿ)
✅ Calculate Big O for loops, recursion, and nested structures
✅ Analyze space complexity and memory usage
✅ Understand best, average, and worst-case scenarios
✅ Learn time vs space trade-offs
✅ Apply complexity analysis in TypeScript and Python
✅ Master interview techniques for discussing algorithm efficiency
Prerequisites
- Basic programming knowledge (variables, loops, functions)
- Familiarity with TypeScript OR Python
- Understanding of arrays/lists
1. What is Big O Notation?
Big O notation describes how the runtime or space requirements of an algorithm grow as the input size increases. It answers: "How does this algorithm scale?"
Key Concepts
Input Size (n): The size of the data you're working with
- Array of 10 items → n = 10
- Array of 1 million items → n = 1,000,000
Big O (O): Describes the worst-case upper bound of growth
- O(n) means "at most linear growth"
- O(1) means "constant, doesn't grow"
Why "Big O"? The "O" stands for "order of growth". By convention we quote the worst case, because an upper bound lets you guarantee performance.
Reading Big O Notation
O(1) → "O of 1" or "constant time"
O(n) → "O of n" or "linear time"
O(log n) → "O of log n" or "logarithmic time"
O(n²) → "O of n squared" or "quadratic time"
O(2ⁿ) → "O of 2 to the n" or "exponential time"

2. Common Time Complexities
Let's explore the most common complexity classes from fastest to slowest.
O(1) - Constant Time
Definition: Operation takes the same time regardless of input size.
Examples:
- Accessing an array element by index
- Getting a value from a hash table/object
- Simple arithmetic operations
TypeScript:

```typescript
// O(1) - Constant time
function getFirstElement<T>(arr: T[]): T | undefined {
  return arr[0]; // Always one operation, regardless of array size
}

// O(1) - Hash table lookup
function getUserAge(users: Map<string, number>, userId: string): number | undefined {
  return users.get(userId); // Instant lookup
}

console.log(getFirstElement([1, 2, 3, 4, 5])); // O(1)

const million = Array.from({ length: 1000000 }, (_, i) => i);
console.log(getFirstElement(million)); // Still O(1)
```

Python:

```python
# O(1) - Constant time
def get_first_element(arr: list):
    return arr[0] if arr else None

# O(1) - Dictionary lookup
def get_user_age(users: dict, user_id: str) -> int | None:
    return users.get(user_id)

print(get_first_element([1, 2, 3, 4, 5]))  # O(1)
print(get_first_element(list(range(1000000))))  # Still O(1)
```

Real-world analogy: Opening a book to the first page. Doesn't matter if it's 10 pages or 1000 pages.
O(log n) - Logarithmic Time
Definition: Algorithm cuts the problem in half each iteration.
Examples:
- Binary search in sorted array
- Balanced tree operations
TypeScript:

```typescript
// O(log n) - Binary search
function binarySearch(arr: number[], target: number): number {
  let left = 0;
  let right = arr.length - 1;
  while (left <= right) {
    const mid = Math.floor((left + right) / 2);
    if (arr[mid] === target) {
      return mid;
    } else if (arr[mid] < target) {
      left = mid + 1; // Search right half
    } else {
      right = mid - 1; // Search left half
    }
  }
  return -1; // Not found
}

// For an array of 1,000,000 items, only ~20 comparisons are needed!
const sortedArray = Array.from({ length: 1000000 }, (_, i) => i);
console.log(binarySearch(sortedArray, 500000)); // ~20 operations
```

Python:

```python
# O(log n) - Binary search
def binary_search(arr: list[int], target: int) -> int:
    left, right = 0, len(arr) - 1
    while left <= right:
        mid = (left + right) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            left = mid + 1
        else:
            right = mid - 1
    return -1

# For 1 million items, only ~20 comparisons!
sorted_array = list(range(1000000))
print(binary_search(sorted_array, 500000))  # ~20 operations
```

Why log n? Each step eliminates half the remaining elements:
- 1,000,000 items → 500,000 → 250,000 → ... → 1 (takes ~20 steps)
- log₂(1,000,000) ≈ 20
Real-world analogy: Finding a word in a dictionary by opening to the middle, then halving repeatedly.
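To see the logarithm in action, here's an instrumented variant of the Python binary search above (the step counter is added purely for illustration):

```python
# Binary search that also reports how many loop iterations it used.
def binary_search_counted(arr: list[int], target: int) -> tuple[int, int]:
    left, right = 0, len(arr) - 1
    steps = 0
    while left <= right:
        steps += 1  # one halving step per iteration
        mid = (left + right) // 2
        if arr[mid] == target:
            return mid, steps
        elif arr[mid] < target:
            left = mid + 1
        else:
            right = mid - 1
    return -1, steps

index, steps = binary_search_counted(list(range(1_000_000)), 500_000)
print(index, steps)  # steps is at most 20, since log2(1,000,000) ≈ 19.9
```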
O(n) - Linear Time
Definition: Algorithm processes each element once.
Examples:
- Single loop through array
- Linear search
- Finding max/min value
TypeScript:

```typescript
// O(n) - Linear search
function linearSearch<T>(arr: T[], target: T): number {
  for (let i = 0; i < arr.length; i++) { // Visits each element once
    if (arr[i] === target) {
      return i;
    }
  }
  return -1;
}

// O(n) - Sum all elements
function sum(arr: number[]): number {
  let total = 0;
  for (const num of arr) { // One pass through array
    total += num;
  }
  return total;
}

// O(n) - Find maximum
function findMax(arr: number[]): number {
  let max = arr[0];
  for (const num of arr) {
    if (num > max) {
      max = num;
    }
  }
  return max;
}
```

Python:

```python
# O(n) - Linear search
def linear_search(arr: list, target) -> int:
    for i, item in enumerate(arr):
        if item == target:
            return i
    return -1

# O(n) - Sum all elements
def sum_array(arr: list[int]) -> int:
    total = 0
    for num in arr:
        total += num
    return total

# O(n) - Find maximum
def find_max(arr: list[int]) -> int:
    return max(arr)  # Built-in max is also O(n)
```

Real-world analogy: Reading every page of a book to find a specific word.
O(n log n) - Linearithmic Time
Definition: Common in efficient sorting algorithms.
Examples:
- Merge sort
- Quick sort (average case)
- Heap sort
TypeScript:

```typescript
// O(n log n) - Merge sort
function mergeSort(arr: number[]): number[] {
  if (arr.length <= 1) {
    return arr;
  }
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid)); // log n levels of division
  const right = mergeSort(arr.slice(mid));
  return merge(left, right); // O(n) merge at each level
}

function merge(left: number[], right: number[]): number[] {
  const result: number[] = [];
  let i = 0, j = 0;
  while (i < left.length && j < right.length) {
    if (left[i] <= right[j]) {
      result.push(left[i++]);
    } else {
      result.push(right[j++]);
    }
  }
  return result.concat(left.slice(i), right.slice(j));
}

// Native sort is also O(n log n)
const arr = [64, 34, 25, 12, 22, 11, 90];
console.log(mergeSort(arr));
console.log([...arr].sort((a, b) => a - b)); // O(n log n)
```

Python:

```python
# O(n log n) - Merge sort
def merge_sort(arr: list[int]) -> list[int]:
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    return merge(left, right)

def merge(left: list[int], right: list[int]) -> list[int]:
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result

# Python's built-in sort is also O(n log n) (Timsort)
arr = [64, 34, 25, 12, 22, 11, 90]
print(sorted(arr))  # O(n log n)
```

Why n log n? You divide the problem log n times, and do O(n) work at each level.
Real-world analogy: Organizing a deck of cards by repeatedly splitting and merging piles.
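You can check the n log n claim empirically. This sketch counts the comparisons merge sort makes (the counter is instrumentation added for illustration, not part of the algorithm) and checks they stay within n · log₂(n):

```python
import math
import random

# Merge sort with a comparison counter threaded through the recursion.
def merge_sort_counted(arr: list[int], counter: list[int]) -> list[int]:
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort_counted(arr[:mid], counter)
    right = merge_sort_counted(arr[mid:], counter)
    result, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        counter[0] += 1  # one comparison per merge step
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result

counter = [0]
data = list(range(1024))
random.shuffle(data)
sorted_data = merge_sort_counted(data, counter)
print(counter[0], "comparisons; n * log2(n) =", 1024 * int(math.log2(1024)))
```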
O(n²) - Quadratic Time
Definition: Nested loops over the same dataset.
Examples:
- Bubble sort
- Selection sort
- Checking all pairs
TypeScript:

```typescript
// O(n²) - Bubble sort (nested loops)
function bubbleSort(arr: number[]): number[] {
  const n = arr.length;
  for (let i = 0; i < n; i++) { // Outer loop: n times
    for (let j = 0; j < n - i - 1; j++) { // Inner loop: up to n times
      if (arr[j] > arr[j + 1]) {
        [arr[j], arr[j + 1]] = [arr[j + 1], arr[j]];
      }
    }
  }
  return arr;
}

// O(n²) - Find all pairs
function findPairs<T>(arr: T[]): [T, T][] {
  const pairs: [T, T][] = [];
  for (let i = 0; i < arr.length; i++) {
    for (let j = i + 1; j < arr.length; j++) {
      pairs.push([arr[i], arr[j]]);
    }
  }
  return pairs;
}

console.log(findPairs([1, 2, 3])); // [[1, 2], [1, 3], [2, 3]]
```

Python:

```python
# O(n²) - Bubble sort
def bubble_sort(arr: list[int]) -> list[int]:
    n = len(arr)
    for i in range(n):
        for j in range(n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
    return arr

# O(n²) - Find all pairs
def find_pairs(arr: list) -> list[tuple]:
    pairs = []
    for i in range(len(arr)):
        for j in range(i + 1, len(arr)):
            pairs.append((arr[i], arr[j]))
    return pairs

print(find_pairs([1, 2, 3]))  # [(1, 2), (1, 3), (2, 3)]
```

Real-world analogy: Comparing every person in a room with every other person (the handshake problem).
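The handshake analogy can be made precise: n items form n(n-1)/2 pairs, which grows like n²/2. A quick check against a pair-generating function like the one above:

```python
# n items form n*(n-1)/2 pairs - the handshake formula.
def pair_count(n: int) -> int:
    return n * (n - 1) // 2

def find_pairs(arr: list) -> list[tuple]:
    return [(arr[i], arr[j])
            for i in range(len(arr))
            for j in range(i + 1, len(arr))]

# The generated pair list always matches the formula.
for n in [3, 10, 100]:
    assert len(find_pairs(list(range(n)))) == pair_count(n)

print(pair_count(10), pair_count(100), pair_count(1000))  # 45 4950 499500
```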
O(2ⁿ) - Exponential Time
Definition: Algorithm doubles with each additional input.
Examples:
- Recursive Fibonacci (naive)
- Power set generation
- Brute force solutions
TypeScript:

```typescript
// O(2ⁿ) - Naive recursive Fibonacci
function fibonacci(n: number): number {
  if (n <= 1) return n;
  // Each call spawns 2 more calls → exponential growth
  return fibonacci(n - 1) + fibonacci(n - 2);
}

// For n=10: 177 calls
// For n=20: ~22,000 calls
// For n=30: ~2.7 million calls!
console.log(fibonacci(10)); // 55, but very slow for n > 40
```

Python:

```python
# O(2ⁿ) - Naive recursive Fibonacci
def fibonacci(n: int) -> int:
    if n <= 1:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)

# Grows exponentially!
print(fibonacci(10))  # 55, but very slow for n > 40
```

Why 2ⁿ? Each call branches into 2 more calls, creating an exponential tree.
Real-world analogy: Chain letters—each person sends to 2 more, who each send to 2 more...
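To verify the growth, here's the Python Fibonacci from above with a call counter bolted on (the `calls` variable is instrumentation, not part of the algorithm):

```python
# Count every call the naive recursion makes.
calls = 0

def fib(n: int) -> int:
    global calls
    calls += 1  # each invocation, including the base cases
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)

result = fib(10)
print(result, calls)  # 55 computed with 177 calls
```

Raising n by 10 multiplies the call count by over 100: that's the exponential tree in action.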
Growth Rate Comparison
| Input Size (n) | O(1) | O(log n) | O(n) | O(n log n) | O(n²) | O(2ⁿ) |
|---|---|---|---|---|---|---|
| 10 | 1 | 3 | 10 | 33 | 100 | 1,024 |
| 100 | 1 | 7 | 100 | 664 | 10,000 | 1.27×10³⁰ |
| 1,000 | 1 | 10 | 1,000 | 9,966 | 1,000,000 | ≈10³⁰¹ |
| 1,000,000 | 1 | 20 | 1,000,000 | 19,931,569 | 10¹² | astronomically large |
Key Takeaway: O(1) and O(log n) scale incredibly well. O(n²) and O(2ⁿ) become unusable for large inputs.
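The table's entries follow directly from the formulas. A quick sketch that recomputes them (values rounded, as in the table):

```python
import math

# Recompute the growth-rate table: one row per input size,
# columns for O(1), O(log n), O(n), O(n log n), O(n²).
for n in [10, 100, 1000, 1_000_000]:
    print(n, 1, round(math.log2(n)), n, round(n * math.log2(n)), n * n)
```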
3. How to Calculate Big O
Rule 1: Drop Constants
Constants don't affect growth rate, so we ignore them.
```typescript
// O(2n) → simplifies to O(n)
function printTwice(arr: number[]): void {
  arr.forEach(x => console.log(x)); // O(n)
  arr.forEach(x => console.log(x)); // O(n)
  // Total: O(n) + O(n) = O(2n) → O(n)
}

// O(n/2) → simplifies to O(n)
function printHalf(arr: number[]): void {
  for (let i = 0; i < arr.length / 2; i++) { // Only half
    console.log(arr[i]);
  }
  // Still O(n) because constants are dropped
}
```

Rule 2: Dominant Term Wins
Keep only the fastest-growing term.
```typescript
// O(n² + n) → simplifies to O(n²)
function example1(arr: number[]): void {
  // First loop: O(n)
  for (let i = 0; i < arr.length; i++) {
    console.log(arr[i]);
  }
  // Nested loops: O(n²)
  for (let i = 0; i < arr.length; i++) {
    for (let j = 0; j < arr.length; j++) {
      console.log(arr[i], arr[j]);
    }
  }
  // Total: O(n) + O(n²) = O(n² + n) → O(n²)
}

// O(n log n + n) → simplifies to O(n log n)
function example2(arr: number[]): void {
  arr.sort((a, b) => a - b); // O(n log n)
  arr.forEach(x => console.log(x)); // O(n)
  // Total: O(n log n)
}
```

Rule 3: Different Inputs = Different Variables
```typescript
// O(n + m) - NOT O(n)
function printBothArrays(arr1: number[], arr2: number[]): void {
  arr1.forEach(x => console.log(x)); // O(n)
  arr2.forEach(x => console.log(x)); // O(m)
  // Total: O(n + m)
}

// O(n × m) - NOT O(n²)
function printPairs(arr1: number[], arr2: number[]): void {
  for (const x of arr1) { // n iterations
    for (const y of arr2) { // m iterations
      console.log(x, y);
    }
  }
  // Total: O(n × m)
}
```

Rule 4: Analyzing Loops
Single loop: O(n)

```typescript
for (let i = 0; i < n; i++) {
  // O(1) operation
}
// Total: O(n)
```

Nested loops (same size): O(n²)

```typescript
for (let i = 0; i < n; i++) {
  for (let j = 0; j < n; j++) {
    // O(1) operation
  }
}
// Total: O(n²)
```

Halving loop: O(log n)

```typescript
for (let i = n; i > 1; i = Math.floor(i / 2)) {
  // O(1) operation
}
// Total: O(log n)
```

4. Space Complexity
Space complexity measures memory usage as input grows.
Types of Space
1. Input Space: Memory for the input itself (doesn't count)
2. Auxiliary Space: Extra memory used by the algorithm (this is what we measure)
O(1) - Constant Space
```typescript
// O(1) space - Only a few variables
function sumArray(arr: number[]): number {
  let total = 0; // O(1) space
  for (const num of arr) {
    total += num;
  }
  return total;
}
```

Python:

```python
# O(1) space
def sum_array(arr: list[int]) -> int:
    total = 0
    for num in arr:
        total += num
    return total
```

O(n) - Linear Space
```typescript
// O(n) space - Creating a new array
function doubleArray(arr: number[]): number[] {
  const result: number[] = []; // O(n) space
  for (const num of arr) {
    result.push(num * 2);
  }
  return result;
}

// O(n) space - Using map (creates a new array)
const doubled = arr.map(x => x * 2); // O(n) space
```

Python:

```python
# O(n) space - Creating a new list
def double_array(arr: list[int]) -> list[int]:
    result = []
    for num in arr:
        result.append(num * 2)
    return result

# O(n) space - List comprehension (creates a new list)
doubled = [x * 2 for x in arr]
```

Recursion Stack Space
```typescript
// O(n) space due to call stack
function factorial(n: number): number {
  if (n <= 1) return 1;
  return n * factorial(n - 1); // Each call uses stack space
}

// Call stack for factorial(5):
// factorial(5) → factorial(4) → factorial(3) → factorial(2) → factorial(1)
// Maximum stack depth: n → O(n) space
```

Python:

```python
# O(n) space due to recursion
def factorial(n: int) -> int:
    if n <= 1:
        return 1
    return n * factorial(n - 1)
```

In-Place vs Not In-Place
In-place algorithms modify the input without extra space → O(1) space
Not in-place algorithms create new data structures → O(n) space
```typescript
// In-place: O(1) space
function reverseInPlace(arr: number[]): void {
  let left = 0, right = arr.length - 1;
  while (left < right) {
    [arr[left], arr[right]] = [arr[right], arr[left]];
    left++;
    right--;
  }
}

// Not in-place: O(n) space
function reverseNewArray(arr: number[]): number[] {
  return [...arr].reverse(); // Creates a new array
}
```

5. Built-In Operations Complexity
TypeScript/JavaScript
| Operation | Time Complexity | Notes |
|---|---|---|
| `arr[i]` | O(1) | Array access |
| `arr.push(x)` | O(1) amortized | Add to end |
| `arr.pop()` | O(1) | Remove from end |
| `arr.unshift(x)` | O(n) | Add to start (shifts all) |
| `arr.shift()` | O(n) | Remove from start (shifts all) |
| `arr.slice()` | O(n) | Copy array |
| `arr.map()` | O(n) | Transform each element |
| `arr.filter()` | O(n) | Filter elements |
| `arr.sort()` | O(n log n) | Sort array |
| `arr.includes(x)` | O(n) | Linear search |
| `arr.indexOf(x)` | O(n) | Linear search |
| `map.get(key)` | O(1) | Hash table lookup |
| `map.set(key, value)` | O(1) | Hash table insert |
| `set.has(x)` | O(1) | Set membership |
Python
| Operation | Time Complexity | Notes |
|---|---|---|
| `arr[i]` | O(1) | List access |
| `arr.append(x)` | O(1) amortized | Add to end |
| `arr.pop()` | O(1) | Remove from end |
| `arr.insert(0, x)` | O(n) | Insert at start |
| `arr.remove(x)` | O(n) | Search and remove |
| `arr[:]` | O(n) | Copy list |
| `[f(x) for x in arr]` | O(n) | List comprehension |
| `sorted(arr)` | O(n log n) | Sort list |
| `x in arr` | O(n) | Linear search (list) |
| `dict[key]` | O(1) | Dictionary lookup |
| `dict[key] = value` | O(1) | Dictionary insert |
| `x in set` | O(1) | Set membership |
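A rough way to feel the difference between O(n) list membership and O(1) set membership (absolute timings vary by machine; only the ratio matters):

```python
import timeit

# Membership tests: a list scans every element, a set hashes directly.
items_list = list(range(100_000))
items_set = set(items_list)

# -1 is never present, so the list test always scans all 100,000 items.
t_list = timeit.timeit(lambda: -1 in items_list, number=100)
t_set = timeit.timeit(lambda: -1 in items_set, number=100)
print(f"list: {t_list:.4f}s  set: {t_set:.6f}s")  # set is far faster
```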
6. Best, Average, Worst Case
Algorithms can have different performance depending on the input.
Example: Linear Search
```typescript
function linearSearch(arr: number[], target: number): number {
  for (let i = 0; i < arr.length; i++) {
    if (arr[i] === target) {
      return i;
    }
  }
  return -1;
}
```

Best Case: O(1) - Target is the first element
Average Case: O(n/2) → O(n) - Target is in the middle
Worst Case: O(n) - Target is the last element or not present
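The three cases are easy to observe with an instrumented Python version of the same search (the step counter is added for illustration):

```python
# Linear search that also reports how many comparisons it made.
def linear_search_counted(arr: list[int], target: int) -> tuple[int, int]:
    steps = 0
    for i, item in enumerate(arr):
        steps += 1  # one comparison per element visited
        if item == target:
            return i, steps
    return -1, steps

arr = list(range(100))
print(linear_search_counted(arr, 0))   # best case: found after 1 comparison
print(linear_search_counted(arr, -1))  # worst case: 100 comparisons, not found
```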
Which One Matters?
In most cases, we focus on worst-case (Big O) because:
- Guarantees performance
- Safer for production systems
- Standard in technical interviews
7. Time vs Space Trade-offs
Often, you can trade space for time (or vice versa).
Example: Fibonacci
Time-optimized (Memoization): O(n) time, O(n) space
```typescript
function fibMemo(n: number, memo: Map<number, number> = new Map()): number {
  if (n <= 1) return n;
  if (memo.has(n)) return memo.get(n)!;
  const result = fibMemo(n - 1, memo) + fibMemo(n - 2, memo);
  memo.set(n, result);
  return result;
}
```

Space-optimized (Iterative): O(n) time, O(1) space

```typescript
function fibIterative(n: number): number {
  if (n <= 1) return n;
  let prev = 0, curr = 1;
  for (let i = 2; i <= n; i++) {
    [prev, curr] = [curr, prev + curr];
  }
  return curr;
}
```

Common Trade-off: Hash Tables
```typescript
// Two Sum: Find two numbers that add to target

// Brute force: O(n²) time, O(1) space
function twoSumBrute(arr: number[], target: number): [number, number] | null {
  for (let i = 0; i < arr.length; i++) {
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[i] + arr[j] === target) {
        return [i, j];
      }
    }
  }
  return null;
}

// Optimized: O(n) time, O(n) space
function twoSumHash(arr: number[], target: number): [number, number] | null {
  const seen = new Map<number, number>();
  for (let i = 0; i < arr.length; i++) {
    const complement = target - arr[i];
    if (seen.has(complement)) {
      return [seen.get(complement)!, i];
    }
    seen.set(arr[i], i);
  }
  return null;
}
```

Trade-off: We use extra space (the hash table) to achieve faster time.
8. Interview Tips
How to Communicate Complexity
Good:
"This solution has O(n) time complexity because we iterate through the array once. Space complexity is O(1) since we only use a few variables."
Bad:
"It's pretty fast."
Step-by-Step Approach
- Identify loops: Each loop level multiplies complexity
- Check recursion depth: Recursion adds to space complexity
- Consider built-in operations: sort() is O(n log n), includes() is O(n)
- State both time and space: "O(n) time, O(1) space"
Red Flags in Interviews
- O(n²) or worse when O(n) or O(n log n) is possible
- O(2ⁿ) without memoization
- Not considering edge cases (empty arrays, duplicates)
Optimization Strategy
- Start with brute force: Get a working solution first
- Analyze complexity: "This is O(n²), can we do better?"
- Identify bottlenecks: Nested loops? Repeated work?
- Apply patterns: Hash tables for lookups, two pointers, binary search
- Consider trade-offs: Is O(n) space acceptable for O(n log n) → O(n) time?
9. Practice Problems
Problem 1: Find Duplicates
Brute Force: O(n²) time, O(1) space
```typescript
function hasDuplicateBrute(arr: number[]): boolean {
  for (let i = 0; i < arr.length; i++) {
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[i] === arr[j]) return true;
    }
  }
  return false;
}
```

Optimized: O(n) time, O(n) space
```typescript
function hasDuplicateSet(arr: number[]): boolean {
  const seen = new Set<number>();
  for (const num of arr) {
    if (seen.has(num)) return true;
    seen.add(num);
  }
  return false;
}
```

Problem 2: Reverse String
In-place: O(n) time, O(1) space
```typescript
function reverseString(s: string[]): void {
  let left = 0, right = s.length - 1;
  while (left < right) {
    [s[left], s[right]] = [s[right], s[left]];
    left++;
    right--;
  }
}
```

Problem 3: Palindrome Check
Two Pointers: O(n) time, O(1) space
```typescript
function isPalindrome(s: string): boolean {
  let left = 0, right = s.length - 1;
  while (left < right) {
    if (s[left] !== s[right]) {
      return false;
    }
    left++;
    right--;
  }
  return true;
}
```

Summary and Key Takeaways
✅ Big O measures scalability, not absolute speed
✅ Common complexities: O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(2ⁿ)
✅ Drop constants and keep dominant term: O(2n + 5) → O(n)
✅ Analyze loops: Nested loops multiply complexity
✅ Space complexity matters: Consider memory usage, not just time
✅ Recursion uses stack space: O(n) space for depth n
✅ Built-in operations have complexity: Know your language's operations
✅ Trade-offs exist: Time vs space, simplicity vs efficiency
✅ Focus on worst-case for interviews and production
✅ Practice identifying patterns: Hash tables, two pointers, binary search
Next Steps
Now that you understand complexity analysis, you're ready to:
Continue the DSA series:
- Problem-Solving Patterns & Techniques - Apply complexity analysis to common patterns
- Arrays & Strings - Learn data structure operations with complexity
Back to roadmap:
- Data Structures & Algorithms Roadmap - Complete learning path
Practice Resources:
- LeetCode Easy Problems - Filter by "Array" and "Easy"
- HackerRank - Data Structures track
- Big-O Cheat Sheet - Quick reference
Part of the Data Structures & Algorithms Roadmap series