Build your algorithmic thinking foundation
Section 1 of 5
Master the art of analyzing algorithm performance with Big O, Omega, and Theta notations. Learn to predict and compare the efficiency of different algorithms with interactive visualizations.
Predict CPU consumption and execution time
Analyze RAM consumption and space requirements
Evaluate bandwidth consumption for data transfer
The running time of an algorithm is the total number of primitive operations executed (machine-independent steps). This is also known as the algorithm's time complexity.
Time complexity: measured by counting key operations, such as comparisons (see the counting sketch below).
Space complexity: measured by the maximum memory space required.
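As a minimal illustration (the helper name is hypothetical, not part of the course), a linear search can report how many comparisons it performed, making the machine-independent step count concrete:

```python
def linear_search_with_count(arr, target):
    """Linear search that also counts comparisons, the key operation here."""
    comparisons = 0
    for i, value in enumerate(arr):
        comparisons += 1              # one comparison per element examined
        if value == target:
            return i, comparisons
    return -1, comparisons

# Worst case: the target is absent, so all n elements are compared once.
index, ops = linear_search_with_count([4, 8, 15, 16, 23, 42], 99)
print(index, ops)  # -1 6
```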
O(g(n)) — Upper bound (worst case): the algorithm will never perform worse than this.
f(n) ≤ c × g(n) for all n ≥ n₀

Ω(g(n)) — Lower bound (best case): the algorithm will never perform better than this.
f(n) ≥ c × g(n) for all n ≥ n₀

Θ(g(n)) — Tight bound: the algorithm's growth rate is exactly g(n), bounded above and below.
c₁ × g(n) ≤ f(n) ≤ c₂ × g(n) for all n ≥ n₀
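For example, f(n) = 3n + 2 is O(n): with the witnesses c = 4 and n₀ = 2, we have 3n + 2 ≤ 4n whenever n ≥ 2. A quick numeric sanity check (an illustrative sketch, not part of the course material):

```python
# Verify f(n) = 3n + 2 <= c * g(n) with g(n) = n, c = 4, for all n >= n0 = 2.
c, n0 = 4, 2
assert all(3 * n + 2 <= c * n for n in range(n0, 10_000))
print("3n + 2 is O(n) with c = 4, n0 = 2")
```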
O(1) — Constant: operations remain constant regardless of input size.
Examples: array access arr[5], hash table lookup.
Always 1 operation.
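Both examples take the same time no matter how large the data structure grows (the names here are illustrative):

```python
arr = [10, 20, 30, 40, 50, 60]
prices = {"apple": 1.25, "banana": 0.50}

x = arr[5]            # array indexing: one step, regardless of len(arr)
p = prices["apple"]   # hash table lookup: expected O(1) on average
```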
O(log n) — Logarithmic: time increases slowly as the input grows.
Example: binary search in a sorted array.
Halves the search space at each step.
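A minimal binary search sketch, showing how each iteration discards half of the remaining elements:

```python
def binary_search(arr, target):
    """Binary search over a sorted list: O(log n) comparisons."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        if arr[mid] < target:
            lo = mid + 1          # discard the lower half
        else:
            hi = mid - 1          # discard the upper half
    return -1

print(binary_search([2, 3, 5, 8, 13, 21, 34], 13))  # 4
```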
O(n) — Linear: time increases directly with the input size.
Examples: linear search, finding the maximum in an array.
One operation per element.
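Finding the maximum illustrates the one-operation-per-element pattern:

```python
def find_max(arr):
    """Scan every element once: O(n) time, O(1) extra space."""
    best = arr[0]
    for value in arr[1:]:         # one comparison per remaining element
        if value > best:
            best = value
    return best

print(find_max([7, 2, 9, 4]))  # 9
```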
O(n log n) — Linearithmic: efficient divide-and-conquer algorithms.
Examples: merge sort, heap sort.
n × log n operations.
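A compact merge sort sketch: the input is halved log n times, and each level of recursion merges n elements in linear time:

```python
def merge_sort(arr):
    """Divide (log n levels), then merge each level in O(n): O(n log n) overall."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left, right = merge_sort(arr[:mid]), merge_sort(arr[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # merge step: linear in the sublists
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 8, 1, 9, 3]))  # [1, 2, 3, 5, 8, 9]
```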
O(n²) — Quadratic: time increases with the square of the input size.
Examples: bubble sort, nested loops over the same input.
n² operations for n elements.
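Bubble sort's nested loops make the quadratic cost visible: the outer loop runs n times, and each inner pass compares adjacent pairs:

```python
def bubble_sort(arr):
    """Nested loops: about n * n comparisons in the worst case, O(n^2)."""
    a = list(arr)                          # sort a copy, leave the input intact
    n = len(a)
    for i in range(n):
        for j in range(n - 1 - i):         # each pass bubbles the max rightward
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```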
O(2ⁿ) — Exponential: time doubles with each additional input element.
Example: brute-force password cracking.
Exponential explosion in the number of operations.
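A toy brute-force sketch (the alphabet, secret, and function names are illustrative): over a two-letter alphabet, each extra character of password length doubles the candidate space:

```python
from itertools import product

def brute_force(is_correct, alphabet="ab", max_len=4):
    """Try every candidate up to max_len; 2^k candidates of length k here."""
    for length in range(1, max_len + 1):
        for letters in product(alphabet, repeat=length):
            guess = "".join(letters)
            if is_correct(guess):
                return guess
    return None

secret = "abba"
print(brute_force(lambda g: g == secret))  # abba, after at most 2 + 4 + 8 + 16 guesses
```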
A time-space tradeoff is a situation where memory use can be reduced at the cost of slower program execution, and conversely, computation time can be reduced at the cost of increased memory use.
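Memoization is a classic instance of this tradeoff: caching previously computed results spends memory to avoid recomputation. A sketch using Python's functools.lru_cache:

```python
from functools import lru_cache

@lru_cache(maxsize=None)          # cache every result: memory buys speed
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

# Without the cache this recursion is O(2^n) time; with it, O(n) time
# at the cost of O(n) memory for the cached results.
print(fib(40))  # 102334155
```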