Why Should You Know Sorting Algorithms?
Sorting is one of the most fundamental and important operations in computer science. It is used in nearly every software system — from database queries and search engines to recommendation systems. Sorting is also an essential topic in coding interviews, and understanding sorting algorithms naturally leads to grasping core concepts like divide and conquer, heaps, and stability.
In this guide, we compare the 7 most widely used sorting algorithms in terms of time complexity, stability, and space complexity, and explain how each one works.
At a Glance
| Algorithm | Best | Average | Worst | Space | Stable |
|---|---|---|---|---|---|
| Bubble Sort | O(n) | O(n²) | O(n²) | O(1) | Yes |
| Insertion Sort | O(n) | O(n²) | O(n²) | O(1) | Yes |
| Selection Sort | O(n²) | O(n²) | O(n²) | O(1) | No |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes |
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) | No |
| Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) | No |
| Counting Sort | O(n+k) | O(n+k) | O(n+k) | O(k) | Yes |
1. Bubble Sort
Bubble Sort compares adjacent elements and swaps them if they are in the wrong order. Like bubbles rising to the surface, the largest values "bubble up" to the end of the array.
How It Works:
- Compare adjacent elements from start to end
- Swap if they are in the wrong order
- After each pass, the largest unsorted element is in its correct position
- Repeat until no swaps are needed
Time Complexity: Best O(n), Average/Worst O(n²)
Simple to implement and great for educational purposes, but rarely used in practice due to its poor performance.
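The passes described above can be sketched in Python as follows (a minimal illustration; the function name and the early-exit flag are choices made here, not a standard API). The `swapped` flag is what gives the O(n) best case on already-sorted input:

```python
def bubble_sort(arr):
    """Sort a list in place by repeatedly swapping adjacent out-of-order pairs."""
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        # After pass i, the last i elements are already in place.
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:  # no swaps means the array is sorted -> best case O(n)
            break
    return arr
```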
2. Insertion Sort
Insertion Sort works like sorting cards in your hand. You take each new card and insert it into its proper position among the already sorted cards.
How It Works:
- Start from the second element
- Compare the current element with the sorted portion
- Find the correct position and insert
- Repeat for all elements
Time Complexity: Best O(n), Average/Worst O(n²)
Extremely efficient for nearly sorted data and small datasets. Java's Arrays.sort() switches to Insertion Sort for small subarrays for exactly this reason.
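The card-insertion idea above translates directly to a few lines of Python (a sketch; the function name is illustrative). Elements of the sorted portion are shifted right until the gap for the current element is found:

```python
def insertion_sort(arr):
    """Sort a list in place by inserting each element into the sorted prefix."""
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # Shift larger elements of the sorted prefix one slot to the right.
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key  # insert into the gap
    return arr
```

On nearly sorted input the inner `while` loop barely runs, which is why the best case is O(n).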
3. Selection Sort
Selection Sort repeatedly finds the minimum element from the unsorted portion and places it at the beginning.
How It Works:
- Find the minimum in the unsorted portion
- Swap it with the first unsorted element
- Expand the sorted portion by one
- Repeat until fully sorted
Time Complexity: Always O(n²)
The advantage is that it performs at most O(n) swaps, but it always makes O(n²) comparisons regardless of the data's initial order.
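A minimal Python sketch of the selection step (function name chosen here for illustration). Note that each outer iteration performs at most one swap, matching the O(n) swap bound mentioned above:

```python
def selection_sort(arr):
    """Sort a list in place by repeatedly selecting the minimum of the unsorted part."""
    n = len(arr)
    for i in range(n - 1):
        min_idx = i
        # Scan the unsorted portion for the smallest element.
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        if min_idx != i:  # at most one swap per pass
            arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr
```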
4. Merge Sort
Merge Sort is a classic example of the Divide and Conquer strategy. It splits the array in half, recursively sorts each half, then merges the sorted halves together.
How It Works:
- Split the array in half (Divide)
- Recursively sort each half (Conquer)
- Merge the two sorted halves (Merge)
Time Complexity: Always O(n log n)
Provides stable and predictable performance. The downside is the O(n) extra memory required, but for linked lists, it can be implemented with O(1) extra space.
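The divide-and-merge steps above can be sketched in Python (a straightforward top-down version; names are illustrative). Using `<=` when comparing the two halves is what keeps the sort stable:

```python
def merge_sort(arr):
    """Return a new sorted list using top-down merge sort."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])    # Divide + Conquer
    right = merge_sort(arr[mid:])

    # Merge: repeatedly take the smaller front element of the two halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:     # <= preserves the order of equal elements
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```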
5. Quick Sort
Quick Sort partitions elements around a pivot — smaller values go left, larger values go right. It is one of the most widely used sorting algorithms in practice.
How It Works:
- Choose a pivot element
- Partition: elements smaller than pivot go left, larger go right
- Recursively sort each partition
Time Complexity: Best/Average O(n log n), Worst O(n²)
On average, it is the fastest general-purpose sorting algorithm. It has excellent cache performance and can sort in-place. The O(n²) worst case can be made very unlikely with smart pivot selection, such as a random pivot or median-of-three.
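An in-place sketch of the partition scheme above (a Lomuto-style partition with a random pivot as one possible worst-case guard; names and details are choices made here, not the only way to write Quick Sort):

```python
import random

def quick_sort(arr, lo=0, hi=None):
    """Sort a list in place using Lomuto partitioning with a random pivot."""
    if hi is None:
        hi = len(arr) - 1
    if lo < hi:
        # Random pivot makes the O(n^2) worst case very unlikely.
        pivot_idx = random.randint(lo, hi)
        arr[pivot_idx], arr[hi] = arr[hi], arr[pivot_idx]
        pivot = arr[hi]

        # Partition: everything < pivot ends up left of position i + 1.
        i = lo - 1
        for j in range(lo, hi):
            if arr[j] < pivot:
                i += 1
                arr[i], arr[j] = arr[j], arr[i]
        arr[i + 1], arr[hi] = arr[hi], arr[i + 1]  # pivot lands in final position

        quick_sort(arr, lo, i)       # recurse on the smaller-than-pivot side
        quick_sort(arr, i + 2, hi)   # recurse on the larger-than-pivot side
    return arr
```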
6. Heap Sort
Heap Sort uses the max-heap (or min-heap) data structure. It repeatedly extracts the maximum element from the heap.
How It Works:
- Build a max-heap from the array (Build Heap)
- Move the root (maximum) to the end of the array
- Reduce heap size and restore heap property (Heapify)
- Repeat until the heap is empty
Time Complexity: Always O(n log n)
Guarantees O(n log n) with no extra memory. Closely related to priority queues — understanding Heap Sort makes priority queue problems much easier.
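The build-then-extract loop above can be sketched in Python (a minimal in-place version; `sift_down` is a name chosen here for the heapify step). The array itself doubles as the heap, which is what keeps the extra space at O(1):

```python
def heap_sort(arr):
    """Sort a list in place using an array-backed max-heap."""
    n = len(arr)

    def sift_down(root, size):
        """Restore the max-heap property for the subtree at `root`."""
        while True:
            largest = root
            left, right = 2 * root + 1, 2 * root + 2
            if left < size and arr[left] > arr[largest]:
                largest = left
            if right < size and arr[right] > arr[largest]:
                largest = right
            if largest == root:
                return
            arr[root], arr[largest] = arr[largest], arr[root]
            root = largest

    # Build Heap: heapify from the last internal node up; O(n) overall.
    for i in range(n // 2 - 1, -1, -1):
        sift_down(i, n)

    # Repeatedly move the max (root) to the end and shrink the heap.
    for end in range(n - 1, 0, -1):
        arr[0], arr[end] = arr[end], arr[0]
        sift_down(0, end)
    return arr
```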
7. Counting Sort
Counting Sort is a non-comparison-based sorting algorithm. It counts the occurrences of each value to determine the sorted order.
How It Works:
- Count the occurrences of each value in a count array
- Compute the cumulative sum of the count array
- Traverse the original array in reverse, placing each element in its correct position
Time Complexity: O(n + k), where k is the range of values
Extremely efficient when the range of values (k) is not much larger than the number of elements (n). It achieves linear-time sorting, sidestepping the O(n log n) lower bound that applies only to comparison-based sorts.
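The three steps above map directly to Python (a sketch for non-negative integers in the range [0, k]; the function name and the explicit `k` parameter are choices made here). The reverse traversal in the final step is what makes the sort stable:

```python
def counting_sort(arr, k):
    """Stably sort non-negative integers in [0, k] without comparisons."""
    # Step 1: count occurrences of each value.
    count = [0] * (k + 1)
    for v in arr:
        count[v] += 1

    # Step 2: prefix sums; count[v] becomes the number of elements <= v.
    for v in range(1, k + 1):
        count[v] += count[v - 1]

    # Step 3: place elements from the back so equal values keep their order.
    out = [0] * len(arr)
    for v in reversed(arr):
        count[v] -= 1
        out[count[v]] = v
    return out
```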
Which Sort for Which Situation?
Nearly sorted data → Insertion Sort
- Best case O(n), extremely fast.
Stability required → Merge Sort
- Preserves the relative order of equal elements.
Fastest on average → Quick Sort
- Great cache performance, most widely used in practice.
Memory constrained → Heap Sort
- Guarantees O(n log n) with O(1) extra space.
Limited range of values → Counting Sort
- Achieves linear time O(n+k).
See It in Action on AlgoNote
Reading about algorithms is one thing — seeing them work step by step is another. On AlgoNote, you can visually watch each sorting algorithm in action. Adjust the speed, change the data, and experiment to build real understanding.