Sorting Algorithms — Visualized and Compared
A quick comparison of common sorting algorithms with a built-in visualizer component for bubble and insertion sort.
This post compares several classic sorting algorithms and includes an inline visualizer you can play with. Try switching algorithms and watch the swaps.
Play with a Visualizer
All demos below use 50 items and do not auto-start. Click “play” to animate.
Bubble Sort
Bubble sort repeatedly compares adjacent elements and swaps them if they’re out of order. Each pass “bubbles” the largest item to the end.
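A minimal sketch of the idea in Python (function name and early-exit flag are my choices, not from the visualizer):

```python
def bubble_sort(items):
    """Repeatedly swap adjacent out-of-order pairs; each pass bubbles the largest to the end."""
    a = list(items)
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):  # the last i items are already in place
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:  # a pass with no swaps means the array is sorted
            break
    return a
```

The `swapped` flag gives the O(n) best case on already-sorted input.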
Quick Sort
Quick sort partitions around a pivot, then recursively sorts the two sides. Average case is fast, but bad pivots can degrade performance.
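As a sketch, here is the simple out-of-place variant (visualizers usually animate the in-place Lomuto or Hoare partition instead; this version trades memory for clarity):

```python
def quick_sort(items):
    """Partition around a pivot, then recursively sort each side."""
    if len(items) <= 1:
        return list(items)
    pivot = items[len(items) // 2]  # middle element; a poor pivot degrades to O(n²)
    less = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    greater = [x for x in items if x > pivot]
    return quick_sort(less) + equal + quick_sort(greater)
```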
Insertion Sort
Insertion sort builds a sorted prefix by inserting each next element into its correct position within the prefix.
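In Python, a minimal version of that insert-into-prefix loop might look like:

```python
def insertion_sort(items):
    """Grow a sorted prefix by inserting each next element into place."""
    a = list(items)
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:  # shift larger prefix elements right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a
```

On nearly-sorted input the inner loop barely runs, which is where the O(n) best case comes from.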
Selection Sort
Selection sort repeatedly selects the smallest remaining element and swaps it into the next position of the growing sorted prefix. It performs at most n − 1 swaps but always takes O(n²) comparisons.
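A straightforward sketch (names are illustrative):

```python
def selection_sort(items):
    """Find the minimum of the unsorted suffix and swap it to the front of that suffix."""
    a = list(items)
    n = len(a)
    for i in range(n - 1):
        m = i
        for j in range(i + 1, n):
            if a[j] < a[m]:
                m = j
        if m != i:
            a[i], a[m] = a[m], a[i]  # at most one swap per pass
    return a
```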
Shell Sort
Shell sort performs gapped insertion sorts with decreasing gaps, improving performance on medium-sized arrays.
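A sketch using the simple halving gap sequence (many other gap sequences exist and perform better; this choice is just for illustration):

```python
def shell_sort(items):
    """Gapped insertion sorts with shrinking gaps; final gap of 1 is plain insertion sort."""
    a = list(items)
    gap = len(a) // 2
    while gap > 0:
        for i in range(gap, len(a)):
            key = a[i]
            j = i
            while j >= gap and a[j - gap] > key:  # insertion sort over elements gap apart
                a[j] = a[j - gap]
                j -= gap
            a[j] = key
        gap //= 2
    return a
```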
Merge Sort
Merge sort recursively splits the array and merges sorted halves. It’s stable and reliably O(n log n) but needs extra memory.
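The split-and-merge shape, sketched in Python:

```python
def merge_sort(items):
    """Split in half, sort each half, merge the sorted halves."""
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:  # <= keeps equal elements in order, i.e. stable
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]  # one side is empty; append the rest
```

The slices and the `merged` buffer are where the extra O(n) memory goes.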
Heap Sort
Heap sort builds a max-heap and repeatedly extracts the maximum to the end. In-place, O(n log n), but not stable.
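A sketch of the heapify-then-extract structure (manual sift-down rather than a library heap, since the point is the in-place max-heap):

```python
def heap_sort(items):
    """Build a max-heap in place, then repeatedly swap the max to the end."""
    a = list(items)
    n = len(a)

    def sift_down(root, end):
        while True:
            child = 2 * root + 1
            if child >= end:
                return
            if child + 1 < end and a[child + 1] > a[child]:
                child += 1  # pick the larger child
            if a[root] >= a[child]:
                return
            a[root], a[child] = a[child], a[root]
            root = child

    for i in range(n // 2 - 1, -1, -1):  # heapify: sift down all internal nodes
        sift_down(i, n)
    for end in range(n - 1, 0, -1):  # extract max into the shrinking suffix
        a[0], a[end] = a[end], a[0]
        sift_down(0, end)
    return a
```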
Counting Sort
Counting sort counts occurrences of each value and reconstructs the array. Great for small integer ranges.
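A minimal sketch, assuming small non-negative integers (the k in O(n + k) is the size of the count array):

```python
def counting_sort(items):
    """Count occurrences of each value, then emit values in order of their counts."""
    if not items:
        return []
    counts = [0] * (max(items) + 1)  # assumes non-negative ints with a small max
    for x in items:
        counts[x] += 1
    out = []
    for value, c in enumerate(counts):
        out.extend([value] * c)
    return out
```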
Bucket Sort
Bucket sort distributes values into buckets, sorts each bucket, then concatenates them. Works best with uniform distributions.
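A sketch assuming floats uniformly distributed in [0, 1), which makes the bucket index a simple multiplication (the bucket count is an arbitrary choice here):

```python
def bucket_sort(items, n_buckets=10):
    """Scatter values into buckets by range, sort each bucket, concatenate."""
    buckets = [[] for _ in range(n_buckets)]
    for x in items:  # assumes 0 <= x < 1
        buckets[int(x * n_buckets)].append(x)
    out = []
    for b in buckets:
        out.extend(sorted(b))  # any sort works per bucket; insertion sort is typical
    return out
```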
Radix Sort
Radix sort sorts by digits (least-significant first), using a stable sub-sort. Efficient for fixed-length keys.
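A least-significant-digit sketch for non-negative integers, using a stable bucket pass per digit:

```python
def radix_sort(items, base=10):
    """LSD radix sort: stable pass per digit, least-significant digit first."""
    a = list(items)
    if not a:
        return a
    place = 1
    while place <= max(a):
        buckets = [[] for _ in range(base)]
        for x in a:  # appending preserves order, so each pass is stable
            buckets[(x // place) % base].append(x)
        a = [x for b in buckets for x in b]
        place *= base
    return a
```

Stability of the per-digit pass is what makes earlier passes survive later ones.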
Bogo (Monkey) Sort
Bogo sort randomly shuffles until the array is sorted. It’s intentionally terrible and only for fun.
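For completeness, the whole algorithm fits in a few lines (only run it on tiny inputs):

```python
import random

def bogo_sort(items):
    """Shuffle until sorted. Expected shuffles grow factorially; strictly for fun."""
    a = list(items)
    while any(a[i] > a[i + 1] for i in range(len(a) - 1)):
        random.shuffle(a)
    return a
```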
Algorithms Covered
- Bubble sort: simple but O(n²). Repeatedly swap adjacent out-of-order elements.
- Insertion sort: good for nearly-sorted arrays; O(n²) worst, O(n) best.
- Selection sort: always O(n²); minimal swaps but not stable.
- Shell sort: gapped insertion; practical improvements over O(n²) on medium inputs.
- Merge sort: O(n log n) stable, great for large arrays; needs extra memory.
- Quick sort: average O(n log n), in-place; pivot choice affects worst case.
- Heap sort: O(n log n) in-place; not stable.
- Counting sort: O(n + k) for small integer ranges; not comparison-based.
- Bucket sort: distributes to buckets then sorts each; good for uniform distributions.
- Radix sort: O(d*(n+k)); great for fixed-length integers/strings.
- Bogo (monkey) sort: randomized shuffling until sorted; only for fun.
Big-O Summary
| Algorithm | Best | Average | Worst | Stable | In-place |
|---|---|---|---|---|---|
| Bubble | O(n) | O(n²) | O(n²) | Yes | Yes |
| Insertion | O(n) | O(n²) | O(n²) | Yes | Yes |
| Selection | O(n²) | O(n²) | O(n²) | No | Yes |
| Shell | O(n log n) | depends on gap sequence | O(n²) | No | Yes |
| Merge | O(n log n) | O(n log n) | O(n log n) | Yes | No |
| Quick | O(n log n) | O(n log n) | O(n²) | No | Yes |
| Heap | O(n log n) | O(n log n) | O(n log n) | No | Yes |
| Counting | O(n + k) | O(n + k) | O(n + k) | Yes | No |
| Bucket | O(n + k) | O(n + k) | O(n²) | Depends | No |
| Radix | O(d*(n+k)) | O(d*(n+k)) | O(d*(n+k)) | Yes | No |
| Bogo | O(n) | O((n+1)!) | Unbounded | No | Yes |
Practical Notes
- Use insertion sort for tiny arrays or when inputs are nearly sorted.
- Merge sort and Timsort are common in standard libraries when general-purpose stability is needed.
- Quick sort variants dominate in practice; choosing good pivots (median-of-three) and tail recursion elimination are key.
- Avoid O(n²) methods for large n unless input characteristics justify it.
Want more? We can extend the visualizer to support merge/quick sort and show comparisons per second (ops) and exact swap counts.