Name | Best-case time complexity | Average time complexity | Worst-case time complexity | Space complexity | Stable? |
---|---|---|---|---|---|
Bubble sort | O(n) comparisons, O(1) swaps | O(n^2) comparisons and swaps | O(n^2) comparisons and swaps | O(1) | Yes |
Counting sort | O(n+k) | O(n+k) | O(n+k) | O(n+k) | Yes |
Heapsort | O(n log n) | O(n log n) | O(n log n) | O(1) | No |
Insertion sort | O(n) comparisons, O(1) swaps | O(n^2) comparisons and swaps | O(n^2) comparisons and swaps | O(1) | Yes |
Merge sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes |
Quicksort | O(n log n) | O(n log n) | O(n^2) | O(log n) | No |
Radix sort | O(k*n) | O(k*n) | O(k*n) | O(k+n) | Yes |
Selection sort | O(n^2) comparisons, O(1) swaps | O(n^2) comparisons, O(n) swaps | O(n^2) comparisons, O(n) swaps | O(1) | No |
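As a concrete illustration of the table's best-case entries, here is a minimal insertion sort sketch in Python (the function name `insertion_sort` is ours, not from the table). On already-sorted input the inner loop never runs, which is where the O(n) comparisons / O(1) swaps best case comes from.

```python
def insertion_sort(items):
    """Sort a list in place. Stable: equal elements keep their order.

    Best case (already sorted): the while loop never runs, so only
    n - 1 comparisons and no shifts -> O(n) comparisons, O(1) swaps.
    Worst case (reverse sorted): every element shifts all the way
    left -> O(n^2) comparisons and swaps.
    """
    for i in range(1, len(items)):
        key = items[i]
        j = i - 1
        # Shift larger elements one slot to the right.
        # The strict '>' (not '>=') is what keeps the sort stable.
        while j >= 0 and items[j] > key:
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = key
    return items

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```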
Quicksort is O(n log n) on average but O(n^2) in the worst case because:

- Each partitioning pass is O(n) because you have to visit every element.
- Partitioning is repeated once per level of the recursion tree.
- Hence, the time complexity is O(n * height).

Best case: you choose a pivot that partitions the input in half. In this case, the height of the tree is log(n), because a binary tree of height h holds at most 2^h - 1 nodes, so n ≤ 2^h - 1.

Worst case: you choose a pivot that is the smallest or largest element. In this case, each partition peels off only one element, so the height of the tree is n.
To avoid the worst case:

- Randomly choose a pivot.
- Choose the median as a pivot.
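Here is a rough sketch of the random-pivot strategy in Python (the function name and the list-based partitioning are our choices for brevity; an in-place partition would be needed to match the O(log n) space bound in the table above):

```python
import random

def quicksort(items):
    """Return a sorted copy of items.

    A random pivot makes the O(n^2) worst case (always picking the
    smallest or largest element) extremely unlikely, so the expected
    recursion depth is O(log n) and the expected total work O(n log n).
    """
    if len(items) <= 1:
        return items
    pivot = random.choice(items)  # random pivot guards against sorted input
    # Partitioning visits every element, so each level of the tree costs O(n).
    smaller = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    larger = [x for x in items if x > pivot]
    return quicksort(smaller) + equal + quicksort(larger)

print(quicksort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
```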
Name | Strengths | Weaknesses |
---|---|---|
Quicksort | Fastest in practice; parallelizable because it is a divide-and-conquer algorithm | Slow worst case; not stable |
Merge sort | Fast worst case; parallelizable because it is a divide-and-conquer algorithm; stable | Space inefficient |
Heapsort | Fast worst case; space efficient | Slow in practice; not stable |
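A minimal merge sort sketch (names ours) makes the table's tradeoffs visible: the merge step allocates an O(n) auxiliary list, which is the "space inefficient" entry, while taking from the left half on ties is what keeps it stable.

```python
def merge_sort(items):
    """Return a sorted copy in O(n log n) time and O(n) extra space."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Merge step: the auxiliary 'merged' list is the O(n) space cost.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        # '<=' takes from the left half on ties, which keeps the sort stable.
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]
```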
- A space complexity of O(1) means the algorithm sorts in place.
- Stable sorting algorithms maintain the relative order of records with equal values (see the sketch after this list).
- Constant factors are dropped in big-O notation, so n and 2n are both O(n).
- By convention, "quicksort" and "heapsort" are written as single words, but "merge sort" is two words.
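As a quick demonstration of the stability note above, using Python's built-in `sorted`, which is stable (CPython uses Timsort, a stable hybrid of merge sort and insertion sort); the records here are made-up sample data:

```python
# Records: (name, grade). Two records share grade 'B' and two share 'A'.
records = [("Ann", "B"), ("Bob", "A"), ("Cat", "B"), ("Dan", "A")]

# Stable sort by grade: Ann still precedes Cat, and Bob precedes Dan,
# because a stable sort preserves the relative order of equal keys.
by_grade = sorted(records, key=lambda r: r[1])
print(by_grade)
# [('Bob', 'A'), ('Dan', 'A'), ('Ann', 'B'), ('Cat', 'B')]
```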