Insertion, Bubble, and Selection Sorts
Insertion Sort
Have you ever played a card game like Big 2? It's a game with many rule variations, but the objective is always to run out of cards first. To visualize how insertion sort works, imagine the dealer gave you a card.
King of Hearts! What can you say, you’re a lady’s man.
The dealer comes around to you again and deals you a Four of Hearts. You put it next to the King.
Then you get a Seven of Clubs, and you put it in between the King and the Four to keep all the cards in order.
As you are dealt cards, you insert them one by one until your entire hand is sorted. This is the basis of insertion sort. The worst case running time for insertion sort is O(n^2), which occurs when the input is in reverse order. So for an array, [5, 4, 3, 2, 1] is the worst case for insertion sort despite being easy for humans to process. The best case is already-sorted input, so [1, 2, 3, 4, 5] yields a running time of O(n). The average case is O(n^2) as well, since lower-order terms are dropped. This means insertion sort is only useful when the data is nearly sorted, which is exactly why hybrid quicksorts take advantage of insertion sort on small, nearly-sorted partitions.
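The card analogy translates almost directly into code. Here's a minimal Python sketch: each new element is the "card" just dealt, and larger elements in the sorted prefix shift right to open a slot for it.

```python
def insertion_sort(arr):
    """Sort a list in place by inserting each element into the sorted prefix."""
    for i in range(1, len(arr)):
        key = arr[i]  # the "card" just dealt
        j = i - 1
        # Shift larger elements one slot right to make room for key
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
    return arr
```

Note how the inner while-loop exits immediately when the data is already in order, which is where the O(n) best case comes from.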
Bubble Sort
So in the last post, I trashed bubble sort. I claimed that it has no redeeming features, and I will NOT retract that statement. It is only taught in intro CS classes because it is easy to memorize. You know that nice card sorting analogy I presented for insertion sort? Well, I can't think of one for bubble sort. It's very unintuitive and relatively slow, even among other O(n^2) sorts. It is essentially two for-loops and an if-statement wrapped around a comparison.
To visualize it, think of an array of size five.
It will compare indices 0 and 1 and swap if needed. Then indices 1 and 2. Indices 2 and 3. Indices 3 and 4. That's the first round. Then it goes back and does it again, but ignores the last position because it already contains the largest element.
So now, comparing 0 and 1, 1 and 2, 2 and 3. And that’s the second round.
This is repeated until it is finished.
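The rounds described above can be sketched in a few lines of Python; this is the plain version with no early-exit optimization, which is the variant being criticized here.

```python
def bubble_sort(arr):
    """Sort a list in place by repeatedly swapping adjacent out-of-order pairs."""
    n = len(arr)
    # After each round, the largest remaining element has "bubbled" to the
    # end of the unsorted region, so each round stops one position earlier.
    for round_end in range(n - 1, 0, -1):
        for i in range(round_end):
            if arr[i] > arr[i + 1]:
                arr[i], arr[i + 1] = arr[i + 1], arr[i]
    return arr
```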
While it is understandable and easy to implement, there are simply better alternatives. With the average and worst cases both O(n^2) (the best case drops to O(n) only if you add an early-exit check for a pass with no swaps), there should be no reason for you to use it.
Selection Sort
True to its name, selection sort does exactly what you think it does: it selects.
Suppose you want to line up your collection of super yachts by length but they are docked in random orders.
You run up and down the dock asking the crews to shout out how long their boats are. Once you finish the first pass, you order your biggest boat to move to the rightmost spot.
You make a second pass and then order your second-longest boat to move next to the first one.
This repeats until your puny 60 foot yacht is put in its miserable place.
The number of comparisons in selection sort is O(n^2), while the number of swaps is O(n). A smart way to keep record swaps cheap is to swap pointers to the records instead of the records themselves. You will need extra space for the pointers, but this trade-off is great time-wise.
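The yacht-lineup procedure above can be sketched in Python like so: each pass scans the unsorted region for the largest element (all comparisons, no swaps), then performs exactly one swap to park it at the right end, which is why swaps total O(n).

```python
def selection_sort(arr):
    """Sort a list in place by selecting the largest remaining element each pass."""
    n = len(arr)
    # Each pass finds the longest "yacht" in the unsorted region [0..end]
    # and swaps it into its final spot at the right end.
    for end in range(n - 1, 0, -1):
        largest = 0
        for i in range(1, end + 1):
            if arr[i] > arr[largest]:
                largest = i
        arr[largest], arr[end] = arr[end], arr[largest]  # one swap per pass
    return arr
```

Swapping list elements in Python moves references rather than copying the underlying objects, so the "swap pointers, not records" trick comes for free here; in languages with large inline records, you would sort an array of pointers or indices instead.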