The Magical Powers of Counting Sort

Tonight, I was doing some technical interview practice and worked on a problem that illustrated the usefulness of counting sort. The problem gave an array of scores and the highest possible score, and asked for a solution that sorts the scores in better than O(n log n) time. I was stumped. Radix sort crossed my mind, but I did not quite understand how to implement it.

I guessed that the solution would likely involve me iterating through the array. Since I had the highest possible score, my first guess was that the solution had something to do with the difference between a specific score and the highest possible score. It turned out that the difference did not matter. After a few hints, I came up with a solution and then learned that this solution was counting sort.

How did this solution work? The trick is to allocate a "helper" array whose length is the highest possible score plus 1. Then, go through the array of scores, and for each score, increment the value at the index in the helper array that corresponds to that score. That way, you keep a count of how many times each score appears. Finally, that helper array can be used to build a new array with the scores in order.
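Here is a minimal sketch of that idea in Python. The function and variable names are my own, and it assumes the scores are non-negative integers no larger than the given maximum:

```python
def counting_sort(scores, highest_possible_score):
    # Helper array of counts: index i holds how many times score i appears.
    counts = [0] * (highest_possible_score + 1)
    for score in scores:
        counts[score] += 1

    # Walk the counts in index order to rebuild the scores, now sorted.
    sorted_scores = []
    for score, count in enumerate(counts):
        sorted_scores.extend([score] * count)
    return sorted_scores


print(counting_sort([3, 1, 4, 1, 5], 5))  # [1, 1, 3, 4, 5]
```

Both loops are linear, so the whole thing runs in O(n + k) time, where k is the highest possible score.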

If you've learned about sorting algorithms, you may have been told that a sorting algorithm can never run faster than O(n log n). The assumption behind that statement is that you are doing comparison sorting, where elements are ordered only by comparing them to each other.

Counting sort is different and pretty clever, since you never actually compare two numbers. It is really exciting for me to understand this algorithm, because when I first learned about sorting algorithms, I always found the ones with run times better than O(n log n) to be magical and mysterious.

Thank you for reading!