Big-O meets the world

Daniel Lemire wrote a good article on how big-O notation is a great teaching tool but not the end of the story.

One example from a previous project I worked on: bubble sort. Bubble sort is conceptually simple but analytically troublesome. It is O(n^2); its analysis, while not difficult, is not as straightforward as that of other sorts; and, worse for practice, its worst-case scenarios are common (unlike quicksort's).

It does, however, have excellent constants (due to its simplicity), great cache locality (in part due to being in-place), and near-linear runtime on almost-sorted data.
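To make the "near-linear on almost-sorted data" claim concrete, here is a plain bubble sort with the usual early-exit flag (a generic sketch, not code from the project): if a full pass makes no swaps, the data is already sorted and the sort stops, so nearly sorted input needs only a handful of passes.

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Bubble sort with an early-exit flag: if a pass performs no swaps,
// the range is sorted and we stop. On almost-sorted input this
// terminates after a few passes, i.e. close to O(n) work.
template <typename T, typename Less>
void bubble_sort(std::vector<T>& v, Less less) {
    bool swapped = true;
    std::size_t n = v.size();
    while (swapped && n > 1) {
        swapped = false;
        for (std::size_t i = 1; i < n; ++i) {
            if (less(v[i], v[i - 1])) {
                std::swap(v[i - 1], v[i]);
                swapped = true;
            }
        }
        --n;  // the largest remaining element has bubbled to the end
    }
}
```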

I was working on a 3D graphics project and, to render alpha-blended faces correctly, I needed to draw my faces in sorted order by depth. I started with the STL’s generic sorting algorithm (a hybrid merge/insertion sort), but my redraws were slow. However, as the user rotated and moved the object, the faces never got very far out of order from frame to frame.

So I replaced the sort with a bubble sort, and redraw performance improved dramatically.
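A hypothetical sketch of what that per-frame depth sort can look like (the `DrawItem` struct and field names are illustrative, not the project's actual code): keep one persistent draw-order array across frames, recompute depths each frame, and let the early-exit bubble sort above repair the nearly-sorted order.

```cpp
#include <cstddef>
#include <vector>

// Illustrative only: one entry per face, kept alive between frames
// so each redraw starts from an almost-sorted order.
struct DrawItem {
    std::size_t face;   // index into the mesh's face list
    float       depth;  // view-space depth, recomputed every frame
};

// Deeper faces first, so alpha-blended faces composite back to front.
// Uses the bubble_sort sketched earlier.
void sort_back_to_front(std::vector<DrawItem>& order) {
    bubble_sort(order, [](const DrawItem& a, const DrawItem& b) {
        return a.depth > b.depth;
    });
}
```

Because the persistent order is nearly correct after small rotations, each frame's sort usually finishes in one or two passes instead of the full O(n log n) work of re-sorting from scratch.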

Big-O is great for teaching, first-order approximations of behavior, and understanding how an algorithm scales. But you still need to understand your problem, and details hidden in asymptotic performance bounds may well matter a lot. After all, Fibonacci heaps are rarely used in practice.