
Section 8.18 Merge Sort Compared

We have established that the Merge Sort algorithm requires \(O(n \cdot \log_2(n))\) work to sort a list of size n. How does that compare to Insertion Sort and Selection Sort, which are \(O(n^2)\text{?}\)
Below is a graph of the growth of \(n^2\) and \(n \cdot \log_2(n)\text{:}\)
Obviously, \(n^2\) grows much faster than \(n \cdot \log_2(n)\text{.}\) Looking at the table below, we can see that sorting 500 items with Insertion Sort (\(O(n^2)\)) would take somewhere around 250,000 units of work. Doing the same task with Merge Sort would only take around 4,483 units of work!
Problem size:            10  | 100    | 500     | 1,000     | 10,000      | 100,000
\(n \cdot \log_2(n)\):   33  | 664    | 4,483   | 9,966     | 132,877     | 1,660,964
\(n^2\):                 100 | 10,000 | 250,000 | 1,000,000 | 100,000,000 | 10,000,000,000
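If you want to check the numbers in the table, you can evaluate both expressions directly. The sketch below does exactly that in Python; the variable names and the exact output format are illustrative, not part of the text.

import math

# Compare the work estimates n*log2(n) and n^2 for the problem sizes in the table.
for n in [10, 100, 500, 1000, 10000, 100000]:
    n_log_n = n * math.log2(n)   # rough work estimate for Merge Sort
    n_squared = n ** 2           # rough work estimate for Insertion/Selection Sort
    print(f"{n:>7}  {round(n_log_n):>12,}  {n_squared:>18,}")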
Using the Big-O classification of each algorithm like this gives us a rough estimate of the work. We are ignoring constant factors that may change the exact numbers (especially for low values of n). Merge Sort has a greater basic overhead than Insertion Sort (for example, we have to copy items to and from a second list), so for small problems it may take more time to run than Insertion Sort. But we know that as n grows larger the basic pattern is going to hold: \(O(n^2)\) grows faster than \(O(n \cdot \log_2(n))\text{,}\) so Merge Sort will certainly be faster for large values of n.
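One way to see this constant-factor effect for yourself, outside of the Sort Timer below, is to time simple implementations of both algorithms on lists of different sizes. This is a minimal Python sketch under that assumption; the sizes chosen (50 and 5,000) and the helper names are illustrative, and your exact timings will vary by machine.

import random
import time

def insertion_sort(data):
    # Sort the list in place by shifting each item left into position.
    for i in range(1, len(data)):
        value = data[i]
        j = i - 1
        while j >= 0 and data[j] > value:
            data[j + 1] = data[j]
            j -= 1
        data[j + 1] = value

def merge_sort(data):
    # Return a new sorted list; note the extra copying into new lists,
    # which is part of Merge Sort's overhead.
    if len(data) <= 1:
        return data
    mid = len(data) // 2
    left = merge_sort(data[:mid])
    right = merge_sort(data[mid:])
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

for n in [50, 5000]:
    items = [random.random() for _ in range(n)]
    start = time.perf_counter()
    insertion_sort(items[:])          # sort a copy so both runs see the same data
    insertion_time = time.perf_counter() - start
    start = time.perf_counter()
    merge_sort(items[:])
    merge_time = time.perf_counter() - start
    print(f"n={n}: insertion {insertion_time:.4f}s, merge {merge_time:.4f}s")

For very small lists the two times are close (and Insertion Sort may even win), but as n grows the \(O(n^2)\) cost of Insertion Sort quickly dominates.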
Below, you can use the Sort Timer to simulate running sorts of different sizes using the two algorithms. You can use the slider to change the size of the list being sorted. Try comparing the two algorithms at different sort sizes. Does Merge Sort always win? At what point does Insertion Sort start taking more than a half-second to run? At what point does Merge Sort start taking more than a half-second?
[Sort Timer: interactive tool with a List Length slider]