stable algorithm with complexity O(n)

Dan Upton upton at virginia.edu
Mon Dec 15 11:28:03 EST 2008


On Mon, Dec 15, 2008 at 11:05 AM,  <pruebauno at latinmail.com> wrote:
>> Non-comparison sorts are a useful technique, but it's changing the
>> problem, and they are only useful in very limited circumstances. There's
>> a good reason that most sort routines are based on O(n*log n) comparison
>> sorts instead of O(n) bucket sorts or radix sorts.
>>
> This is an assumption that I never quite understood. What most people
> want is to have sorted data; they don't care whether I used a
> comparison or non-comparison sort to do it. I think it is just that in
> most cases n is not very big anyway, and comparison sorts make it
> easier on the programmer to create arbitrary types that are sortable.

And if n is small and the keys are sparse (i.e., k > n), the O(k*n)
cost of radix sort can be worse than O(n^2).  You could also ask why
people make such a big deal about quicksort over mergesort, since
mergesort has a guaranteed O(n log n) running time whereas quicksort
can be O(n^2) on pathological inputs.
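
For concreteness, here is a minimal LSD radix sort sketch (mine, not
from the original discussion) for non-negative integers. It makes the
O(k*n) cost visible: the outer loop runs once per digit (k passes), and
each pass touches all n items, so when k > n those passes can cost more
than an O(n^2) comparison sort on the same small input.

    def radix_sort(nums, base=10):
        if not nums:
            return []
        out = list(nums)
        max_val = max(out)
        place = 1
        while max_val // place > 0:          # one pass per digit -> k passes
            buckets = [[] for _ in range(base)]
            for x in out:                    # each pass is O(n)
                buckets[(x // place) % base].append(x)
            out = [x for bucket in buckets for x in bucket]
            place *= base
        return out

    print(radix_sort([170, 45, 75, 90, 802, 24, 2, 66]))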

I think I remember learning in my algorithms class that for small
inputs (n < ~40), bubblesort can actually be the fastest (or close to
the fastest) in terms of wall-clock time, because it has a relatively
small constant factor hidden in its O(n^2) complexity.
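
If you want to poke at that claim yourself, here is a rough sketch (my
own, not from the post) that times a plain bubble sort against Python's
built-in sorted() on a tiny list. Results will vary by machine, and in
CPython the interpreter overhead of the pure-Python loop usually swamps
the small constant factor, so read it as an illustration rather than a
real benchmark.

    import random
    import timeit

    def bubble_sort(seq):
        a = list(seq)
        n = len(a)
        for i in range(n):
            swapped = False
            for j in range(n - 1 - i):
                if a[j] > a[j + 1]:
                    a[j], a[j + 1] = a[j + 1], a[j]
                    swapped = True
            if not swapped:       # already sorted; bail out early
                break
        return a

    data = [random.random() for _ in range(30)]
    print(timeit.timeit(lambda: bubble_sort(data), number=10000))
    print(timeit.timeit(lambda: sorted(data), number=10000))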


