How to demonstrate big-O cost of algorithms?

David Eppstein eppstein at ics.uci.edu
Wed Jun 2 13:21:19 EDT 2004


In article <slrncbrudt.995.rs at frank.overlook.homelinux.net>,
 Rusty Shackleford <rs at overlook.homelinux.net> wrote:

> Here's my clumsy attempt to implement this:
> 
> def qsort(ll):
>     try:
>         global n
>         n += len(ll)
>         print "adding %d to %d." % (len(ll), n)
>         print "incrementing up n to %d." % n
>     except:
>         pass
>     if len(ll) == 0:
>         return []
>     p = ll[0]
>     left = [x for x in ll if x < p]
>     mid = [x for x in ll if x == p]
>     right = [x for x in ll if x > p]
>     return qsort(left) + mid + qsort(right)
> 
> I'm not getting the results I hoped for.  I expected that after the
> program finished, the count would be near
> 
>     len(ll) * math.log(len(ll), 2)
> 
> since the big-O cost of quicksort is n log n for all but the worst case.

Are you sure you're not in the worst case?
Is ll random, or is it already sorted?
Your code always picks the first element as the pivot, so if ll is
already sorted, one side of every partition is empty and you should be
seeing quadratic behavior.
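
To see the difference concretely, here is a minimal sketch (the helper
qsort_counted and its one-element counter list are illustrative choices,
not code from the thread).  It charges roughly one comparison per
non-pivot element at each partitioning step and runs the same
first-element-pivot quicksort on random and on already-sorted input:

import math
import random

def qsort_counted(ll, counter):
    # counter is a one-element list used as a mutable tally of comparisons
    # (a convention of this sketch, not of the quoted code).
    if len(ll) <= 1:
        return list(ll)
    p = ll[0]                   # first element as pivot, as in the quoted code
    counter[0] += len(ll) - 1   # model: one comparison per non-pivot element
    left = [x for x in ll if x < p]
    mid = [x for x in ll if x == p]
    right = [x for x in ll if x > p]
    return qsort_counted(left, counter) + mid + qsort_counted(right, counter)

# Sizes stay below Python's default recursion limit, because sorted input
# drives the recursion depth to roughly len(ll).
for n in (200, 400, 800):
    random_count, sorted_count = [0], [0]
    qsort_counted([random.random() for _ in range(n)], random_count)
    qsort_counted(list(range(n)), sorted_count)
    print("n=%4d  random: %6d (n log2 n = %6d)   sorted: %7d (n(n-1)/2 = %7d)" % (
        n, random_count[0], int(n * math.log(n, 2)),
        sorted_count[0], n * (n - 1) // 2))

Under those assumptions the random counts should grow like n log n
(about 1.4 * n * log2(n) in expectation for this pivot rule), while the
counts for sorted input are exactly n*(n-1)/2, i.e. quadratic.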

-- 
David Eppstein                      http://www.ics.uci.edu/~eppstein/
Univ. of California, Irvine, School of Information & Computer Science


