Abuse of Big Oh notation

Oscar Benjamin oscar.j.benjamin at gmail.com
Mon Aug 20 04:24:33 EDT 2012


On Sun, 19 Aug 2012 16:42:03 -0700, Paul Rubin 
<no.email at nospam.invalid> wrote:
> Steven D'Aprano <steve+comp.lang.python at pearwood.info> writes:
> > Of course *if* k is constant, O(k) is constant too, but k is not
> > constant. In context we are talking about string indexing and slicing.
> > There is no value of k, say, k = 2, for which you can say "People will
> > sometimes ask for string[2] but never ask for string[3]". That is absurd.


> The context was parsing, e.g. recognizing a token like "a" or "foo" in a
> human-written chunk of text.  Occasionally it might be "sesquipedalian"
> or some even worse outlier, but one can reasonably put a fixed and
> relatively small upper bound on the expected value of k.  That makes the
> amortized complexity O(1), I think.

No it doesn't. It is still O(k). The point of big O notation is to
describe the asymptotic behaviour of a quantity (here the running
time) as some variable becomes large. If k is small then you can
often expect the cost of an O(k) operation to be small as well. To
say that an operation is O(k), however, is a statement about what
happens when k is big, and it is not refuted by pointing out that k
is typically not big.
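
To make that concrete, here is a rough sketch (purely illustrative;
char_at is a made-up helper, not how any Python implementation
actually indexes its strings) of the kind of variable-width encoding
under discussion, where reaching character k means scanning past the
k characters before it. The timings at the end grow roughly linearly
with k, which is all that "indexing is O(k)" asserts, however small k
may usually be in practice:

    import timeit

    def char_at(data, k):
        """Return character k of UTF-8 encoded `data` by scanning from the start."""
        i = 0
        for _ in range(k):
            b = data[i]
            if b < 0x80:        # 1-byte (ASCII) character
                i += 1
            elif b < 0xe0:      # 2-byte character
                i += 2
            elif b < 0xf0:      # 3-byte character
                i += 3
            else:               # 4-byte character
                i += 4
        # decode just the character that starts at byte offset i
        b = data[i]
        n = 1 if b < 0x80 else 2 if b < 0xe0 else 3 if b < 0xf0 else 4
        return data[i:i + n].decode('utf-8')

    text = ('spam' * 10 + '\u00e9') * 50000    # mix of 1- and 2-byte characters
    data = text.encode('utf-8')
    for k in (10, 100, 1000, 10000, 100000):
        t = timeit.timeit(lambda: char_at(data, k), number=100)
        print('k = %6d: %.4f seconds' % (k, t))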

Oscar



