How to demonstrate big-O cost of algorithms?

Tim Daneliuk tundra at tundraware.com
Wed Jun 2 14:34:41 EDT 2004


Rusty Shackleford wrote:

> Hi -
> 
> I'm studying algorithms and I want to write a python program that
> calculates the actual runtimes.
> 
> I want to increment up some global variable called n in my program so
> that I can see the relationship between the size of the problem and the
> number of steps.
> 
> Here's my clumsy attempt to implement this:

One Thought:

Bear in mind that O(n) describes the _asymptotic_ behavior of an algorithm.  It
does not account for the "constant time" factors in a program, such as the initial
instantiation of variables, loops, objects and so forth.   It only
considers how the work grows with the size of the input.

For example, suppose you have two algorithms, both with an asymptotic
complexity of O(n), but the first takes 5 seconds to set up its outer
loops and control variables while the second takes 5 milliseconds to do
the same thing. From an algorithmic-complexity point of view, both are O(n).

The point is that you have to use a sufficiently large 'n' to
make this constant overhead irrelevant to the test.  If you are looking
for n*log(n) growth, you're probably not going to see it for
n=1, n=2, n=3 ...  You're going to need n=100, n=1000, n=10000.
IOW, choose a large enough iteration count to dwarf the constant time
overhead of the program...
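For what it's worth, a sketch of the kind of harness being described might
look like the following.  It times Python's built-in sorted() (known to be
O(n log n)) at geometrically increasing n and prints the ratio between
successive timings; the function names and sizes here are just illustrative:

import random
import time

def time_sort(n, repeats=3):
    """Best-of-`repeats` time to sort n random floats (setup excluded)."""
    best = float("inf")
    for _ in range(repeats):
        data = [random.random() for _ in range(n)]
        start = time.perf_counter()
        sorted(data)
        best = min(best, time.perf_counter() - start)
    return best

previous = None
for n in (1_000, 10_000, 100_000, 1_000_000):
    t = time_sort(n)
    # For O(n log n) growth, a 10x increase in n should give a time
    # ratio somewhat above 10 (roughly 10 * log(10n)/log(n)).
    ratio = t / previous if previous else float("nan")
    print(f"n={n:>9,d}  time={t:.4f}s  ratio vs previous n: {ratio:.1f}")
    previous = t

Plotting (or just eyeballing) those ratios against what O(n), O(n log n),
or O(n^2) would predict is usually enough to see which curve you are on.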

----------------------------------------------------------------------------
Tim Daneliuk     tundra at tundraware.com
PGP Key:         http://www.tundraware.com/PGP/
