slowdown on looping

Bengt Richter bokr at oz.net
Sat Oct 4 03:59:12 EDT 2003


On 3 Oct 2003 22:50:42 -0700, trevp at trevp.net (Trevor Perrin) wrote:

>After running a simple loop around 100-200 million times, there's a
>speed drop of about a factor of 7.  This happens within several
>minutes (on a 1.7 Ghz machine), so it's not that hard to see.  I'm
>using Python 2.3.2 under Windows.
>
>Is anyone familiar with this?
>
>import time
>startTime = time.time()
>x = 0
>while 1:
>    x += 1
>    if x % 1000000 == 0:
>        endTime = time.time()
>        print "time = ", endTime - startTime, x
>        startTime = endTime            
>               
There will be a point where it switches gears from 32-bit int counting to unlimited-bitwidth
(except for memory) long integer counting, i.e., when the count reaches 2**31.
But that is over 2,000 million, not 100-200 million, so I would wonder whether you have
a hot-running laptop that decides to slow down when it gets to a certain temperature. Or maybe
your fan/heatsink is clogged with dust, or the heatsink isn't making good thermal contact
with the CPU?
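
A quick way to see exactly where that switch happens (just a sketch, assuming
Python 2.x on a 32-bit build, where sys.maxint is the largest plain int):

 >>> import sys
 >>> sys.maxint
 2147483647
 >>> type(sys.maxint)
 <type 'int'>
 >>> type(sys.maxint + 1)
 <type 'long'>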

 >>> 2**31-1
 2147483647L

That says "L" for long, because it became long before the 1 was subtracted.

 >>> 2**30-1+2**30
 2147483647

That one stays a plain int. One way to avoid the overflow to long is to do the
arithmetic in another order, so no intermediate result exceeds 2**31-1:

 >>> x = 2**30-1+2**30
 >>> x
 2147483647

Simulating your code from there:

 >>> x += 1
 >>> x
 2147483648L

Now you are in the long-integer representation mode, which is not as fast as int,
and as the size of the number grows, it will gradually get slower, because it
will be manipulating larger and larger representations. A rough measure of how big
the number is would be len('%x' % x). But as mentioned, I don't think this is the problem.
To see that kind of effect, calculate pi to thousands of decimals or monster
factorials or something like that ;-)
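
For example, here is a little factorial timer (just a sketch, Python 2.x) that
uses the len('%x' % n) size measure from above; each block is the same 5000
multiplications, but the blocks should take longer and longer as the long grows:

 import time

 n = 1L                       # running factorial, kept as a Python long
 start = time.time()
 for i in xrange(1, 20001):
     n = n * i
     if i % 5000 == 0:
         now = time.time()
         print "%6d!  %6.2f sec for last block, ~%d hex digits" % (
             i, now - start, len('%x' % n))
         start = now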

Regards,
Bengt Richter



