problems caused by very large for-loop

sam python.sam at googlemail.com
Thu Dec 7 20:15:44 EST 2006


hi all,

i am writing some software to model polymerisation kinetics and have
created a way of doing so which involves taking ever smaller slices of
time to model until some sort of convergence was observed in the
results.

so far so good, but i was using 'for i in range(iterations):', a
for-loop over each slice of time, where the number of iterations was
getting into the tens of millions. up until about 125,000,000 it
worked, then i got a MemoryError.
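for context, a tiny sketch of the pattern that bit me (this is python 2, where range() builds the whole list of indices before the loop body ever runs; the count below is a small stand-in for the real ~125,000,000):

```python
# the failing pattern, python 2 semantics: range(n) materialises a full
# list of n integers up front, so memory use scales with the loop count.
# wrapping it as list(range(...)) reproduces that eager behaviour on
# python 3 as well, for demonstration.
iterations = 10  # stand-in for the tens of millions in the real run
indices = list(range(iterations))  # every index allocated before the body runs
print(len(indices))  # all 10 resident in memory at once
```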

now i wasn't surprised to get a memory error eventually, because what i
was doing was some serious number-crunching. but it didn't occur to me
at first where the problem was. i had to keep some results in memory to
calculate something at the end, but that was about a million integers,
not that hard with 512 MB of RAM. then i thought it might be something
structural in the way python works, but i couldn't guess at what it
might be.

thinking about it a bit more, i remembered having read that the
for-loop declaration i used actually created a list of integers, which
would be eating up over half of my memory before i'd even started doing
anything. the program was creating a directory as instructed, then
doing nothing, just churning, which made it clear that the problem was
encountered right at the start of the program. that helped clear it up,
but i was surprised that something as innocuous as the for-loop was
creating such a problem. i changed it to a while-loop and it worked
fine.
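for anyone who hits the same wall, here's a minimal sketch of the while-loop fix. the function name and the per-slice arithmetic are just made-up stand-ins for the actual kinetics code; in python 2, 'for i in xrange(iterations):' gives the same constant-memory behaviour without the while-loop:

```python
def simulate(iterations):
    """toy stand-in for the time-slice loop; not the real kinetics model."""
    total = 0.0
    i = 0
    while i < iterations:       # just one counter in memory, no list of indices
        total += 1.0 / (i + 1)  # placeholder for the real per-slice work
        i += 1
    return total

print(simulate(4))  # 1 + 1/2 + 1/3 + 1/4, about 2.0833
```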

has anyone else bumped up against this problem before? i suppose
for-loops with 250 million iterations are seldom used in most
applications. it was just the first time i'd ever solved a problem by
actually having some insight into how python works at a slightly lower
level.

anyway, sorry to be so long-winded. i'm just glad the damn thing's
working again... :)

sam
