why does memory consumption keep growing?

Fetchinson . fetchinson at googlemail.com
Thu Oct 5 17:06:49 EDT 2017


Hi folks,

I have a rather simple program which cycles through a bunch of files,
does some operation on each, and then quits. There are 500 files
involved and each operation takes about 5-10 MB of memory. As you'll
see, I made every attempt to remove everything at the end of each
cycle so that memory consumption doesn't grow as the for loop
progresses, but it still does.

import os

for f in os.listdir('.'):
    x = []
    for i, line in enumerate(open(f)):
        import mystuff
        x.append(mystuff.expensive_stuff(line))
        del mystuff
    import mystuff
    mystuff.some_more_expensive_stuff(x)
    del mystuff
    del x
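(For what it's worth, a quick experiment with the standard library suggests that `del` on an imported name only removes the local binding; the module object itself stays cached in `sys.modules`, so whatever it holds onto is not released. A minimal sketch, using `json` as a stand-in for mystuff:)

```python
import sys

# Importing a module binds a name in the current namespace AND
# caches the module object in sys.modules.
import json

# Deleting the name only removes the local binding ...
del json

# ... but the module object is still alive in the interpreter's
# module cache, so its memory is not freed.
print('json' in sys.modules)  # True
```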


What can be the reason? I understand that mystuff might be leaky, but
if I delete it, doesn't that mean that whatever memory it allocated
is freed? Similarly, x is deleted, so that can't possibly make the
memory consumption go up.
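(In case it helps anyone reproduce this: a minimal sketch of how one might locate the growth with the stdlib tracemalloc module, with a list comprehension standing in for the expensive operation:)

```python
import tracemalloc

tracemalloc.start()

# Stand-in for one cycle of the loop; replace with the real
# per-file work to see where allocations actually come from.
data = [str(n) * 100 for n in range(1000)]

# Take a snapshot and show the top allocating source lines.
snapshot = tracemalloc.take_snapshot()
for stat in snapshot.statistics('lineno')[:5]:
    print(stat)
```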

Any hint would be much appreciated,
Daniel




-- 
Psss, psss, put it down! - http://www.cafepress.com/putitdown
