why does memory consumption keep growing?

Pankaj L Ahire pankajahire at gmail.com
Thu Oct 5 17:35:06 EDT 2017


On Thu, Oct 5, 2017 at 17:07 Fetchinson . via Python-list <
python-list at python.org> wrote:

> Hi folks,
>
> I have a rather simple program which cycles through a bunch of files,
> does some operation on them, and then quits. There are 500 files
> involved and each operation takes about 5-10 MB of memory. As you'll
> see I tried to make every attempt at removing everything at the end of
> each cycle so that memory consumption doesn't grow as the for loop
> progresses, but it still does.
>
> import os
>
> for f in os.listdir( '.' ):
>
>     x = [ ]
>
>     for ( i, line ) in enumerate( open( f ) ):
>
>         import mystuff
>         x.append( mystuff.expensive_stuff( line ) )
>         del mystuff
>
>     import mystuff
>     mystuff.some_more_expensive_stuff( x )
>     del mystuff
>     del x
>
>
> What can be the reason? I understand that mystuff might be leaky, but
> if I delete it, doesn't that mean that whatever memory was allocated
> is freed? Similarly, x is deleted, so that can't possibly make the memory
> consumption go up.
>
> Any hint would be much appreciated,
> Daniel


You are never closing the file objects you create with open(f).
Better to use a context manager, so the cleanup happens automatically when you leave the block.

import os

for f in os.listdir('.'):
    with open(f) as fh:
        for i, line in enumerate(fh):
            # your per-line processing goes here
            pass
    # leaving the with block flushes and closes fh for you
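
One more note on the del question: del mystuff and del x only remove the
name bindings. The module stays cached in sys.modules, so deleting the
name does not unload it or release whatever memory it holds on to, and x
is freed only once nothing else references it. If closing the files does
not stop the growth, the standard library's tracemalloc module can show
which lines the allocations actually come from. A rough sketch, not from
the original script, with the per-line processing left out:

import os
import tracemalloc

tracemalloc.start()

for f in os.listdir('.'):
    with open(f) as fh:
        for i, line in enumerate(fh):
            pass  # per-line processing would go here

# print the five call sites holding the most memory
for stat in tracemalloc.take_snapshot().statistics('lineno')[:5]:
    print(stat)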

P



