Python garbage collection: not releasing memory to OS!

Michael Torrie torriem at gmail.com
Fri Apr 15 09:59:47 EDT 2016


On 04/15/2016 04:25 AM, cshintov at gmail.com wrote:
> The input was a 4MB file. Even after returning from the 'fileopen'
> function the 4MB memory was not released. I checked htop output while
> the loop was running, the resident memory stays at 14MB. So unless
> the process is stopped the memory stays with it.

I guess the question is: why is this a problem?  If there are no leaks,
then I confess I don't understand what your concern is.  And indeed, you
say it isn't leaking, since it never rises above 14 MB.

Also, there are ways of reading a file without allocating huge amounts
of memory.  Why not read it in chunks, or line by line?  Take advantage
of Python's generator facilities to process your data.
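
For example, something along these lines (read_in_chunks, process() and
the 64 KB chunk size are just illustrative; adjust to your own code):

def read_in_chunks(path, chunk_size=64 * 1024):
    # Yield the file one chunk at a time instead of loading it whole.
    with open(path, 'rb') as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            yield chunk

for chunk in read_in_chunks('input.dat'):
    process(chunk)   # process() stands in for whatever work you do

Or, if the data is line-oriented, the file object is already a lazy
iterator, so a plain "for line in f:" loop never holds more than one
line in memory at a time.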

> So if the celery worker is not killed after its task is finished it
> is going to keep the memory for itself. I know I can use
> **max_tasks_per_child** config value to kill the process and spawn a
> new one. **Is there any other way to return the memory to OS from a
> python process?.**

Have you tried using Python's subprocess module?  If I understand your
use case correctly, it would let you run the memory-hungry code in a
completely separate process, whose memory is reaped by the OS as soon
as it finishes.
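
As a rough sketch (parse_file.py and input.dat are placeholders for
whatever script actually does the heavy work):

import subprocess
import sys

# Run the heavy lifting in a child interpreter and collect its output.
output = subprocess.check_output(
    [sys.executable, 'parse_file.py', 'input.dat'])

# Once the child exits, the OS reclaims every byte it allocated; only
# the (presumably small) output string survives in this process.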

