[issue38501] multiprocessing.Pool hangs atexit (and garbage collection sometimes)

Pablo Galindo Salgado report at bugs.python.org
Fri Apr 10 13:54:06 EDT 2020


Pablo Galindo Salgado <pablogsal at gmail.com> added the comment:

> If that's out of contract, perhaps there should probably be a big, visible warning at the top of the multiprocessing docs stating that creating one of these objects requires either using a context manager or ensuring a manual `.close()`?

Why? This is a resource like any other and it requires proper resource management. Would you also put a big warning on "open()" stating that opening a file requires either using a context manager or ensuring a manual close()?
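
To illustrate the point, here is a minimal sketch of explicit resource management for a Pool, analogous to closing a file. The worker function and arguments are made up for the example; the point is only the close()/join() pair at the end:

    import multiprocessing

    def work(x):
        return x * x

    if __name__ == "__main__":
        pool = multiprocessing.Pool(processes=4)
        try:
            print(pool.map(work, range(10)))
        finally:
            pool.close()   # no more tasks will be submitted
            pool.join()    # wait for the worker processes to exit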

>  the first Python 2.7 example in the docs 

Python 2.7 is not supported and the pool has changed *a lot* since Python 2. For instance, the pool now does more correct resource management, it does not leak threads, and it supports safer mechanisms such as a context manager. The reason it didn't hang as much in Python 2.7 is likely because some threads were being leaked.
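
The context-manager form mentioned above looks like the sketch below (the worker function is illustrative). Note that Pool.__exit__ calls terminate(), so any outstanding work should be collected before the block ends:

    from multiprocessing import Pool

    def work(x):
        return x * x

    if __name__ == "__main__":
        with Pool(processes=4) as pool:
            # map() blocks until the results are ready, so it is safe
            # for the pool to be terminated when the block exits
            print(pool.map(work, [1, 2, 3]))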

> This is called automatically when the queue is garbage collected.

Yeah, and CPython does not promise that the __del__ method of any object will be called, so it is not guaranteed that the finalizer will call close():

https://docs.python.org/3/reference/datamodel.html#object.__del__

"It is not guaranteed that __del__() methods are called for objects that still exist when the interpreter exits"

----------

_______________________________________
Python tracker <report at bugs.python.org>
<https://bugs.python.org/issue38501>
_______________________________________
