[issue33081] multiprocessing Queue leaks a file descriptor associated with the pipe writer

Chris Langton report at bugs.python.org
Thu Jan 31 19:31:03 EST 2019


Chris Langton <chrislangton84 at gmail.com> added the comment:

@pitrou I am interested in a fix for Python 2.7 because the way Python 3.x outputs arithmetic is not arbitrarily precise.

So I will continue using Python 2.7 until another language I am familiar with, one with arbitrary-precision arithmetic superior to Python 3.x's, reaches a point where its library ecosystem is as mature as Python 2.7's.

I use multiprocessing heavily and have many use cases where I work around this issue, because I encounter it almost every time I need multiprocessing. Basically, I decide I need multiprocessing when I have too many external resources being processed by one CPU, which means multiprocessing ends up managing thousands of external resources from the moment it is used.

I work around this issue with the following solution instead of the Queue, which always failed:

========================================================================================

#!/usr/bin/env python

import multiprocessing
import time

ARBITRARY_DELAY = 10

# parse_file, write_to_json, zonefile_path and regex are defined
# elsewhere in the original script.
processes = []
for data in parse_file(zonefile_path, regex):
    t = multiprocessing.Process(target=write_to_json, args=(data,))
    processes.append(t)

# Start the processes in batches of 1000, pausing between batches so
# the system is not flooded with thousands of processes at once.
for i, one_process in enumerate(processes, start=1):
    if i % 1000 == 0:
        time.sleep(ARBITRARY_DELAY)
    one_process.start()

for one_process in processes:
    one_process.join()

========================================================================================
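As a sketch of what a tidier version of the batching above could look like, a multiprocessing.Pool bounds the number of worker processes (and therefore the open file descriptors) instead of creating one Process per record. The `square` worker and the input list here are toy stand-ins, not the original parse_file/write_to_json helpers:

```python
# Minimal, self-contained sketch of a Pool-based alternative.
# `square` is a hypothetical stand-in for the real per-record worker
# (e.g. write_to_json in the script above).
import multiprocessing


def square(n):
    return n * n


def run(values):
    # A fixed-size pool reuses the same worker processes for every
    # item, so process count and fd usage stay bounded no matter how
    # many records there are.
    pool = multiprocessing.Pool(processes=2)
    try:
        return pool.map(square, values)
    finally:
        pool.close()
        pool.join()


if __name__ == "__main__":
    print(run([1, 2, 3, 4]))
```

The explicit close()/join() form (rather than a `with` block) also works on Python 2.7, which did not support using Pool as a context manager.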

At the time (years ago) I don't think I knew enough about file descriptors to solve the root cause (or be elegant), and I have reused this code every time Queue failed me (which is basically every time I use Queue).

To be frank, anyone I ask says Queue is flawed.

Now I am older and had some free time, so I decided to fix my zonefile parsing scripts and use more elegant solutions. I finally looked at the old code I had reused in many projects and identified that it was actually a file descriptor issue (yay for knowledge), and I was VERY disappointed to see here that you didn't care to solve the problem for Python 2.7. Very unprofessional.

Now I am disappointed to be unpythonic and add gc calls to my script... your unprofessionalism now makes me be unprofessional.
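For what it's worth, rather than forcing a garbage-collection pass, the pipe writer behind a Queue can be released deterministically with the documented Queue.close() and Queue.join_thread() methods. A minimal sketch (the "done" payload is just illustrative):

```python
# Sketch: explicitly releasing a multiprocessing.Queue's underlying
# pipe instead of waiting for the garbage collector to do it.
import multiprocessing


def worker(q):
    q.put("done")
    q.close()        # close this process's copy of the pipe writer
    q.join_thread()  # wait for the feeder thread to flush its buffer


if __name__ == "__main__":
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=worker, args=(q,))
    p.start()
    msg = q.get()    # drain before join() to avoid a feeder deadlock
    p.join()
    q.close()        # release the parent's file descriptors too
    q.join_thread()
    print(msg)
```

Calling close() in every process that touched the queue is what actually frees the descriptors; gc only gets there indirectly by finalizing the Queue object.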

----------
nosy: +Chris Langton

_______________________________________
Python tracker <report at bugs.python.org>
<https://bugs.python.org/issue33081>
_______________________________________

