[issue32485] Multiprocessing dict sharing between forked processes
André Neto
report at bugs.python.org
Wed Jan 3 05:12:58 EST 2018
André Neto <andre.c.neto at gmail.com> added the comment:
You are right, it does not segfault (sorry for the abuse of language). It raises an exception while accessing the shared dictionary. The exception varies, but it is typically:
Traceback (most recent call last):
  File "multiprocessbug.py", line 156, in <module>
    test_manyForkedProcessesSingleThreaded(inst1, inst2, nRuns, nProcesses)
  File "multiprocessbug.py", line 77, in test_manyForkedProcessesSingleThreaded
    run(inst1, nRuns)
  File "multiprocessbug.py", line 29, in run
    inst.run()
  File "multiprocessbug.py", line 18, in run
    if (self.d.has_key(self.key)):
  File "<string>", line 2, in has_key
  File "/usr/local/lib/python2.7/multiprocessing/managers.py", line 759, in _callmethod
    kind, result = conn.recv()
cPickle.UnpicklingError: invalid load key, '#'.
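The reporter's script is not reproduced here, but this kind of UnpicklingError is consistent with several forked children reading and writing over a single inherited connection to the manager, so that the pickle streams interleave. A minimal sketch of the pattern that avoids this (names like `worker` and `d` are illustrative, not from the original script): start the children with multiprocessing.Process rather than a raw os.fork(), so the dict proxy's after-fork hook gives each child its own connection to the manager.

```python
import multiprocessing


def worker(shared, key, value):
    # A proxy inherited by a multiprocessing.Process is reset after the
    # fork, so this child talks to the manager over its own pipe instead
    # of interleaving traffic on the parent's connection.
    shared[key] = value


if __name__ == "__main__":
    manager = multiprocessing.Manager()
    d = manager.dict()
    procs = [multiprocessing.Process(target=worker, args=(d, i, i * i))
             for i in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print(sorted(d.items()))  # -> [(0, 0), (1, 1), (2, 4), (3, 9)]
    manager.shutdown()
```

With a bare os.fork(), by contrast, the child inherits the parent's open file descriptor to the manager without any reset, which is one way to end up with corrupted reads like the one in the traceback above.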
----------
_______________________________________
Python tracker <report at bugs.python.org>
<https://bugs.python.org/issue32485>
_______________________________________