[issue21163] asyncio doesn't warn if a task is destroyed during its execution

STINNER Victor report at bugs.python.org
Fri Jun 20 17:16:22 CEST 2014


STINNER Victor added the comment:

> The more I use asyncio, the more I am convinced that the correct fix is to keep a strong reference to a pending task (perhaps in a set in the eventloop) until it starts.

The problem is not the task; read my message again. The problem is that nobody holds a strong reference to the Queue, even though the producer is supposed to fill this queue and the task is waiting for it.
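A minimal sketch of the situation I mean (all names are illustrative, using the modern asyncio API): a task waits on a Queue that nothing else references, so no producer can ever fill it and the get() never completes.

```python
import asyncio

async def consumer(queue):
    # Waits forever: once the caller drops its reference, this frame
    # holds the only reference to `queue`, so no producer can fill it.
    return await queue.get()

async def main():
    task = asyncio.ensure_future(consumer(asyncio.Queue()))
    # Give the consumer a moment; it never finishes because nothing
    # else can reach the Queue to put() an item into it.
    _, pending = await asyncio.wait([task], timeout=0.1)
    task.cancel()
    try:
        await task
    except asyncio.CancelledError:
        pass
    return "stuck" if task in pending else "finished"

print(asyncio.run(main()))  # → stuck
```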

I cannot suggest how to fix your example; it depends on what you want to do.

> Without realizing it, I implicitly made this assumption when I began working on my asyncio project (a bitcoin node) in Python 3.3. I think it may be a common assumption for users. Ask around. I can say that it made the transition to Python 3.4 very puzzling.

Sorry, I don't understand the relation between this issue and the Python version (3.3 vs 3.4). Do you mean that Python 3.4 behaves differently?

Garbage collection in Python 3.4 has been *improved*: Python 3.4 is able to break more reference cycles.

If your program no longer runs on Python 3.4, it may mean that you rely on a reference cycle, which sounds very fragile.
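To illustrate what "breaking a reference cycle" means here, a small self-contained sketch (the Node class is purely illustrative): two objects that reference each other stay alive after `del` until the cycle collector runs.

```python
import gc
import weakref

class Node:
    """Two nodes referencing each other form a reference cycle."""
    def __init__(self):
        self.other = None

a, b = Node(), Node()
a.other, b.other = b, a      # cycle: a -> b -> a
probe = weakref.ref(a)
del a, b                     # no outside references remain, only the cycle
gc.collect()                 # the cycle collector reclaims both nodes
print(probe() is None)       # → True
```

Code that accidentally keeps a task alive only through such a cycle will break as soon as the collector gets better at finding cycles, which is why relying on one is fragile.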

> In several cases, I've needed to create a task where the side effects are important but the result is not. Sometimes this task is created in another task which may complete before its child task begins, which means there is no natural place to store a reference to this task. (Goofy workaround: wait for child to finish.)

I'm not sure that this is the same issue. If you think so, could you please write a short example showing the problem?
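For the fire-and-forget case described in the quote, one commonly suggested workaround (a sketch only; the container name and `spawn` helper are illustrative, not part of asyncio) is to keep a strong reference to each task in a module-level set until it completes:

```python
import asyncio

# Module-level set holding strong references so fire-and-forget
# tasks cannot be garbage-collected mid-execution.
_background_tasks = set()

def spawn(coro):
    """Schedule coro and keep a strong reference until it finishes."""
    task = asyncio.ensure_future(coro)
    _background_tasks.add(task)
    # Drop the reference once the task is done.
    task.add_done_callback(_background_tasks.discard)
    return task

async def main():
    results = []

    async def side_effect():
        results.append("ran")

    spawn(side_effect())
    # Yield to the loop so the child task gets a chance to run,
    # even though main() itself never awaits it.
    await asyncio.sleep(0)
    return results

print(asyncio.run(main()))  # → ['ran']
```

This avoids the "no natural place to store a reference" problem: the parent task may finish first, and the child still survives because the set owns it.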

----------

_______________________________________
Python tracker <report at bugs.python.org>
<http://bugs.python.org/issue21163>
_______________________________________
