[Python-Dev] PEP 380 (yield from a subgenerator) comments

Guido van Rossum guido at python.org
Thu Mar 26 05:24:21 CET 2009


ISTR that the motivation for adding new syntax is that the best you
can do using a trampoline library is still pretty cumbersome to use
when you have to write a lot of tasks and subtasks, and when using
tasks is just a tool for getting things done rather than an end goal
in itself. I agree that the motivation and the comparison should be
added to the PEP (perhaps moving the trampoline sample
*implementation* to a reference or an appendix, since it is only the
appearance of the trampoline-*using* code that matters).
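To make the comparison concrete, here is a minimal sketch of what the task/subtask pattern reads like under the proposed syntax (as it eventually shipped in Python 3.3; the names here are illustrative, not from the PEP):

```python
# Sketch of the PEP 380 style: a subgenerator is "called" directly,
# with no trampoline object or wrapper classes in sight.
def subtask():
    yield                 # a suspension point
    return 42             # becomes the value of the "yield from" expression

def task():
    result = yield from subtask()   # delegation replaces the scheduler stack
    yield result
```

Driving `task()` with `next()` twice runs through the subtask's suspension point and then yields the subtask's return value.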

--Guido

On Wed, Mar 25, 2009 at 7:26 AM, P.J. Eby <pje at telecommunity.com> wrote:
> At 06:03 PM 3/25/2009 +1200, Greg Ewing wrote:
>>
>> I wanted a way of writing suspendable functions that
>> can call each other easily. (You may remember I
>> originally wanted to call it "call".) Then I noticed
>> that it would also happen to provide the functionality
>> of earlier "yield from" suggestions, so I adopted that
>> name.
>
> I still don't see what you gain from making this syntax, vs. putting
> something like this in the stdlib (rough sketch):
>
> import sys
>
> class Task(object):
>    def __init__(self, geniter):
>        self.stack = [geniter]
>
>    def __iter__(self):
>        return self
>
>    def send(self, value=None):
>        if not self.stack:
>            raise RuntimeError("Can't resume completed task")
>        return self._step(value)
>
>    next = send    # Py2 iterator protocol: next() is send(None)
>
>    def _step(self, value=None, exc_info=()):
>        while self.stack:
>            try:
>                it = self.stack[-1]
>                if exc_info:
>                    try:
>                        rv = it.throw(*exc_info)
>                    finally:
>                        exc_info = ()
>                elif value is not None:
>                    rv = it.send(value)
>                else:
>                    rv = it.next()
>            except:
>                value = None
>                exc_info = sys.exc_info()
>                if exc_info[0] is StopIteration:
>                    exc_info = ()   # not really an error
>                self.pop()
>            else:
>                value, exc_info = yield_to(rv, self)
>        else:
>            if exc_info:
>                raise exc_info[0], exc_info[1], exc_info[2]
>            else:
>                return value
>
>    def throw(self, *exc_info):
>        if not self.stack:
>            raise RuntimeError("Can't resume completed task")
>        return self._step(None, exc_info)
>
>    def push(self, geniter):
>        self.stack.append(geniter)
>        return None, ()
>
>    def pop(self, value=None):
>        if self.stack:
>            it = self.stack.pop()
>            if hasattr(it, 'close'):
>                try:
>                    it.close()
>                except:
>                    return None, sys.exc_info()
>        return value, ()
>
>    @classmethod
>    def factory(cls, func):
>        def decorated(*args, **kw):
>            return cls(func(*args, **kw))
>        return decorated
>
>
> def yield_to(rv, task):
>    # This could/should be a generic function, to allow yielding to
>    # deferreds, sockets, timers, and other custom objects
>    if hasattr(rv, 'next'):
>        return task.push(rv)
>    elif isinstance(rv, Return):
>        return task.pop(rv.value)
>    else:
>        return rv, ()
>
> class Return(object):
>    def __init__(self, value=None):
>        self.value = value
>
>
> @Task.factory
> def sample_task(arg1, another_arg):
>    # blah blah
>    something = (yield subtask(...))
>
>    yield Return(result)
>
> def subtask(...):
>    ...
>    yield Return(myvalue)
>
>
> The trampoline (the _step() method) handles the co-operative aspects, and
> modifying the yield_to() function allows you to define how yielded values
> are processed.  By default, they're sent back into the generator that yields
> them, but you can pass a Return() to terminate the generator and pass the
> value up to the calling generator.  Yielding another generator, on the other
> hand, "calls" that generator within the current task, and the same rules
> apply.
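To see those rules in action, here is a trimmed-down, self-contained rendition of the same trampoline idea (function names like `run_task` are mine, and the bookkeeping is simplified relative to the sketch above):

```python
class Return:
    """Sentinel carrying a subgenerator's result up the stack."""
    def __init__(self, value=None):
        self.value = value

def run_task(gen):
    """Drive a stack of generators to completion; return the final value."""
    stack = [gen]
    value = None
    while stack:
        try:
            if value is not None:
                rv = stack[-1].send(value)
            else:
                rv = next(stack[-1])
        except StopIteration:
            stack.pop()                  # finished without an explicit Return()
            value = None
            continue
        if hasattr(rv, 'send'):          # a generator was yielded: "call" it
            stack.append(rv)
            value = None
        elif isinstance(rv, Return):     # explicit return: pop, pass value up
            stack.pop()
            value = rv.value
        else:                            # plain value: echo it straight back
            value = rv
    return value

def subtask(n):
    yield Return(n * 2)

def sample_task():
    doubled = yield subtask(21)   # "calls" subtask; receives its Return value
    yield Return(doubled)
```

Here `run_task(sample_task())` pushes the subtask when it is yielded, pops it when it yields a `Return`, and delivers the wrapped value back into the caller.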
>
> Is there some reason why this won't do what you want, and can't be modified
> to do so?  If so, that should be part of the PEP, as IMO it otherwise lacks
> motivation for a language feature vs. say, a stdlib module.  If 'yield_to'
> is a generic function or at least supports registration of some kind, a
> feature like this would be interoperable with a wide variety of frameworks
> -- you could register deferreds and delayed calls and IO objects from
> Twisted, for example.  So it's not like the feature would be creating an
> entire new framework of its own.  Rather, it'd be a front-end to whatever
> framework (or no framework) you're using.
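A registration-based `yield_to` might look roughly like this; the registry and decorator are purely hypothetical illustrations, not an API from any actual framework:

```python
import types

# Hypothetical handler registry: a framework registers one handler per
# yielded type, and yield_to dispatches on the type of the yielded value.
_handlers = {}

def register(cls):
    """Decorator factory mapping a yielded type to a handler function."""
    def deco(fn):
        _handlers[cls] = fn
        return fn
    return deco

def yield_to(rv, task):
    """Dispatch a yielded value to its registered handler, if any."""
    for cls, handler in _handlers.items():
        if isinstance(rv, cls):
            return handler(rv, task)
    return rv, ()    # default: send the value straight back into the task

@register(types.GeneratorType)
def _call_subgenerator(rv, task):
    # Yielded generators become subtask "calls", as in the sketch above.
    return task.push(rv)
```

A Twisted integration could then register a handler for `Deferred` the same way, without the trampoline itself knowing anything about Twisted.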
>



-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)