[Python-ideas] lazy tuple unpacking

Chris Rebert pyideas at rebertia.com
Sat Jul 5 03:10:44 CEST 2014


On Fri, Jul 4, 2014 at 5:59 PM, Paul Tagliamonte <paultag at gmail.com> wrote:
<snip>
> I notice that:
>
>     >>> a, b, c, *others = g_range(100)
>
> Works great. Super useful stuff there. Looks good.
>
> I also notice that this causes *others to consume the generator
> in a greedy way.
>
>     >>> type(others)
>     <class 'list'>
>
> And this makes me sad.
>
>     >>> a, b, c, *others = g_range(10000000000)
>     # will also make your machine very sad. Eventually resulting
>     # (ok, unless you've got a really fancy bit of kit) in:
>     Killed
>
> Really, the behavior (I think) should be more similar to:
>
>     >>> _x = g_range(1000000000000)
>     >>> a = next(_x)
>     >>> b = next(_x)
>     >>> c = next(_x)
>     >>> others = _x
>     >>>
>
>
> Of course, this leads to all sorts of fun errors, like the fact that
> you couldn't iterate over it twice. This might not be expected. However,
> it might be nice to have this behavior when you're unpacking a generator.
>
> Thoughts?
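For what it's worth, the behavior proposed above can be approximated today
with itertools.islice. The helper below (`lazy_unpack` is a hypothetical
name, not an existing API) eagerly binds the first n values and hands back
the still-lazy remainder, so an arbitrarily large generator is never
exhausted:

```python
from itertools import islice

def lazy_unpack(iterable, n):
    # Hypothetical helper: take the first n items eagerly,
    # return them plus the untouched (lazy) remainder.
    it = iter(iterable)
    head = list(islice(it, n))
    if len(head) < n:
        raise ValueError("not enough values to unpack")
    return tuple(head) + (it,)

# No memory blow-up, even for a huge input:
a, b, c, others = lazy_unpack(iter(range(10**12)), 3)
# a, b, c are 0, 1, 2; `others` is a lazy iterator over the rest
```

Like the `next()`-based version in the quoted message, this shares the
caveat that `others` is a one-shot iterator rather than a list.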

It would mean an (IMHO undesirable) loss of consistency/symmetry,
type-wise, with other unpackings where this generator optimization
isn't possible:

Python 3.4.1 (default, May 19 2014, 13:10:29)
>>> x = [1,2,3,4,5,6,7]
>>> a,b,*c,d = x
>>> c
[3, 4, 5, 6]
>>> *e,f,g = x
>>> e
[1, 2, 3, 4, 5]
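The same holds when the iterable is a generator: as soon as any target
follows the starred name, the iterator has to be run to exhaustion to find
the trailing values, so the starred target necessarily materializes as a
list (a small sketch, using a stand-in `g_range` since the original isn't
shown):

```python
def g_range(n):
    # Stand-in for the g_range generator from the quoted message.
    yield from range(n)

# Binding `d` forces full consumption of the generator,
# so `c` cannot remain lazy:
a, b, *c, d = g_range(7)
assert (a, b, d) == (0, 1, 6)
assert c == [2, 3, 4, 5]
```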

Cheers,
Chris
